WorldWideScience

Sample records for scenarios interoperability infrastructures

  1. GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures

    Science.gov (United States)

    Nativi, Stefano; Santoro, Mattia

    2010-05-01

    In recent years, the scientific community has devoted considerable effort to studying the effects of climate change on life on Earth. Within this general framework, a key role is played by the impact of climate change on biodiversity. To assess it, several use scenarios require modeling the impact of climatological change on the regional distribution of biodiversity species. Designing and developing interoperability infrastructures that enable scientists to search, discover, access and use multi-disciplinary resources (i.e. datasets, services, models, etc.) is currently one of the main research fields in Earth and Space Science Informatics. This presentation introduces and discusses an interoperability infrastructure that implements the discovery, access, and chaining of loosely coupled resources in the climatology and biodiversity domains, making it possible to set up and run forecast and processing models. The presented framework was successfully developed and tested in the context of the GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot - Phase 2) Climate Change & Biodiversity thematic Working Group. This interoperability infrastructure comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e. a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue that federates several discovery and access components (exposing them through a unique CSW standard interface). Federated components
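    The broker-mediated discovery described in d) can be illustrated with a small sketch: a broker federates several heterogeneous catalogues behind a single query interface, much as the Service Broker exposes federated components through one CSW endpoint. The classes and record format below are invented for illustration and are not GEOSS or CSW APIs.

```python
# Sketch of the mediation pattern: a broker federates several
# catalogues behind one discovery entry point. Names are illustrative.

class ToyCatalogue:
    """Toy catalogue speaking a CSW-like keyword query interface."""
    def __init__(self, records):
        self._records = records  # list of dicts with 'title' and 'keywords'

    def get_records(self, keyword):
        return [r for r in self._records if keyword in r["keywords"]]


class ServiceBroker:
    """Federates discovery over many catalogues via one interface."""
    def __init__(self, catalogues):
        self._catalogues = catalogues

    def discover(self, keyword):
        results = []
        for catalogue in self._catalogues:
            results.extend(catalogue.get_records(keyword))
        return results


climate = ToyCatalogue([{"title": "Temperature anomaly 1950-2000",
                         "keywords": {"climate", "temperature"}}])
biodiversity = ToyCatalogue([{"title": "Species occurrence records",
                              "keywords": {"biodiversity", "species"}}])

broker = ServiceBroker([climate, biodiversity])
hits = broker.discover("species")
```

    The point of the pattern is that the client issues one query and never learns which federated catalogue answered it.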

  2. GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures (Invited)

    Science.gov (United States)

    Nativi, S.; Santoro, M.

    2009-12-01

    Currently, one of the major challenges for the scientific community is the study of climate change effects on life on Earth. To address it, it is crucial to understand how climate change will impact biodiversity and, in this context, several application scenarios require modeling the impact of climate change on the distribution of individual species. In the context of GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot - Phase 2), the Climate Change & Biodiversity thematic Working Group developed three significant user scenarios. Two of them use a GEOSS-based framework to study the impact of climate change factors on regional species distribution. The presentation introduces and discusses this framework, which provides an interoperability infrastructure that loosely couples standard services and components to discover and access climate and biodiversity data and to run forecast and processing models. The framework comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e. a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue that federates several discovery and access components (exposing them through a unique CSW standard interface), which publish climate, environmental and biodiversity datasets; e) Ecological Niche Model Server: this component is able to run one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access

  3. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end users during the past several years, with an increasing number of scientific applications requiring access to a wide variety of resources and services in multiple Grids. The Grid Interoperation Now Community Group of the Open Grid Forum therefore organizes and manages interoperation efforts among those production Grid infrastructures, with the goal of realizing a world-wide Grid at the technical level in the near future. This contribution highlights the group's fundamental approaches and discusses open standards in the context of production e-Science infrastructures.

  4. Towards sustainability: An interoperability outline for a Regional ARC based infrastructure in the WLCG and EGEE infrastructures

    International Nuclear Information System (INIS)

    Field, L; Gronager, M; Johansson, D; Kleist, J

    2010-01-01

    Interoperability of grid infrastructures is becoming increasingly important with the emergence of large-scale grid infrastructures based on national and regional initiatives. To achieve interoperability of grid infrastructures, the adaptation and bridging of many different systems and services needs to be tackled. A grid infrastructure offers services for authentication, authorization, accounting, monitoring and operations, in addition to services for handling data and computations. This paper presents an outline of the work done to integrate the Nordic Tier-1 and Tier-2 sites, which for the compute part are based on the ARC middleware, into the WLCG grid infrastructure co-operated by the EGEE project. In particular, a thorough description of the integration of the compute services is presented.

  5. Secure and interoperable communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to escalate into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses the interoperability of these technologies. This contribution presents the design of a next-generation communication infrastructure for PPDR organisations that fulfils the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks. Based on the enterprise architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability to conduct inter-system, inter-agency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  6. An Architecture for Semantically Interoperable Electronic Health Records.

    Science.gov (United States)

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning, which therefore cannot simply be assumed. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. The proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  7. Grid interoperability: the interoperations cookbook

    Energy Technology Data Exchange (ETDEWEB)

    Field, L; Schulz, M [CERN (Switzerland)], E-mail: Laurence.Field@cern.ch, E-mail: Markus.Schulz@cern.ch

    2008-07-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation aims to bridge these differences and enable virtual organizations to access resources independently of grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given, and we argue for moving more aggressively towards standards.

  8. Grid interoperability: the interoperations cookbook

    International Nuclear Information System (INIS)

    Field, L; Schulz, M

    2008-01-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation aims to bridge these differences and enable virtual organizations to access resources independently of grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given, and we argue for moving more aggressively towards standards.

  9. Interoperability challenges in river discharge modelling: A cross domain application scenario

    Science.gov (United States)

    Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2018-06-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks, including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important for calibrating and validating hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (water, weather, etc.), typically provided through different technological solutions. This complicates the integration of new hydrological data sources into application systems, so considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation, including the definition and standardization of domain-specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observation data from the Global Runoff Data Centre (GRDC) database with model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), predicting river discharge based on weather forecast information in the context of GEOSS.

  10. Landscape of the EU-US Research Infrastructures and actors: Moving towards international interoperability of earth system data

    Science.gov (United States)

    Asmi, Ari; Powers, Lindsay

    2015-04-01

    Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, due to practical and historical reasons, RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides the diversity necessary to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially within the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of effort and increase efficient use of resources. Several existing initiatives work toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataONE, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these and other entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures in order to: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for the development of future research infrastructures; and identify opportunities for RIs to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities.
We present the current situation

  11. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The Data Distribution Service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and automatic discovery of dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on a smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
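    The data-centric idea behind DDS can be sketched as follows: publishers write typed samples to named topics and subscribers receive them by topic, with no point-to-point links between nodes. This is a toy, in-process stand-in with invented names (`DataBus`, topic strings), not the DDS API; a real deployment would use a DDS implementation and its QoS policies.

```python
# Minimal sketch of a data-centric bus in the spirit of DDS.
from collections import defaultdict

class DataBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> callbacks
        self._last_value = {}                  # topic -> latest sample

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)
        # Late joiners still see current state (akin to DDS durability QoS).
        if topic in self._last_value:
            callback(self._last_value[topic])

    def publish(self, topic, sample):
        self._last_value[topic] = sample
        for callback in self._subscribers[topic]:
            callback(sample)


bus = DataBus()
readings = []
bus.publish("grid/voltage", {"bus_id": 7, "volts": 229.8})
bus.subscribe("grid/voltage", readings.append)   # late joiner gets last value
bus.publish("grid/voltage", {"bus_id": 7, "volts": 231.2})
```

    Because state lives in the bus rather than in any one node, a failed participant can rejoin and resynchronise without a central broker being a single point of failure.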

  12. KTM Tokamak operation scenarios software infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, V.; Baystrukov, K.; Golobkov, Yu.; Ovchinnikov, A.; Meaentsev, A.; Merkulov, S.; Lee, A. [National Research Tomsk Polytechnic University, Tomsk (Russian Federation); Tazhibayeva, I.; Shapovalov, G. [National Nuclear Center (NNC), Kurchatov (Kazakhstan)

    2014-10-15

    One of the largest problems for tokamak devices such as the Kazakhstan Tokamak for Material Testing (KTM) is the development and execution of operation scenarios. Operation scenarios may change often, so a convenient hardware and software solution is required for scenario management and execution. Dozens of diagnostic and control subsystems with numerous configuration settings may be used in an experiment, so the subsystem configuration process must be automated to coordinate changes of related settings and to prevent errors. Most of the diagnostic and control subsystem software at KTM was unified using an extra software layer describing the hardware abstraction interface. The experiment sequence was described using a command language. The whole infrastructure was brought together by a universal communication protocol supporting various media, including Ethernet and serial links. The operation sequence execution infrastructure was used at KTM to carry out plasma experiments.
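    The pattern described above, a command language driving subsystems through a hardware-abstraction layer, might be sketched like this. The command names and subsystem interface are invented for illustration and are not the actual KTM software.

```python
# Illustrative sketch: operation scenarios as small command scripts
# executed against a hardware-abstraction layer.

class Subsystem:
    """Hardware abstraction: every subsystem exposes configure/start/stop."""
    def __init__(self, name):
        self.name, self.settings, self.running = name, {}, False

    def configure(self, key, value):
        self.settings[key] = value

    def start(self):
        self.running = True

    def stop(self):
        self.running = False


def run_scenario(script, subsystems):
    """Execute a scenario written in a tiny line-oriented command language."""
    for line in script.strip().splitlines():
        cmd, target, *args = line.split()
        sub = subsystems[target]
        if cmd == "SET":
            key, value = args
            sub.configure(key, value)
        elif cmd == "START":
            sub.start()
        elif cmd == "STOP":
            sub.stop()
        else:
            raise ValueError(f"unknown command: {cmd}")


subsystems = {"coil": Subsystem("coil"), "probe": Subsystem("probe")}
run_scenario("""
SET coil current 12kA
START coil
START probe
STOP probe
""", subsystems)
```

    Keeping scenarios as data rather than code is what allows them to be varied often without touching the subsystem software.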

  13. Scenario Based Network Infrastructure Planning

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    The paper presents a method for IT infrastructure planning that takes into account very long-term developments in usage. The method creates a scenario for a final, time-independent stage in the planning process and abstracts relevant modelling factors from available information...

  14. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or "open" BIM is to facilitate information exchange between different digital systems, models and tools. There has been progress towards data interoperability with the development of open standards and object-oriented models, such as Industry Foundation Classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues in the construction of rail infrastructure, presented in two case study reports, one from Australia and one from Malaysia. Each case study includes a description of the project, the application of BIM in the project, a discussion of the promised BIM interoperability solution, and identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits the utilisation of integrated BIM for horizontal infrastructure rail projects.

  15. Towards technical interoperability in telemedicine.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  16. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  17. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Laura González

    2016-08-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms: specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have developed some form of data protection legislation. This paper proposes solutions for monitoring and enforcing data protection laws within an e-government interoperability platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. the Enterprise Service Bus) as well as recognized security standards (e.g. the eXtensible Access Control Markup Language) and were fully prototyped using the SwitchYard ESB product.
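    The enforcement idea can be sketched as an interceptor on the service bus that checks each invocation against data-protection rules before routing it. The rule format, agency names and services below are invented for illustration; a real platform would evaluate XACML policies in a policy decision point.

```python
# Hedged sketch of policy enforcement on an ESB-style platform.
# Rules and names are invented, not the Uruguayan platform's API.

POLICIES = [
    # (agency, resource category, allowed purpose)
    ("tax_agency", "income_data", "tax_assessment"),
    ("health_agency", "medical_data", "treatment"),
]

def is_authorized(agency, category, purpose):
    """Policy decision point: allow only explicitly permitted combinations."""
    return (agency, category, purpose) in POLICIES

def mediate(request):
    """Policy enforcement point: runs before the bus routes the request."""
    if not is_authorized(request["agency"], request["category"],
                         request["purpose"]):
        return {"status": "denied", "reason": "data protection policy"}
    return {"status": "routed", "to": request["service"]}


ok = mediate({"agency": "health_agency", "category": "medical_data",
              "purpose": "treatment", "service": "ehr-query"})
blocked = mediate({"agency": "health_agency", "category": "medical_data",
                   "purpose": "marketing", "service": "ehr-query"})
```

    Placing the check in the mediation layer means every agency-to-agency call is screened uniformly, rather than trusting each service to enforce the law itself.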

  18. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    Science.gov (United States)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models for the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (data mining) both static and real-time data in order to find correlations between disparate pieces of information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (data fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services.
The main component of the interoperability model is the comprehensive semantic description of the information
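    The event-driven side of SOA 2.0, including the data-fusion step the abstract mentions, might be sketched as follows: a service subscribes to monitoring events and enriches each one with data from another source, without the producer knowing who consumes its data. The broker, event names and creepmeter values are illustrative only.

```python
# Sketch of SOA 2.0-style loose coupling: services react to events
# rather than calling each other directly. All names are invented.

class EventBroker:
    def __init__(self):
        self._handlers = {}

    def on(self, event_type, handler):
        self._handlers.setdefault(event_type, []).append(handler)

    def emit(self, event_type, payload):
        for handler in self._handlers.get(event_type, []):
            handler(payload)


fused = []

def fuse_with_creepmeter(seismic_event):
    # Data fusion: associate the seismic sample with creepmeter context.
    creep = {"station": seismic_event["station"], "creep_mm": 0.4}
    fused.append({**seismic_event, **creep})

broker = EventBroker()
broker.on("seismic.sample", fuse_with_creepmeter)
broker.emit("seismic.sample", {"station": "GE01", "magnitude": 2.1})
```

    Adding a new fusion or mining service is then a matter of registering another handler, with no change to the data producers.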

  19. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

    Interoperability remains a significant burden for the developers of Internet of Things systems, because IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. In addition, owing to the lack of globally accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status, and information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for the semantic annotation of data from heterogeneous IoT devices is proposed. The Resource Description Framework (RDF), a semantic web framework, is used to relate things using triples, making the data semantically meaningful; RDF-annotated patient data thus becomes semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used the Tableau, Gruff 6.2.0, and MySQL tools.
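    The triple-based annotation the abstract describes can be illustrated with plain tuples standing in for an RDF store and a pattern matcher standing in for a SPARQL SELECT; the identifiers below are invented for the example.

```python
# Patient observations as (subject, predicate, object) triples,
# queried by pattern matching in the spirit of SPARQL.

triples = [
    ("patient:42", "hasDevice", "device:bp-cuff-7"),
    ("device:bp-cuff-7", "measures", "obs:bp-2024-01-05"),
    ("obs:bp-2024-01-05", "systolic_mmHg", "138"),
    ("obs:bp-2024-01-05", "observedBy", "physician:dr-lee"),
]

def match(pattern):
    """Return triples matching a pattern; None acts as a SPARQL variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which observations did dr-lee make?" (analogous to a SPARQL SELECT)
obs = [s for s, _, _ in match((None, "observedBy", "physician:dr-lee"))]
```

    Because every device's output is reduced to the same triple shape, data from heterogeneous devices can be joined by shared identifiers instead of bespoke format converters.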

  20. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in the implementations of those solutions introduce a lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  1. On MDA-SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Cloud computing is one of the latest technologies assuring reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. Clouds operated by different service providers working in collaboration can open up far more space for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various interoperability scenarios. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability using these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud computing and the concepts and benefits of MDA and SOA are discussed in the paper. The paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications that require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  2. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud-based application integration and interoperability, including integration and interoperability with legacy infrastructure

  3. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  4. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  5. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.; DeBlasio, R.

    2010-04-01

    The IEEE American National Standards project P2030TM addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  6. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project has been established as a project of the European Defence Agency, based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between camp protection systems and their equipment from the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the HMI, Human Machine Interface) without costly and time-consuming validation and testing at system level (validation and testing can be done at equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the guideline document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  7. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.

  8. Case studies of scenario analysis for adaptive management of natural resource and infrastructure systems

    DEFF Research Database (Denmark)

    Hamilton, M.C.; Thekdi, S.A.; Jenicek, E.M.

    2013-01-01

    Management of natural resources and infrastructure systems for sustainability is complicated by uncertainties in the human and natural environment. Moreover, decisions are further complicated by contradictory views, values, and concerns that are rarely made explicit. Scenario analysis can play...... of emergent conditions and help to avoid regret and belated action. The purpose of this paper is to present several case studies in natural resources and infrastructure systems management where scenario analysis has been used to aide decision making under uncertainty. The case studies include several resource...... and infrastructure systems: (1) water resources (2) land-use corridors (3) energy infrastructure, and (4) coastal climate change adaptation. The case studies emphasize a participatory approach, where scenario analysis becomes a means of incorporating diverse stakeholder concerns and experience. This approach...

  9. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and a description of the demand response automation server (DRAS), the client/server architecture-based middleware used to automate the interactions between utilities, or any DR-serving entity, and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between the utility/ISO and the clients at the facilities.
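    The DRAS role sketched above, translating a published DR event into a local facility action, can be illustrated in a few lines; the event fields and shed strategies below are hypothetical placeholders, not the actual DRAS message schema.

    ```python
    import json

    # Hypothetical DR event payload as a DR server might publish it;
    # the field names are illustrative only.
    event_json = '{"event_id": "dr-2009-07-15", "mode": "moderate", "start": "14:00", "duration_h": 3}'

    # Illustrative mapping from DR mode to a facility load-shed action.
    SHED_STRATEGY = {
        "normal": "no action",
        "moderate": "raise cooling setpoint 2F",
        "high": "raise cooling setpoint 4F and dim lighting",
    }

    def handle_event(raw):
        """Translate a published DR event into a local control action."""
        event = json.loads(raw)
        return SHED_STRATEGY.get(event["mode"], "no action")

    print(handle_event(event_json))  # raise cooling setpoint 2F
    ```

    The point of automating this translation is the one made in the abstract: pre-programmed responses make participation repeatable without a human in the loop for each event.
    
    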

  10. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    Science.gov (United States)

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  11. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as an implementation of a multidisciplinary and participatory research infrastructure, and a possible roadmap is introduced for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  12. Risk analysis of underground infrastructures in urban areas

    International Nuclear Information System (INIS)

    Cagno, Enrico; De Ambroggi, Massimiliano; Grande, Ottavio; Trucco, Paolo

    2011-01-01

    The paper presents an integrated approach for vulnerability and resilience analysis for underground infrastructures, i.e. a societal risk analysis of the failures of underground services for an urban area. The approach is based on the detailed study of (1) domino-effects for the components of a single infrastructure and for a given set of infrastructures interoperated and/or belonging to the same area; (2) risk and vulnerability analysis of a given area; (3) identification of a set of intervention guidelines, in order to improve the overall system resilience. The use of an integrated (interoperability and area) approach, breaking down the analysis area extent into sub-areas and assessing the dependencies among sub-areas both in terms of interoperability and damage propagation of critical infrastructures, demonstrates a useful advantage in terms of resilience analysis, more consistent with the 'zoned' nature of failures of the underground infrastructures. An applied case, describing the interoperability and damage propagation analysis with the evaluation of time-dependency for the infrastructures and targets and of different kinds of interventions of the underground infrastructures of a town, is presented for this purpose.
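    The domino-effect analysis described in this record amounts to propagating failures over a dependency graph of interoperated infrastructures; a minimal breadth-first sketch, with invented services and dependencies:

    ```python
    from collections import deque

    # Illustrative dependency map: service -> services that depend on it.
    # The nodes and edges are invented for the sketch.
    DEPENDENTS = {
        "power": ["water", "metro"],
        "water": ["district_heating"],
        "metro": [],
        "district_heating": [],
        "gas": ["district_heating"],
    }

    def propagate_failure(initial):
        """Return all services reached by cascading (domino) failures."""
        failed = set(initial)
        queue = deque(initial)
        while queue:
            svc = queue.popleft()
            for dep in DEPENDENTS.get(svc, []):
                if dep not in failed:
                    failed.add(dep)
                    queue.append(dep)
        return failed

    print(sorted(propagate_failure(["power"])))
    # ['district_heating', 'metro', 'power', 'water']
    ```

    The paper's area-based refinement would additionally weight each edge by sub-area and propagation time, which this sketch omits.
    
    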

  13. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  14. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
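    The semantic validation idea in this abstract, checking that XML elements carry registered common-data-element annotations, can be sketched with the standard library; the element names and concept identifiers below are invented, not taken from the paper's system.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical mapping of XML elements to ontology concept IDs
    # (common data elements); the identifiers are invented.
    ANNOTATIONS = {
        "patientAge": "NCIt:C25150",
        "tumorGrade": "NCIt:C28076",
    }

    doc = ET.fromstring(
        "<record><patientAge>63</patientAge><tumorGrade>II</tumorGrade></record>"
    )

    def semantic_check(root, annotations):
        """Report elements lacking a registered concept mapping."""
        return [el.tag for el in root.iter()
                if el is not root and el.tag not in annotations]

    print(semantic_check(doc, ANNOTATIONS))  # []
    ```

    An empty report means every element is linked to a shared concept, which is what lets two schema-compatible sources interpret each other's data, not just parse it.
    
    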

  15. Semantically Interoperable XML Data

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

  16. PACS/information systems interoperability using Enterprise Communication Framework.

    Science.gov (United States)

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  17. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does the Grid-oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as the relations between computational grid and
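    The OGC Web services discussed above are typically invoked through standard key-value-pair requests; a minimal sketch of building a GetCapabilities URL, where the host and path are placeholders:

    ```python
    from urllib.parse import urlencode, urlunsplit

    def capabilities_url(host, path, service="WMS", version="1.3.0"):
        """Build an OGC GetCapabilities request URL (KVP encoding).

        service, version and request are the standard OGC KVP
        parameters; the endpoint below is a placeholder, not a
        real enviroGRIDS service."""
        query = urlencode({"service": service, "version": version,
                           "request": "GetCapabilities"})
        return urlunsplit(("https", host, path, query, ""))

    print(capabilities_url("example.org", "/geoserver/ows"))
    # https://example.org/geoserver/ows?service=WMS&version=1.3.0&request=GetCapabilities
    ```

    The capabilities document returned by such a request is what lets a Grid-side component discover which layers and operations a geospatial service exposes before scheduling work against it.
    
    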

  18. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  19. Intercloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  20. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    Science.gov (United States)

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform carried out to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governing aspects of the cloud infrastructure and the brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l´Ambiente dell´Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs such a system. The presentation will introduce and discuss the technological barriers to interoperability as well as the social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  1. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  2. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  3. Infrastructuring When You Don’t

    DEFF Research Database (Denmark)

    Bolmsten, Johan; Dittrich, Yvonne

    2011-01-01

    infrastructures. Such infrastructures enable integration between different applications and tasks but, at the same time, introduce constraints to ensure interoperability. How can the advantages of End-User Development be kept without jeopardizing the integration between different applications? The article...

  4. Enabling interoperability-as-a-service for connected IoT infrastructures and Smart Objects

    DEFF Research Database (Denmark)

    Hovstø, Asbjørn; Guan, Yajuan; Quintero, Juan Carlos Vasquez

    2018-01-01

    Lack of interoperability is considered as the most important barrier to achieve the global integration of Internet-of-Things (IoT) ecosystems across borders of different disciplines, vendors and standards. Indeed, the current IoT landscape consists of a large set of non-interoperable infrastructu...

  5. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    Science.gov (United States)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purposes of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of these individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them.

  6. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    Directory of Open Access Journals (Sweden)

    R. Bera

    2015-05-01

    Full Text Available To-date, the commonest way to deal with geographical information and processes still appears to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (and among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing by making an intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, which is the SDI set up for researchers’ needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.

  7. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    Science.gov (United States)

    Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.

    2015-05-01

    To-date, the commonest way to deal with geographical information and processes still appears to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (and among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing by making an intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS which is the SDI set up for researchers' needs in our department. It is based on the existing open source, modular and interoperable Spatial Data Architecture geOrchestra.

  8. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    OpenAIRE

    González, Laura; Echevarría, Andrés; Morales, Dahiana; Ruggia, Raúl

    2016-01-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have ...

  9. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  10. District-Scale Green Infrastructure Scenarios for the Zidell Development Site, City of Portland

    Science.gov (United States)

    The report outlines technical assistance to develop green infrastructure scenarios for the Zidell Yards site consistent with the constraints of a recently remediated brownfield that can be implemented within a 15-20 year time horizon.

  11. Data and Mined-Knowledge Interoperability in eHealth Systems

    OpenAIRE

    Sartipi, Kamran; Najafi, Mehran; Kazemzadeh, Reza S.

    2008-01-01

    Current healthcare infrastructures in the advanced societies can not fulfil the demands for quality public health services which are characterized by patient-centric, seamless interoperation of heterogeneous healthcare systems, and nation-wide electronic health record services. Consequently, the governments and healthcare institutions are embracing new information and communication technologies to provide the necessary infrastructures for healthcare and medical services. In this chapter, we a...

  12. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    Science.gov (United States)

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  13. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  14. Lowering Entry Barriers for Multidisciplinary Cyber(e)-Infrastructures

    Science.gov (United States)

    Nativi, S.

    2012-04-01

    Multidisciplinarity is increasingly important for studying the Earth System and addressing Global Changes. To achieve this, multidisciplinary cyber(e)-infrastructures are an important instrument. In recent years, several European, US and international initiatives have been launched to build multidisciplinary infrastructures, including: the Spatial Information in the European Community (INSPIRE), the Global Monitoring for Environment and Security (GMES), the Data Observation Network for Earth (DataONE), and the Global Earth Observation System of Systems (GEOSS). The majority of these initiatives are developing service-based digital infrastructures, asking scientific Communities (i.e. disciplinary Users and data Producers) to implement a set of standards for information interoperability. For scientific Communities, this has represented an entry barrier which, in several cases, has proved to be high. In fact, neither data Producers nor Users seem willing to invest precious resources to become experts in interoperability solutions -on the contrary, they are focused on developing disciplinary and thematic capacities. Therefore, an important research topic is lowering the entry barriers for joining multidisciplinary cyber(e)-Infrastructures. This presentation will introduce a new approach to achieve multidisciplinary interoperability, underpinning multidisciplinary infrastructures and lowering the present entry barriers for both Users and data Producers. This is called the Brokering approach: it extends the service-based paradigm by introducing a new Brokering layer, or cloud, which is in charge of managing all the interoperability complexity (e.g. data discovery, access, and use), thus easing Users' and Producers' burden. This approach was successfully tested in the framework of several European FP7 Projects and in GEOSS.
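
The Brokering approach can be illustrated with a minimal sketch: a mediation layer that hides the differing discovery interfaces of disciplinary catalogues behind a single call, harmonising the results. All class and field names below are invented for illustration; they are not the actual GEOSS or broker components.

```python
# Sketch of the brokering idea: a mediation layer hides heterogeneous
# catalogue interfaces behind one discovery call.
# All names here are illustrative, not actual GEOSS components.

class OpenSearchCatalogue:
    def search(self, terms):              # native interface A
        return [{"title": f"{terms} dataset (OpenSearch)"}]

class CswCatalogue:
    def get_records(self, query):         # native interface B
        return [{"Title": f"{query} record (CSW)"}]

class DiscoveryBroker:
    """Mediates between a user query and heterogeneous catalogues."""
    def __init__(self):
        self.adapters = []

    def register(self, adapter):
        self.adapters.append(adapter)

    def discover(self, terms):
        results = []
        for adapter in self.adapters:
            results.extend(adapter(terms))
        return results

broker = DiscoveryBroker()
broker.register(lambda t: OpenSearchCatalogue().search(t))
broker.register(lambda t: [{"title": r["Title"]}          # harmonise keys
                           for r in CswCatalogue().get_records(t)])

hits = broker.discover("sea surface temperature")
```

The point of the design is that neither the User nor either catalogue needs to know about the other's interface; only the broker's adapters carry that complexity.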

  15. Transportation Energy Futures Series: Alternative Fuel Infrastructure Expansion: Costs, Resources, Production Capacity, and Retail Availability for Low-Carbon Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Steward, Darlene [National Renewable Energy Lab. (NREL), Golden, CO (United States); Vimmerstedt, Laura [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Webster, Karen W. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-04-01

    The petroleum-based transportation fuel system is complex and highly developed, in contrast to the nascent low-petroleum, low-carbon alternative fuel system. This report examines how expansion of the low-carbon transportation fuel infrastructure could contribute to deep reductions in petroleum use and greenhouse gas (GHG) emissions across the U.S. transportation sector. Three low-carbon scenarios, each using a different combination of low-carbon fuels, were developed to explore infrastructure expansion trends consistent with a study goal of reducing transportation sector GHG emissions to 80% less than 2005 levels by 2050. These scenarios were compared to a business-as-usual (BAU) scenario and were evaluated with respect to four criteria: fuel cost estimates, resource availability, fuel production capacity expansion, and retail infrastructure expansion.

  16. The GIIDA (Management of the CNR Environmental Data for Interoperability) project

    Science.gov (United States)

    Nativi, S.

    2009-04-01

    This work presents the GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali del CNR) inter-departmental project of the Italian National Research Council (CNR). The project is an initiative of the Earth and Environment Department (Dipartimento Terra e Ambiente) of the CNR. The GIIDA mission is "To implement the Spatial Information Infrastructure (SII) of CNR for Environmental and Earth Observation data". The project aims to design and develop a multidisciplinary cyber-infrastructure for the management, processing and evaluation of Earth and environmental data. This infrastructure will contribute to the Italian presence in international projects and initiatives, such as: INSPIRE, GMES, GEOSS and SEIS. The main GIIDA goals are: • Networking: To create a network of CNR Institutes for implementing a common information space and sharing spatial resources. • Observation: Re-engineering the environmental observation system of CNR. • Modeling: Re-engineering the environmental modeling system of CNR. • Processing: Re-engineering the environmental processing system of CNR. • Mediation: To define mediation methods and instruments for implementing the international interoperability standards. The project started in July 2008, releasing a specification document of the GIIDA architecture for interoperability and security. Based on these documents, a Call for Proposals was issued in September 2008. GIIDA received 23 proposed pilots from 16 different Institutes belonging to five CNR Departments and from 15 non-CNR Institutions (e.g. three Italian regional administrations, three national research centers, four universities, and some SMEs). These pilots were divided into thematic areas. In fact, GIIDA considers seven main thematic areas/domains: • Biodiversity; • Climate Changes; • Air Quality; • Soil and Water Quality; • Risks; • Infrastructures for Research and Public Administrations; • Sea and Marine resources. Each of these thematic areas is covered by a

  17. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems serving discrete, and often discipline-specific, user communities. The many marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems.
The potential impact of implementing these solutions for

  18. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2016-04-01

    The increasingly ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN) with each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions often using different standards, data formats, technologies etc. that make integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential

  19. Analysis of Jordan's Proposed Emergency Communication Interoperability Plan (JECIP) for Disaster Response

    National Research Council Canada - National Science Library

    Alzaghal, Mohamad H

    2008-01-01

    ... country. It is essential to build a robust and interoperable Information and Communication Technology (ICT) infrastructure before a disaster, which will facilitate patching, restoring and reconstructing it during and after the disaster...

  20. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    Full Text Available The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.
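
The validate-then-execute role of such a process broker can be sketched as a registry of processing resources, a validation step, and a controller that runs a workflow as a chain of steps. The step names and functions below are hypothetical, not the broker's actual API.

```python
# Illustrative sketch of a process broker that validates and executes a
# geospatial-modelling workflow as a chain of processing steps.
# All step names and functions are invented for illustration.

def reproject(data):
    return {**data, "crs": "EPSG:4326"}

def resample(data):
    return {**data, "resolution_m": 1000}

def run_model(data):
    return {**data, "output": "habitat_suitability"}

# The broker's registry of known, reusable processing resources.
REGISTRY = {"reproject": reproject, "resample": resample, "model": run_model}

def validate(workflow):
    """A workflow is valid only if every step names a known resource."""
    return all(step in REGISTRY for step in workflow)

def execute(workflow, data):
    """Run the steps in order, each consuming the previous step's output."""
    for step in workflow:
        data = REGISTRY[step](data)
    return data

wf = ["reproject", "resample", "model"]
assert validate(wf)
result = execute(wf, {"source": "temperature_grid"})
```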

  1. Managing Interoperability for GEOSS - A Report from the SIF

    Science.gov (United States)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  2. VADMC: The Infrastructure

    Directory of Open Access Journals (Sweden)

    Le Sidaner Pierre

    2012-09-01

    Full Text Available The Virtual Atomic and Molecular Data Centre (VAMDC; http://www.vamdc.eu) is a European-Union-funded collaboration between several groups involved in the generation, evaluation, and use of atomic and molecular data. VAMDC aims at building a secure, documented, flexible and interoperable e-Science environment-based interface to existing atomic and molecular databases. The global infrastructure of this project uses technologies derived from the International Virtual Observatory Alliance (IVOA). The infrastructure, as well as the first database prototypes, will be described.

  3. Using a CRIS for e-Infrastructure: e-Infrastructure for Scholarly Publications

    Directory of Open Access Journals (Sweden)

    E Dijk

    2010-05-01

    Full Text Available Scholarly publications are a major part of the research infrastructure. One way to make output available is to store the publications in Open Access Repositories (OAR). A Current Research Information System (CRIS) that conforms to the standard CERIF (Common European Research Information Format) could be a key component in the e-infrastructure. A CRIS provides the structure and makes it possible to interoperate the CRIS metadata at every stage of the research cycle. The international DRIVER projects are creating a European repository infrastructure. Knowledge Exchange has launched a project to develop a metadata exchange format for publications between CRIS and OAR systems.

  4. EC verifications of the infrastructure subsystem; EG-Pruefungen im Teilsystem Infrastruktur

    Energy Technology Data Exchange (ETDEWEB)

    Koeppel, M. [EISENBAHN-CERT, Bonn (Germany)

    2006-07-15

    With the entry into force of the TSIs (technical specifications for the interoperability) of the high-speed railway system, a new procedure has been created in Germany for authorizing railway infrastructures to enter service. It involves the nominated body for interoperability, Eisenbahn-Cert (EBC), in a role which inserts it ahead of the Federal Railway Authority (EBA). This article discusses the EC verification procedure on the basis of the experience already accumulated with it and the cases pending at present. It also deals with substantive rules concerning the infrastructure TSIs, comparing them with the national equivalents, and takes a look into the future as regards the rules that are going to be necessary for attaining interoperability for both high-speed railway systems and conventional ones. (orig.)

  5. Reflections on the role of open source in health information system interoperability.

    Science.gov (United States)

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  6. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interfaces are not machine-friendly, (3) they use non-standard formats for data input and output, (4) they do not exploit standards to define application interfaces and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench. 
The Java program functions as a web services engine and interoperates
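
Document-style messaging of the kind described can be illustrated with a minimal sketch of building and parsing an XML-encoded request. The element names are invented for illustration and do not reflect the actual HAPI service schema.

```python
# Building and parsing a document-style (XML-encoded) request of the kind
# exchanged in such a web-services workflow. Element names are illustrative,
# not the actual HAPI schema.
import xml.etree.ElementTree as ET

def build_request(spot_ids):
    """Encode a list of microarray spot IDs as an XML document."""
    root = ET.Element("hapiRequest")
    for sid in spot_ids:
        ET.SubElement(root, "spotId").text = sid
    return ET.tostring(root, encoding="unicode")

def parse_request(xml_text):
    """Recover the spot IDs from the XML document on the receiving side."""
    root = ET.fromstring(xml_text)
    return [e.text for e in root.findall("spotId")]

msg = build_request(["AA026906", "AA411235"])
ids = parse_request(msg)
```

Because the whole payload is a self-describing XML document rather than an HTML page, any consumer that knows the schema can process it mechanically.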

  7. Data interoperability between European Environmental Research Infrastructures and their contribution to global data networks

    Science.gov (United States)

    Kutsch, W. L.; Zhao, Z.; Hardisty, A.; Hellström, M.; Chin, Y.; Magagna, B.; Asmi, A.; Papale, D.; Pfeil, B.; Atkinson, M.

    2017-12-01

    Environmental Research Infrastructures (ENVRIs) are expected to become important pillars not only for supporting their own scientific communities, but also (a) for inter-disciplinary research and (b) for the European Earth Observation Program Copernicus as a contribution to the Global Earth Observation System of Systems (GEOSS) or global thematic data networks. As such, it is very important that the data-related activities of the ENVRIs are well integrated. This requires common policies, models and e-infrastructure to optimise technological implementation, define workflows, and ensure coordination, harmonisation, integration and interoperability of data, applications and other services. The key is interoperating common metadata systems (utilising a richer metadata model as the 'switchboard' for interoperation, with formal syntax and declared semantics). The metadata characterises data, services, users and ICT resources (including sensors and detectors). The European cluster project ENVRIplus has developed a reference model (ENVRI RM) for a common data infrastructure architecture to promote interoperability among ENVRIs. The presentation will provide an overview of recent progress and give examples of the integration of ENVRI data in global integration networks.
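
The metadata 'switchboard' idea can be sketched as a mapping step: records from two infrastructures, each with its own field names, are translated to one common model before exchange. All field names and records below are invented for illustration.

```python
# Minimal sketch of a metadata 'switchboard': heterogeneous records are
# mapped onto one common metadata model before exchange.
# Field names and records are invented for illustration.

COMMON_FIELDS = ("title", "parameter", "station")

# Per-infrastructure crosswalks from native field names to the common model.
MAPPINGS = {
    "infra_a": {"dsTitle": "title", "variable": "parameter", "site": "station"},
    "infra_b": {"name": "title", "obs_property": "parameter", "platform": "station"},
}

def to_common(record, source):
    """Translate a native record into the common metadata model."""
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

rec_a = to_common({"dsTitle": "CO2 flux", "variable": "co2", "site": "HYY"}, "infra_a")
rec_b = to_common({"name": "SST series", "obs_property": "temp", "platform": "buoy-1"}, "infra_b")
```

Once both records speak the common model, a global integrator only has to understand one schema instead of one per infrastructure.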

  8. Ontologies for interaction : enabling serendipitous interoperability in smart environments

    NARCIS (Netherlands)

    Niezen, G.

    2012-01-01

    The thesis describes the design and development of an ontology and software framework to support user interaction in ubiquitous computing scenarios. The key goal of ubiquitous computing is "serendipitous interoperability", where devices that were not necessarily designed to work together should be

  9. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    Science.gov (United States)

    Schaap, Dick M. A.; Glaves, Helen

    2016-04-01

    Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these developments are resulting in standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. The project has recently been continued as ODIP II for another three years with EU Horizon 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. Therefore it facilitates an organised dialogue between the key infrastructure representatives by means of publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and

  10. Providing trust and interoperability to federate distributed biobanks.

    Science.gov (United States)

    Lablans, Martin; Bartholomäus, Sebastian; Uckert, Frank

    2011-01-01

    Biomedical research requires large numbers of well-annotated, quality-assessed samples which often cannot be provided by a single biobank. Connecting biobanks, researchers and service providers raises numerous challenges, including trust among partners and towards the infrastructure, as well as interoperability problems. Therefore we are developing a holistic, open-source and easy-to-use IT infrastructure. Our federated approach allows partners to reflect their organizational structures and protect their data sovereignty. The search service and the contact arrangement processes increase data sovereignty without attaching stigma to rejecting a specific cooperation. The infrastructure supports daily processes with an integrated basic sample manager and user-definable electronic case report forms. Interfaces for existing IT systems avoid re-entering of data. Moreover, resource virtualization is supported to make underutilized resources of some partners accessible to those with insufficient equipment, for mutual benefit. The functionality of the resulting infrastructure is outlined in a use case to demonstrate collaboration within a translational research network. Compared to other existing or upcoming infrastructures, our approach ultimately has the same goals, but relies on gentle incentives rather than top-down imposed progress.
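
The federated-search pattern that preserves data sovereignty can be sketched as follows: each biobank answers a query locally and returns only an aggregate count, so sample-level data never leaves the partner site. All site names and sample data below are invented.

```python
# Sketch of federated biobank search: only aggregate counts cross the
# site boundary, never the samples themselves.
# Site names and sample records are invented for illustration.

LOCAL_SAMPLES = {
    "biobank_a": [{"diagnosis": "C34", "qc": "passed"},
                  {"diagnosis": "C50", "qc": "passed"}],
    "biobank_b": [{"diagnosis": "C34", "qc": "failed"},
                  {"diagnosis": "C34", "qc": "passed"}],
}

def local_count(site, diagnosis):
    """Executed at the partner site: count quality-assessed matches."""
    return sum(1 for s in LOCAL_SAMPLES[site]
               if s["diagnosis"] == diagnosis and s["qc"] == "passed")

def federated_search(diagnosis):
    """Executed by the federation: collect per-site counts only."""
    return {site: local_count(site, diagnosis) for site in LOCAL_SAMPLES}

counts = federated_search("C34")
```

A researcher seeing the counts can then start a contact arrangement with a specific site, while each partner retains full control over its sample-level data.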

  11. Introduction to the CLARIN Technical Infrastructure

    NARCIS (Netherlands)

    Odijk, Jan

    2017-01-01

    This chapter provides an introduction to the design of the CLARIN technical infrastructure, with a focus on the Netherlands part. It provides a basic introduction to the techniques behind PIDs, CMDI-metadata, authentication and authorisation (AAI), semantic interoperability related to CMDI-metadata,

  12. Towards E-Society Policy Interoperability

    Science.gov (United States)

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  13. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
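
The kind of RESTful access described can be sketched by assembling a request URL for an image-service export. The endpoint and parameter names below are hypothetical, chosen only to illustrate the pattern, and are not the actual Climate.gov or CPC services.

```python
# Sketch of a client assembling a request against a RESTful scientific
# data service. The endpoint and parameters are hypothetical.
from urllib.parse import urlencode

BASE = "https://example.org/arcgis/rest/services/climate/ImageServer/exportImage"

def build_query(bbox, time_range, fmt="tiff"):
    """Encode a spatial/temporal subset request as a REST query string."""
    params = {
        "bbox": ",".join(str(v) for v in bbox),  # xmin,ymin,xmax,ymax
        "time": time_range,                      # start,end
        "format": fmt,                           # desired output format
        "f": "json",                             # response encoding
    }
    return f"{BASE}?{urlencode(params)}"

url = build_query((-125, 24, -66, 50), "2014-01-01,2014-12-31")
```

The appeal of this style is that the same self-describing query works from a browser, a script, or a JavaScript mapping client, with no service-specific binding code.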

  14. Definition and implementation of a SAML-XACML profile for authorization interoperability across grid middleware in OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele; Alderman, Ian; Altunay, Mine; Anathakrishnan, Rachana; Bester, Joe; Chadwick, Keith; Ciaschini, Vincenzo; Demchenko, Yuri; Ferraro, Andrea; Forti, Alberto; Groep, David; /Fermilab /NIKHEF, Amsterdam /Brookhaven /Amsterdam U. /SWITCH, Zurich /Bergen U. /INFN, CNAF /Argonne /Wisconsin U., Madison

    2009-04-01

    In order to ensure interoperability between middleware and authorization infrastructures used in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects, an Authorization Interoperability activity was initiated in 2006. The interoperability goal was met in two phases: first, agreeing on a common authorization query interface and protocol with an associated profile that ensures standardized use of attributes and obligations; and second, implementing, testing, and deploying, on OSG and EGEE, middleware that supports the interoperability protocol and profile. The activity has involved people from OSG, EGEE, the Globus Toolkit project, and the Condor project. This paper presents a summary of the agreed-upon protocol, profile and the software components involved.
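
The authorization query pattern the profile standardises can be reduced to: a service asks "may subject S perform action A on resource R?" and receives a decision plus obligations. The sketch below is a much-simplified, non-XML rendering of that idea; the attribute names, policy, and obligation are invented and do not reproduce the actual SAML-XACML profile.

```python
# Much-simplified sketch of the standardized authorization query:
# decision + obligations for (subject, resource, action).
# Policy content and attribute names are invented for illustration.

POLICY = {
    ("/storage/experiment-data", "read"): {
        "allowed_vos": {"cms", "atlas"},           # virtual organisations
        "obligations": [{"id": "map-to-local-account",
                         "value": "griduser01"}],  # e.g. local account mapping
    },
}

def authorize(subject_vo, resource, action):
    """Return a Permit/Deny decision and any obligations to enforce."""
    rule = POLICY.get((resource, action))
    if rule and subject_vo in rule["allowed_vos"]:
        return {"decision": "Permit", "obligations": rule["obligations"]}
    return {"decision": "Deny", "obligations": []}

resp = authorize("cms", "/storage/experiment-data", "read")
```

The obligation mechanism is what lets a Permit decision also tell the enforcing middleware what it must do (here, map the grid identity to a local account) before granting access.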

  15. Towards a Unified Global ICT Infrastructure

    DEFF Research Database (Denmark)

    Madsen, Ole Brun

    2006-01-01

    A successful evolution towards a unified global WAN platform allowing for the coexistence and interoperability of all kind of services requires careful planning of the next generation global cooperative wired and wireless information infrastructure. The absence of commonly agreed upon and adopted...... to be solved can be found in the interrelation between communication, connectivity and convergence. This paper will focus on steps to be taken in planning the physical infrastructure as a prerequisite for a successful evolution....

  16. Public Key Infrastructure (PKI) Interoperability: A Security Services Approach to Support Transfer of Trust

    National Research Council Canada - National Science Library

    Hansen, Anthony

    1999-01-01

    .... This thesis defines interoperability as the capacity to support trust through retention of security services across PKI domains at a defined level of assurance and examines the elements of PKI...

  17. Development Model for Research Infrastructures

    Science.gov (United States)

    Wächter, Joachim; Hammitzsch, Martin; Kerschke, Dorit; Lauterjung, Jörn

    2015-04-01

    • The maturity of individual scientific domains differs considerably.
    • Technologically and organisationally, many different RI components have to be integrated. Individual systems are often complex and have a long-term history. Existing approaches are at different maturity levels, e.g. in relation to the standardisation of interfaces.
    • The concrete implementation process consists of independent and often parallel development activities. In many cases no detailed architectural blueprint for the envisioned system exists.
    • Most of the funding currently available for RI implementation is provided on a project basis.
    To increase the synergies in infrastructure development, the authors propose a specific RI Maturity Model (RIMM) that is specifically qualified for open system-of-systems environments. RIMM is based on the concepts of Capability Maturity Models for organisational development, concretely the Levels of Conceptual Interoperability Model (LCIM), specifying the technical, syntactical, semantic, pragmatic, dynamic, and conceptual layers of interoperation [1]. The model is complemented by the identification and integration of growth factors (according to the Nolan Stages Theory [2]). These factors include supply and demand factors. Supply factors comprise available resources, e.g. data, services and IT-management capabilities, including organisations and IT personnel. Demand factors are the overall application portfolio for RIs, but also the skills and requirements of the scientists and communities using the infrastructure. RIMM thus enables a balanced development process of RIs and RI components by evaluating the status of the supply and demand factors in relation to specific levels of interoperability.
    [1] Tolk, A., Diallo, A., Turnitsa, C. (2007): Applying the Levels of Conceptual Interoperability Model in Support of Integratability, Interoperability, and Composability for System-of-Systems Engineering. Systemics, Cybernetics and Informatics, Volume 5 - Number 5.
    [2

  18. An application of ETICS Co-Scheduling Mechanism to Interoperability and Compliance Validation of Grid Services

    CERN Document Server

    Ronchieri, Elisabetta; Diez-andino Sancho, Guillermo; DI Meglio, Alberto; Marzolla, Moreno

    2008-01-01

    Grid software projects require infrastructures in order to evaluate interoperability with other projects and compliance with predefined standards. Interoperability and compliance are quality attributes expected of all distributed projects. ETICS is designed to automate the investigation of this kind of problem. It integrates well-established procedures, tools and resources in a coherent framework and adapts them to the special needs of these projects. Interoperability and compliance with standards are important quality attributes of software developed for Grid environments, where many different parts of an interconnected system have to interact. Compliance with standards is one of the major factors in making sure that interoperating parts of a distributed system can actually interconnect and exchange information. Taking the case of the Grid environment (Foster and Kesselman, 2003), most of the projects that are developing software have not reached the maturity level of other communities yet and have di...

  19. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations' specific needs. As a result, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access. Early in the nineties, work on the interoperability of geographic information was undertaken to resolve syntactic, structural, and semantic heterogeneities, as well as spatial and temporal heterogeneities, to facilitate the sharing and integration of such data. Recently, we have proposed a new conceptual framework for the interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology that dynamically qualifies the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for the interoperability of geographic information as well as a prototype.

  20. Key Management Infrastructure Increment 2 (KMI Inc 2)

    Science.gov (United States)

    2016-03-01

    Infrastructure (KMI) is a unified, scalable, interoperable, and trusted infrastructure that provides net-centric key management services to systems that rely ...products to human users and devices (hereinafter referred to as "supported" or "security-enabled") to enable secure communications. The objectives for...Threshold met during Spiral 1 IOT&E and FOT&E. Connected Networks: Network Identification KMI products and services shall be provided to KMI clients via

  1. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    R. Roller

    2015-08-01

    Full Text Available Floods represent a permanent risk to the Netherlands in general and to her power supply in particular. Data sharing is essential in this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
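The core idea of Linked Data, joining facts from different stakeholders because they reference shared identifiers, can be sketched without any RDF tooling. All URIs and facts below are invented for illustration; a real deployment would use an RDF store and SPARQL rather than this toy in-memory pattern matcher:

```python
# Minimal in-memory triple store illustrating the Linked Data idea:
# facts contributed by different stakeholders share URIs and can be joined.
EX = "http://example.org/crisis#"  # hypothetical vocabulary namespace

triples = {
    (EX + "pump_station_7", EX + "poweredBy", EX + "substation_A"),  # water board
    (EX + "hospital_2",     EX + "poweredBy", EX + "substation_A"),  # health sector
    (EX + "pump_station_7", EX + "protects",  EX + "dike_ring_14"),  # water board
    (EX + "substation_A",   EX + "status",    "offline"),            # grid operator
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Cross-stakeholder join: which assets lose power when substation_A is offline?
affected = []
if match(EX + "substation_A", EX + "status", "offline"):
    affected = [s for s, _, _ in match(p=EX + "poweredBy", o=EX + "substation_A")]
print(sorted(a.rsplit("#", 1)[1] for a in affected))
```

The query crosses organisational boundaries without any bespoke system-to-system interface, which is exactly the interoperability benefit the paper attributes to Linked Data.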

  2. AN INTEROPERABLE ARCHITECTURE FOR AIR POLLUTION EARLY WARNING SYSTEM BASED ON SENSOR WEB

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Full Text Available Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and deriving valuable information from them are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, and to apply Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. It is possible to overcome interoperability challenges by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research

  3. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    Science.gov (United States)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and deriving valuable information from them are challenging issues in these systems from a technical and scientific point of view. The ever-increasing population growth in urban areas has caused certain problems in developing countries, which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, and to apply Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. It is possible to overcome interoperability challenges by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposed an
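In an SWE-based system like the one described, a client retrieves measurements through a standard Sensor Observation Service (SOS) request. The sketch below builds a GetObservation URL following the OGC SOS 2.0 KVP binding; the endpoint and identifiers are invented for illustration, and no network call is made:

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint; parameter names follow the OGC SOS 2.0 KVP binding.
SOS_ENDPOINT = "https://aq.example.org/sos"

def get_observation_url(offering, observed_property, start, end):
    """Build a GetObservation request for one pollutant over a time window."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        # temporal filter on the phenomenon time of the observations
        "temporalFilter": f"om:phenomenonTime,{start}/{end}",
    }
    return f"{SOS_ENDPOINT}?{urlencode(params)}"

url = get_observation_url("urn:station:downtown", "urn:pollutant:PM10",
                          "2013-09-01T00:00:00Z", "2013-09-02T00:00:00Z")
print(url)
```

Because every SWE-compliant service answers the same request shape, a monitoring client can query heterogeneous sensor networks without per-vendor adapters.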

  4. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork; as well as in the pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe, including in Canada, China, the USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranging from 7 to 11 days). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). Time was measured as the number of days it took for the participants to respond after we supplied them all the information they would need to successfully execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact since the

  5. Interoperable Cloud Networking for intelligent power supply; Interoperables Cloud Networking fuer intelligente Energieversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Invensys Operations Management, Foxboro, MA (United States)

    2010-09-15

    Intelligent power supply through a so-called Smart Grid will make it possible to control consumption by market-based pricing and signals for load reduction. This requires that both energy rates and energy information be distributed reliably and in real time to automation systems in domestic and other buildings and in industrial plants, over a wide geographic range and across the most varied grid infrastructures. Effective communication at this level of complexity requires computing and network resources that are normally only available in the data centers of large enterprises. Cloud computing technology, which is described here in some detail, has all the features needed to provide reliability, interoperability and efficiency for large-scale smart grid applications, at lower cost than traditional data centers. (orig.)
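The market-based control loop described above can be sketched as a simple demand-response rule: a broadcast price signal arrives, and the building automation system decides which deferrable loads to shed. The prices, loads, and threshold below are invented for illustration:

```python
# Toy demand-response rule: shed deferrable loads when the broadcast
# price exceeds a threshold. All figures are invented for illustration.
def plan_loads(price_per_kwh, loads, threshold=0.30):
    """Return the names of loads to keep running at the current price.
    Each load is a (name, kw, deferrable) tuple."""
    keep = []
    for name, kw, deferrable in loads:
        if deferrable and price_per_kwh > threshold:
            continue  # defer this load until the price drops
        keep.append(name)
    return keep

loads = [("hvac", 4.0, True),
         ("refrigerator", 0.2, False),
         ("ev_charger", 7.2, True)]

print(plan_loads(0.45, loads))  # high price: only essential loads stay on
print(plan_loads(0.12, loads))  # low price: everything runs
```

In the cloud-based architecture the article envisions, this decision logic would be fed by rate and load-reduction signals delivered to millions of such endpoints in real time.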

  6. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    OpenAIRE

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured at all levels of the interoperability pyramid. This includes common agreement about the visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the business processes and rules involved, the types and roles of the documents, a commonly understandable vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  7. A development framework for semantically interoperable health information systems.

    Science.gov (United States)

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, and to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured to integrate other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized into an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  8. Developing a grid infrastructure in Cuba

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Aldama, D.; Dominguez, M.; Ricardo, H.; Gonzalez, A.; Nolasco, E.; Fernandez, E.; Fernandez, M.; Sanchez, M.; Suarez, F.; Nodarse, F.; Moreno, N.; Aguilera, L.

    2007-07-01

    A grid infrastructure was deployed at the Centro de Gestion de la Informacion y Desarrollo de la Energia (CUBAENERGIA) in the frame of the EELA project and of a national initiative for developing a Cuban Network for Science. A stand-alone model was adopted to overcome connectivity limitations. The e-infrastructure is based on gLite-3.0 middleware and is fully compatible with the EELA infrastructure. Afterwards, the work focused on grid applications. The application GATE was deployed from the beginning for biomedical users. Later, two further applications were deployed on the local grid infrastructure: MOODLE for e-learning and AERMOD for assessment of the local dispersion of atmospheric pollutants. Additionally, our local grid infrastructure was made interoperable with a Java-based distributed system for bioinformatics calculations. This experience could be considered a suitable approach for national networks with weak Internet connections. (Author)

  9. Privacy-Preserving Data Aggregation Protocol for Fog Computing-Assisted Vehicle-to-Infrastructure Scenario

    Directory of Open Access Journals (Sweden)

    Yanan Chen

    2018-01-01

    Full Text Available Vehicle-to-infrastructure (V2I) communication enables moving vehicles to upload real-time data about the road surface situation to the Internet via fixed roadside units (RSU). Owing to the resource restrictions of mobile vehicles, the fog-computing-enhanced V2I communication scenario has received increasing attention recently. However, how to aggregate the sensed data from vehicles securely and efficiently remains an open problem in the V2I communication scenario. In this paper, a lightweight and anonymous aggregation protocol is proposed for the fog computing-based V2I communication scenario. With the proposed protocol, the data collected by the vehicles can be efficiently obtained by the RSU in a privacy-preserving manner. In particular, we first propose a certificateless aggregate signcryption (CL-A-SC) scheme and prove its security in the random oracle model. The proposed CL-A-SC scheme, which is of independent interest, achieves the merits of certificateless cryptography and signcryption schemes simultaneously. We then put forward the anonymous aggregation protocol for the V2I communication scenario as an extension of the CL-A-SC scheme. Security analysis demonstrates that the proposed aggregation protocol achieves the desired security properties. The performance comparison shows that the proposed protocol significantly reduces computation and communication overhead compared with the up-to-date protocols in this field.

  10. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
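The feature-level smoothing the authors propose can be illustrated with a minimal moving-average sketch. The actual techniques in the paper operate on hand images and richer feature sets; this toy version, with invented numbers, only shows how averaging damps a device-specific spike in a feature vector:

```python
def smooth_features(features, window=3):
    """Feature-level moving-average smoothing: one simple way to damp
    device-specific noise before matching templates across sensors."""
    half = window // 2
    out = []
    for i in range(len(features)):
        lo, hi = max(0, i - half), min(len(features), i + half + 1)
        out.append(sum(features[lo:hi]) / (hi - lo))  # average over the window
    return out

# e.g. a finger-width profile from a flat scanner with one noisy sample
scanner_profile = [10.0, 10.2, 14.0, 10.1, 10.3]
print(smooth_features(scanner_profile))
```

After smoothing, the spike at the third position is pulled toward its neighbours, so profiles captured by different devices become more directly comparable.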

  11. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  12. Making Network Markets in Education: The Development of Data Infrastructure in Australian Schooling

    Science.gov (United States)

    Sellar, Sam

    2017-01-01

    This paper examines the development of data infrastructure in Australian schooling with a specific focus on interoperability standards that help to make new markets for education data. The conceptual framework combines insights from studies of infrastructure, economic markets and digital data. The case of the Australian National Schools…

  13. Biodiversity information platforms: From standards to interoperability

    Directory of Open Access Journals (Sweden)

    Walter Berendsohn

    2011-11-01

    Full Text Available One of the most serious bottlenecks in the scientific workflows of the biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures, ranging from small desktop software solutions to large-scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for the biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical workhorses for numerous working processes related to the biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating the essential information flows needed for comprehensive data exchange, data indexing, web publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  14. Interoperability prototype between hospitals and general practitioners in Switzerland.

    Science.gov (United States)

    Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar

    2010-01-01

    Interoperability in data exchange has the potential to improve care processes and decrease the costs of the health care system. Many countries have eHealth initiatives in preparation or already implemented; in this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, which makes it more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several healthcare partners on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, medium-sized hospitals and general practitioners in particular were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.

  15. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems; interoperability between peers is therefore essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionality; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions The EHR is being deployed in several countries, and the EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, by developers to understand specification ambiguities, or to resolve implementation difficulties.

  16. Water Resources Sustainability in Northwest Mexico: Analysis of Regional Infrastructure Plans under Historical and Climate Change Scenarios

    Science.gov (United States)

    Che, D.; Robles-Morua, A.; Mayer, A. S.; Vivoni, E. R.

    2012-12-01

    The arid state of Sonora, Mexico, has embarked on a large water infrastructure project to provide additional water supply and improved sanitation to the growing capital of Hermosillo. The main component of the Sonora SI project involves an interbasin transfer from rural to urban water users that has generated conflicts over water among different social sectors. Through interactions with regional stakeholders from agricultural and water management agencies, we ascertained the need for a long-term assessment of the water resources of one of the system components, the Sonora River Basin (SRB). A semi-distributed, daily watershed model that includes current and proposed reservoir infrastructure was applied to the SRB. This simulation framework allowed us to explore alternative scenarios of water supply from the SRB to Hermosillo under historical (1980-2010) and future (2031-2040) periods that include the impact of climate change. We compared three precipitation forcing scenarios for the historical period: (1) a network of ground observations from Mexican water agencies; (2) gridded fields from the North America Land Data Assimilation System (NLDAS) at 12 km resolution; and (3) gridded fields from the Weather Research and Forecasting (WRF) model at 10 km resolution. These were compared to daily historical observations at two stream gauging stations and two reservoirs to generate confidence in the simulation tools. We then tested the impact of climate change through the use of the A2 emissions scenario and HadCM3 boundary forcing on the WRF simulations of a future period. Our analysis is focused on the combined impact of existing and proposed reservoir infrastructure at two new sites on the water supply management in the SRB under historical and future climate conditions. We also explore the impact of climate variability and change on the bimodal precipitation pattern from winter frontal storms and the summertime North American monsoon and its consequences on water

  17. Building the Synergy between Public Sector and Research Data Infrastructures

    Science.gov (United States)

    Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea

    2014-05-01

    INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure to give cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure the cross-border interoperability of the data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), while the technical specifications for the interoperability of data across the 34 INSPIRE themes were all adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, provides access to government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability.
    The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in initiatives promoting cross-sector re-use in INSPIRE, and in sustainable approaches to addressing the evolution of technologies - in particular, how to support Linked Data in INSPIRE and

  18. Grid Interoperation with ARC Middleware for the CMS Experiment

    CERN Document Server

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  19. Interoperability Assets for Patient Summary Components: A Gap Analysis.

    Science.gov (United States)

    Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine

    2018-01-01

    The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary, while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.

  20. Connectivity, interoperability and manageability challenges in internet of things

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is about interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities to enhance people's lives through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper concludes that IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  1. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  2. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce inter-device variability. Results suggest that hand shape offers better performance in terms of interoperability than palm print, but palm print can be more effective when similar sensors are used.

  3. Hydrological Scenario Using Tools and Applications Available in enviroGRIDS Portal

    Science.gov (United States)

    Bacu, V.; Mihon, D.; Stefanut, T.; Rodila, D.; Cau, P.; Manca, S.; Soru, C.; Gorgan, D.

    2012-04-01

    Nowadays decision makers, as well as citizens, are concerned with the sustainability and vulnerability of land management practices in various respects, in particular regarding water quality and quantity in complex watersheds. The Black Sea Catchment is an important watershed in Central and Eastern Europe. The FP7 project enviroGRIDS [1] developed a Web Portal that incorporates different tools and applications focused on geospatial data management, hydrologic model calibration, execution and visualization, and training activities. This presentation highlights, from the end-user point of view, the scenario related to hydrological models using the tools and applications available in the enviroGRIDS Web Portal [2]. The development of SWAT (Soil Water Assessment Tool) hydrological models is a well-known procedure for hydrological specialists [3]. Starting from the primary data (information on weather, soil properties, topography, vegetation, and land management practices of the particular watershed) used to develop SWAT hydrological models, through to specific reports about water quality in the studied watershed, the hydrological specialist will use different applications available in the enviroGRIDS portal. The tools and applications available through the enviroGRIDS portal do not deal with building the SWAT hydrological models themselves. They are mainly focused on: the calibration procedure (gSWAT [4]), which uses the GRID computational infrastructure to speed up the calibration process; the development of specific scenarios (BASHYT [5]), which starts from an already calibrated SWAT hydrological model and defines new scenarios; the execution of scenarios (gSWATSim [6]), which executes the scenarios exported from BASHYT; and visualization (BASHYT), which displays charts, tables and maps. Each application is built up as a stack of functional layers. We combine different layers of applications through vertical interoperability in order to build the desired complex functionality. 
On
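    The layered tool chain described above (calibration in gSWAT, scenario definition and visualization in BASHYT, execution in gSWATSim) can be pictured as stages that each consume the previous stage's output. A minimal sketch; the function names and model structure are hypothetical stand-ins for the portal's actual services:

```python
# Hypothetical sketch of the chained tool layers described above:
# each stage consumes the previous stage's output.

def calibrate(model):                 # gSWAT-like calibration step
    return {**model, "calibrated": True}

def define_scenario(model, change):   # BASHYT-like scenario definition
    return {**model, "scenario": change}

def run_scenario(model):              # gSWATSim-like execution step
    return {**model, "result": f"simulated {model['scenario']}"}

def visualize(model):                 # BASHYT-like reporting step
    return f"report: {model['result']}"

model = {"name": "BlackSeaCatchment"}
report = visualize(run_scenario(define_scenario(calibrate(model), "land-use change")))
print(report)  # → report: simulated land-use change
```

Vertical interoperability here simply means each layer agrees on the shape of the artifact it passes downward, so layers can be recombined into new workflows.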

  4. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    elsewhere. To add a further layer of complexity there are also global initiatives providing marine data infrastructures, e.g. IOC-IODE and POGO, as well as those with a wider remit that includes environmental data, e.g. GEOSS, COPERNICUS etc. Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures along with other relevant experts in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures in order to establish interoperability between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  5. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  6. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    Full Text Available The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  7. Space-Based Information Infrastructure Architecture for Broadband Services

    Science.gov (United States)

    Price, Kent M.; Inukai, Tom; Razdan, Rajendev; Lazeav, Yvonne M.

    1996-01-01

    This study addressed four tasks: (1) identify satellite-addressable information infrastructure markets; (2) perform network analysis for space-based information infrastructure; (3) develop conceptual architectures; and (4) economic assessment of architectures. The report concludes that satellites will have a major role in the national and global information infrastructure, requiring seamless integration between terrestrial and satellite networks. The proposed LEO, MEO, and GEO satellite systems have satellite characteristics that vary widely. They include delay, delay variations, poorer link quality and beam/satellite handover. The barriers against seamless interoperability between satellite and terrestrial networks are discussed. These barriers are the lack of compatible parameters, standards and protocols, which are presently being evaluated and reduced.

  8. IPv6 (Internet Protocol version 6) heterogeneous networking infrastructure for energy efficient building

    International Nuclear Information System (INIS)

    Ben Saad, Leila; Chauvenet, Cedric; Tourancheau, Bernard

    2012-01-01

    In the context of increasing developments in home, building and city automation, Power Line Communication (PLC) networking is called for unprecedented usage, especially for energy efficiency improvement. Our view of the future building networking infrastructure places PLC at the central point. We note that while Wireless Sensor Networks (WSN) are necessary in the sensor and actuator networking infrastructure, PLC is mandatory for the smart-grid metering and command infrastructure. PLC will also serve the sensor/actuator infrastructure when the energy requirements of the probing system itself cannot be fulfilled by autonomous battery- and harvesting-based nodes. PLC may also provide the numerous bridges necessary to sustain a long lifetime (years) for the battery-based WSN part of the infrastructure. This new role of PLC networking will be possible only if interoperability between all media and technologies is achieved. Thanks to the converging design of Internet Protocol version 6 (IPv6) networking layers, we show that such full interoperability is already possible even in very tiny, constrained networking devices. Moreover, the low power PLC technology used in our experiments will be able to provide this smart grid monitoring without noticeably impacting the overall energy balance of the monitored system.

  9. Technological and Organisational Aspects of Global Research Data Infrastructures Towards Year 2020

    Directory of Open Access Journals (Sweden)

    Fotis Karagiannis

    2013-07-01

    Full Text Available A general-purpose Global Research Data Infrastructure (GRDI) for all sciences and research purposes is not conceivable for the next decade, as there are too many discipline-specific modalities that currently prevail for such generalisation efforts to be effective. On the other hand, a more pragmatic approach is to start from what currently exists, identify best practices and key issues, and promote effective inter-domain collaboration among the different components forming an ecosystem. This will promote interoperability, data exchange, data preservation, and distributed access (among others). This ecosystem of interoperable research data infrastructures will be composed of regional, disciplinary, and multidisciplinary components, such as libraries, archives, and data centres, offering data services for both primary datasets and publications. The ecosystem will support data-intensive science and research and stimulate the interaction among all its elements, thus promoting multidisciplinary and interdisciplinary science. This special issue includes a set of independent papers from renowned experts on organisational and technological issues related to GRDIs. These documents feed into and complement the GRDI2020 roadmap, which supports a Global Research Data Infrastructure ecosystem.

  10. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Access and Interoperability

    Science.gov (United States)

    Fan, D.; He, B.; Xiao, J.; Li, S.; Li, C.; Cui, C.; Yu, C.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Mi, L.; Wan, W.; Wang, J.

    2015-09-01

    The data access and interoperability module connects observation proposals, data, virtual machines and software. Using the unique identifier of the PI (principal investigator), an email address or an internal ID, data can be collected from the PI's proposals, or via the search interfaces, e.g. cone search. Files associated with the search results can easily be transferred to cloud storage, including the storage attached to virtual machines, or to several commercial platforms such as Dropbox. Benefiting from IVOA (International Virtual Observatory Alliance) standards, VOTable-formatted search results can be sent to various kinds of VO software. Future efforts will integrate more data and connect archives with other astronomical resources.
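    The cone search interface mentioned above follows the IVOA Simple Cone Search convention: an HTTP GET carrying RA and DEC in decimal degrees plus a search radius SR, answered with a VOTable document. A minimal sketch of building such a request; the endpoint URL and coordinates are illustrative, not AstroCloud's actual service address:

```python
from urllib.parse import urlencode

def cone_search_url(base_url, ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search request URL.
    RA/DEC are decimal degrees; SR is the search radius in degrees.
    The service is expected to reply with a VOTable."""
    params = urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})
    return f"{base_url}?{params}"

# Hypothetical endpoint; approximate coordinates of M31
url = cone_search_url("http://example.org/conesearch", 10.684, 41.269, 0.5)
print(url)  # → http://example.org/conesearch?RA=10.684&DEC=41.269&SR=0.5
```

Because the request and the VOTable response are both standardized, the same result set can be handed to any VO-compliant tool, which is the interoperability point the abstract makes.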

  11. Semantic Web Technologies as the Foundation for the Information Infrastructure

    NARCIS (Netherlands)

    Van Oosterom, Peter; Zlatanova, S.; Van Harmelen, Frank; Van Oosterom, Peter; Zlatanova, S

    2008-01-01

    The Semantic Web has been emerging over the past few years as a realistic option for a worldwide Information Infrastructure, with its promises of semantic interoperability and serendipitous reuse. In this paper we will analyse the essential ingredients of semantic technologies, what makes them suitable as

  12. Quantifying the conservation gains from shared access to linear infrastructure.

    Science.gov (United States)

    Runge, Claire A; Tulloch, Ayesha I T; Gordon, Ascelin; Rhodes, Jonathan R

    2017-12-01

    The proliferation of linear infrastructure such as roads and railways is a major global driver of cumulative biodiversity loss. One strategy for reducing habitat loss associated with development is to encourage linear infrastructure providers and users to share infrastructure networks. We quantified the reductions in biodiversity impact and capital costs under linear infrastructure sharing of a range of potential mine to port transportation links for 47 mine locations operated by 28 separate companies in the Upper Spencer Gulf Region of South Australia. We mapped transport links based on least-cost pathways for different levels of linear-infrastructure sharing and used expert-elicited impacts of linear infrastructure to estimate the consequences for biodiversity. Capital costs were calculated based on estimates of construction costs, compensation payments, and transaction costs. We evaluated proposed mine-port links by comparing biodiversity impacts and capital costs across 3 scenarios: an independent scenario, where no infrastructure is shared; a restricted-access scenario, where the largest mining companies share infrastructure but exclude smaller mining companies from sharing; and a shared scenario where all mining companies share linear infrastructure. Fully shared development of linear infrastructure reduced overall biodiversity impacts by 76% and reduced capital costs by 64% compared with the independent scenario. However, there was considerable variation among companies. Our restricted-access scenario showed only modest biodiversity benefits relative to the independent scenario, indicating that reductions are likely to be limited if the dominant mining companies restrict access to infrastructure, which often occurs without policies that promote sharing of infrastructure. Our research helps illuminate the circumstances under which infrastructure sharing can minimize the biodiversity impacts of development. © 2017 The Authors. Conservation Biology published
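    The least-cost pathway mapping described above can be sketched with Dijkstra's algorithm over a raster cost surface, where each cell's value aggregates, for example, construction cost or expert-elicited biodiversity impact. The grid values below are hypothetical, not taken from the study:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a raster cost surface with 4-neighbour moves.
    Entering a cell adds that cell's cost; the start cell's cost is included."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Hypothetical cost surface: high values could encode sensitive habitat
grid = [[1, 1, 9],
        [9, 1, 9],
        [9, 1, 1]]
print(least_cost_path(grid, (0, 0), (2, 2)))  # → 5
```

Shared-infrastructure scenarios can then be compared by re-running the routing with corridors already built by other companies assigned near-zero cost.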

  13. Grid Interoperation with ARC middleware for the CMS experiment

    International Nuclear Information System (INIS)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva; Field, Laurence; Qing, Di; Frey, Jaime; Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  14. Grid Interoperation with ARC middleware for the CMS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  15. CERIF-CRIS for the European e-Infrastructure

    Directory of Open Access Journals (Sweden)

    K Jeffery

    2010-04-01

    Full Text Available The European e-infrastructure is the ICT support for research, although the infrastructure will be extended for commercial/business use. It supports the research process from funding agencies to research institutions to innovation. It supports experimental facilities, modelling and simulation, communication between researchers, and the workflow of research processes and research management. We propose that the core should be CERIF: an EU recommendation to member states for exchanging research information and for homogeneous access to heterogeneous information. CERIF can also integrate associated systems (such as finance, human resources, project management, and library services) and provides interoperation among research institutions, research funders, and innovators.

  16. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization. 
I will emphasize the increasing role of standardized

  17. E-Infrastructure and Data Management for Global Change Research

    Science.gov (United States)

    Allison, M. L.; Gurney, R. J.; Cesar, R.; Cossu, R.; Gemeinholzer, B.; Koike, T.; Mokrane, M.; Peters, D.; Nativi, S.; Samors, R.; Treloar, A.; Vilotte, J. P.; Visbeck, M.; Waldmann, H. C.

    2014-12-01

    The Belmont Forum, a coalition of science funding agencies from 15 countries, is supporting an 18-month effort to assess the state of international e-infrastructures and data management so that global change data and information can be more easily and efficiently exchanged internationally and across domains. Ultimately, this project aims to address the Belmont "Challenge" to deliver the knowledge needed for action to avoid and adapt to detrimental environmental change, including extreme hazardous events. This effort emerged from conclusions by the Belmont Forum that transformative approaches and innovative technologies are needed for heterogeneous data/information to be integrated and made interoperable for researchers in disparate fields, and for myriad uses across international, institutional, disciplinary, spatial and temporal boundaries. The project will deliver a Community Strategy and Implementation Plan to prioritize international funding opportunities and long-term policy recommendations on how the Belmont Forum can implement a more coordinated, holistic, and sustainable approach to funding and supporting global change research. The Plan is expected to serve as the foundation of future Belmont Forum funding calls for proposals in support of research science goals as well as to establish long-term e-infrastructure. More than 120 scientists, technologists, legal experts, social scientists, and other experts are participating in six Work Packages to develop the Plan by spring 2015, under the broad rubrics of Architecture/Interoperability and Governance: Data Integration for Multidisciplinary Research; Improved Interface between Computation & Data Infrastructures; Harmonization of Global Data Infrastructure; Data Sharing; Open Data; and Capacity Building. Recommendations could lead to a more coordinated approach to policies, procedures and funding mechanisms to support e-infrastructures in a more sustainable way.

  18. Interconnecting Multidisciplinary Data Infrastructures: From Federation to Brokering Framework

    Science.gov (United States)

    Nativi, S.

    2014-12-01

    Standardization and federation activities have played an essential role in pushing interoperability at the disciplinary and cross-disciplinary level. However, they have proved insufficient to resolve important interoperability challenges, including disciplinary heterogeneity, cross-organization diversity, and cultural differences. Significant international initiatives like GEOSS, IODE, and CEOS have demonstrated that a federation system dealing with a global and multi-disciplinary domain turns out to be rather complex, further raising the already high entry-level barriers for both providers and users. In particular, GEOSS demonstrated that standardization and federation actions must be accompanied and complemented by a brokering approach. A brokering architecture and its implementing technologies are able to achieve an effective level of interoperability among multi-disciplinary systems, lowering the entry-level barriers for both data providers and users. This presentation will discuss the brokering philosophy as an approach complementary to standardization and federation for interconnecting existing, heterogeneous infrastructures and systems. The GEOSS experience will be analyzed in particular.
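    The brokering approach can be illustrated as a mediation layer that exposes one discovery call and translates it into each catalog's native interface, so neither providers nor users must change their systems. A minimal sketch; all class and method names are hypothetical stand-ins for real protocols such as CSW or OpenSearch:

```python
# Hypothetical catalog interfaces standing in for heterogeneous protocols.

class CswLikeCatalog:
    def get_records(self, constraint):          # native interface A
        return [f"cswA:{constraint}"]

class OpenSearchLikeCatalog:
    def search(self, terms):                    # native interface B
        return [f"osB:{terms}"]

class Broker:
    """Exposes a single discovery call and mediates to each
    provider's native API, merging the results."""
    def __init__(self, providers):
        self.providers = providers

    def discover(self, keyword):
        results = []
        for p in self.providers:
            if hasattr(p, "get_records"):       # translate to interface A
                results.extend(p.get_records(keyword))
            elif hasattr(p, "search"):          # translate to interface B
                results.extend(p.search(keyword))
        return results

broker = Broker([CswLikeCatalog(), OpenSearchLikeCatalog()])
print(broker.discover("drought"))  # → ['cswA:drought', 'osB:drought']
```

The point of the pattern is exactly the one the abstract makes: the mediation logic lives in the broker, so adding a provider does not raise the entry barrier for users, and providers keep their existing interfaces.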

  19. DIESIS : An Interoperable European Federated Simulation Network for Critical Infrastructures

    NARCIS (Netherlands)

    Rome, E.; Bologna, S.; Gelenbe, E.; Luiijf, H.A.M.; Masucci, V.

    2009-01-01

    Critical Infrastructures (CI) that are vital for a society and an economy, such as telecommunication systems, energy supply systems, transport systems and others, are getting more and more complex. Dependencies emerge in various ways, due to the use of information and communication technologies,

  20. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To do this, ERP systems were first identified within interoperability frameworks. Second, initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information-system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  1. GEOWOW: a drought scenario for multidisciplinary data access and use

    Science.gov (United States)

    Santoro, Mattia; Sorichetta, Alessandro; Roglia, Elena; Craglia, Massimo; Nativi, Stefano

    2013-04-01

    Recent enhancements of the GEOSS Common Infrastructure (GCI; http://www.earthobservations.org/gci_gci.shtml), and in particular the introduction of a middleware in the GCI that brokers across heterogeneous information systems, have significantly increased the number of information resources discoverable worldwide. Now the challenge moves to the next level of ensuring access to and use of the resources discovered, which have many different and domain-specific data models, communication protocols, encoding formats, etc. The GEOWOW Project - GEOSS interoperability for Weather, Ocean and Water, http://www.geowow.eu - developed a set of multidisciplinary use scenarios to advance the present GCI. This work describes the "Easy discovery and use of GEOSS resources for addressing multidisciplinary challenges related to drought scenarios" showcase demonstrated at the last GEO Plenary in Foz de Iguazu (Brazil). The scientific objectives of this showcase include: prevention and mitigation of water scarcity and drought situations, assessment of the population and geographical area potentially affected, evaluation of the possible distribution of mortality and economic loss risk, and support in building greater capacity to cope with drought. The need to address these challenges calls for producing scientifically robust and consistent information about the extent of land affected by drought and degradation. Similarly, in this context it is important: (i) to address uncertainties about the way in which various biological, physical, social, and economic factors interact with each other and influence the occurrence of drought events, and (ii) to develop and test adequate indices, and/or combinations of them, for monitoring and forecasting drought in different geographic locations and at various spatial scales (Brown et al., 2002). The scientific objectives above can be met with increased interoperability across the multidisciplinary domains relevant to this drought scenario. In particular
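
    The brokered discovery step described above typically begins with a catalog query. As a minimal sketch (the endpoint URL below is a placeholder, not the actual GCI broker address), the following builds an OGC CSW 2.0.2 GetRecords request in key-value-pair form for a drought-related search:

```python
from urllib.parse import urlencode

# Hypothetical catalog endpoint; the real GCI discovery broker URL differs.
CSW_ENDPOINT = "http://example.org/broker/services/csw"

def build_getrecords_url(endpoint, keywords, max_records=10):
    """Build an OGC CSW 2.0.2 GetRecords request (KVP encoding, CQL filter)."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",
        "resultType": "results",
        "maxRecords": str(max_records),
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        # full-text search over all queryable metadata fields
        "constraint": f"AnyText like '%{keywords}%'",
    }
    return endpoint + "?" + urlencode(params)

url = build_getrecords_url(CSW_ENDPOINT, "drought")
print(url)
```

    A broker would issue this request against its federated catalogs and merge the returned `csw:Record` summaries; access to each discovered resource then proceeds through the service binding advertised in its metadata.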

  2. OOI CyberInfrastructure - Next Generation Oceanographic Research

    Science.gov (United States)

    Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.

    2008-12-01

    Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future scientific collaborative research, as they open up new ways of fusing data from different sources and across various domains, and of analysis over wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component, addresses this challenge by providing broad access, from sensor networks for data acquisition up to computational grids for massive computations, and binding infrastructure facilitating policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort, namely, a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport based on a messaging infrastructure over the AMQP protocol, and the preservation based on a distributed file system through SDSC iRODS.

  3. System architecture of communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth

    2017-04-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on organizations responsible for PS&S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organizations need to cooperate, align their procedures and activities, share the needed information, and be interoperable. Existing PPDR/PMR technologies do not provide broadband capability, which is a major limitation in supporting new services and hence new information flows, and they currently have no successor. There is also no known standard that addresses interoperability of these technologies. The paper at hand provides an approach to tackle the above-mentioned aspects by defining an Enterprise Architecture (EA) of PPDR organizations and a System Architecture of next-generation PPDR communication networks for a variety of applications and services on broadband networks, including the ability of inter-system, inter-agency and cross-border operations. The Open Safety and Security Architecture Framework (OSSAF) provides a framework and approach to coordinate the perspectives of different types of stakeholders within a PS&S organization. It aims at bridging the silos in the chain of command and at leveraging interoperability between PPDR organizations. The framework incorporates concepts of several mature enterprise architecture frameworks, including the NATO Architecture Framework (NAF). However, OSSAF does not provide details on how NAF should be used for describing the OSSAF perspectives and views. In this contribution, a mapping of the NAF elements to the OSSAF views is provided. Based on this mapping, an EA of PPDR organizations with a focus on communication infrastructure related capabilities is presented. 
Following the capability modeling, a system architecture for secure and interoperable communication infrastructures

  4. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    Science.gov (United States)

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may cause difficulties in interoperability between information systems, which can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, a political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180
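
    The message exchange described above rests on HL7 v2.x's pipe-delimited segment structure. A minimal sketch of that structure follows; the sample message content (facility names, IDs, patient) is entirely invented for illustration and is not taken from the Lombardy system:

```python
# HL7 v2.x messages are carriage-return-separated segments; each segment is a
# pipe-delimited list of fields (components within a field use '^').
SAMPLE_MSG = "\r".join([
    "MSH|^~\\&|LAB|HOSP_A|RIS|REGION|202301011200||ORM^O01|MSG00001|P|2.5",
    "PID|1||123456^^^HOSP_A||ROSSI^MARIO||19700101|M",
    "ORC|NW|ORD0001",
])

def parse_hl7(message: str):
    """Split an HL7 v2 message into {segment_id: [fields]} (first occurrence
    of each segment only; a real parser must handle repeats and escapes)."""
    segments = {}
    for line in message.split("\r"):
        if not line:
            continue
        fields = line.split("|")
        segments.setdefault(fields[0], fields[1:])
    return segments

seg = parse_hl7(SAMPLE_MSG)
print(seg["MSH"][7])   # message type field, e.g. ORM^O01 (an order message)
print(seg["PID"][4])   # patient name field, family^given
```

    In practice hospitals rely on conformant HL7 engines rather than hand-rolled parsing, but the sketch shows why a shared message specification makes heterogeneous systems mutually intelligible.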

  5. Clinical data integration model. Core interoperability ontology for research using primary care data.

    Science.gov (United States)

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified; thus, we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a

  6. Adoption of a SAML-XACML Profile for Authorization Interoperability across Grid Middleware in OSG and EGEE

    International Nuclear Information System (INIS)

    Garzoglio, G; Chadwick, K; Dykstra, D; Hesselroth, T; Levshina, T; Sharma, N; Timm, S; Bester, J; Martin, S; Groep, D; Koeroo, O; Salle, M; Verstegen, A; Gu, J; Sim, A

    2011-01-01

    The Authorization Interoperability activity was initiated in 2006 to foster interoperability between middleware and authorization infrastructures deployed in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects. This activity delivered a common authorization protocol and a set of libraries that implement that protocol. In addition, a set of the most common Grid gateways, or Policy Enforcement Points (Globus Toolkit v4 Gatekeeper, GridFTP, dCache, etc.) and site authorization services, or Policy Decision Points (LCAS/LCMAPS, SCAS, GUMS, etc.) have been integrated with these libraries. At this time, various software providers, including the Globus Toolkit v5, BeStMan, and the Site AuthoriZation service (SAZ), are integrating the authorization interoperability protocol with their products. In addition, as more and more software supports the same protocol, the community is converging on LCMAPS as a common module for identity attribute parsing and authorization call-out. This paper presents this effort, discusses the status of adoption of the common protocol and projects the community work on authorization in the near future.
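
    The protocol described above standardizes the exchange between a Grid gateway (Policy Enforcement Point) and a site authorization service (Policy Decision Point): the PEP submits subject attributes, the PDP returns a decision plus local identity mappings. The following is an illustrative toy of that interaction only; attribute names, the VO list, and account mappings are hypothetical, not the actual SAML-XACML profile schema:

```python
# Toy Policy Decision Point (PDP), in the spirit of GUMS/SCAS-style services:
# permit if the subject's virtual organization (VO) is authorized, and return
# the local account mapping the gateway (PEP) should enforce.
def pdp_decide(request, policy):
    vo = request.get("vo")
    if vo in policy["authorized_vos"]:
        return {"decision": "Permit", "local_user": policy["mappings"][vo]}
    return {"decision": "Deny"}

policy = {
    "authorized_vos": {"cms", "atlas"},
    "mappings": {"cms": "cms001", "atlas": "atl001"},
}

# A PEP-side request: certificate subject DN plus VO attribute (illustrative).
req = {"subject": "/DC=org/DC=example/CN=Alice", "vo": "cms",
       "action": "submit-job"}
print(pdp_decide(req, policy))
```

    The value of the common profile is that every gateway can issue the same request format and interpret the same response format, regardless of which site authorization service sits behind it.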

  7. Adoption of a SAML-XACML profile for authorization interoperability across grid middleware in OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, G. [Fermilab; Bester, J. [Argonne; Chadwick, K. [Fermilab; Dykstra, D. [Fermilab; Groep, D. [NIKHEF, Amsterdam; Gu, J. [LBL, Berkeley; Hesselroth, T. [Fermilab; Koeroo, O. [NIKHEF, Amsterdam; Levshina, T. [Fermilab; Martin, S. [Argonne; Salle, M. [NIKHEF, Amsterdam; Sharma, N. [Fermilab; Sim, A. [LBL, Berkeley; Timm, S. [Fermilab; Verstegen, A. [NIKHEF, Amsterdam

    2011-01-01

    The Authorization Interoperability activity was initiated in 2006 to foster interoperability between middleware and authorization infrastructures deployed in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects. This activity delivered a common authorization protocol and a set of libraries that implement that protocol. In addition, a set of the most common Grid gateways, or Policy Enforcement Points (Globus Toolkit v4 Gatekeeper, GridFTP, dCache, etc.) and site authorization services, or Policy Decision Points (LCAS/LCMAPS, SCAS, GUMS, etc.) have been integrated with these libraries. At this time, various software providers, including the Globus Toolkit v5, BeStMan, and the Site AuthoriZation service (SAZ), are integrating the authorization interoperability protocol with their products. In addition, as more and more software supports the same protocol, the community is converging on LCMAPS as a common module for identity attribute parsing and authorization call-out. This paper presents this effort, discusses the status of adoption of the common protocol and projects the community work on authorization in the near future.

  8. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  9. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

    Full Text Available This paper presents relevant interoperability approaches and solutions applied to global/international networked (collaborative) enterprises or organisations and conceptualises an enhanced enterprise interoperability framework. The paper covers...

  10. A Theory of Interoperability Failures

    National Research Council Canada - National Science Library

    McBeth, Michael S

    2003-01-01

    This paper develops a theory of interoperability failures. Interoperability in this paper refers to the exchange of information and the use of information, once exchanged, between two or more systems...

  11. Transportation Energy Futures Series: Alternative Fuel Infrastructure Expansion: Costs, Resources, Production Capacity, and Retail Availability for Low-Carbon Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, M. W.; Heath, G.; Sandor, D.; Steward, D.; Vimmerstedt, L.; Warner, E.; Webster, K. W.

    2013-04-01

    Achieving the Department of Energy target of an 80% reduction in greenhouse gas emissions by 2050 depends on transportation-related strategies combining technology innovation, market adoption, and changes in consumer behavior. This study examines expanding low-carbon transportation fuel infrastructure to achieve deep GHG emissions reductions, with an emphasis on fuel production facilities and retail components serving light-duty vehicles (LDVs). Three distinct low-carbon fuel supply scenarios are examined: Portfolio: successful deployment of a range of advanced vehicle and fuel technologies; Combustion: market dominance by hybridized internal combustion engine vehicles fueled by advanced biofuels and natural gas; Electrification: market dominance by electric drive vehicles in the LDV sector, including battery electric, plug-in hybrid, and fuel cell vehicles, fueled by low-carbon electricity and hydrogen. A range of possible low-carbon fuel demand outcomes is explored in terms of the scale and scope of infrastructure expansion requirements and evaluated based on fuel costs, energy resource utilization, fuel production infrastructure expansion, and retail infrastructure expansion for LDVs. This is one of a series of reports produced as a result of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored transportation-related strategies for abating GHGs and reducing petroleum dependence.

  12. Assessing readiness of cyberinfrastructure resources for cross-domain interoperability: a view from an NSF EarthCube roadmap

    Science.gov (United States)

    Zaslavsky, Ilya; Couch, Alva; Richard, Stephen; Valentine, David; Lehnert, Kerstin; Stocks, Karen; Murphy, Philip

    2013-04-01

    EarthCube is a new research initiative of the U.S. National Science Foundation, with the mission to develop community-guided cyberinfrastructure integrating data, models and other resources across geoscience disciplines. Analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries require that data and models can be re-used outside the original context in which they are collected or developed. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place to enable such information re-use and ensure that it is both scientifically sound and technically feasible. In an ideal cross-domain information integration scenario, resources can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data services can be used to transparently compile composite data products and to integrate information using commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. The main premise of the cross-domain readiness assessment is that when accessing domain resources from another domain, a user expects to be able to discover these resources, interpret them, retrieve the information, and integrate it with other data. Documentation of the resource must be sufficient for a user in a different context to determine fitness for use and establish trust in scientific soundness. As part of an EarthCube roadmap focused on cross-domain interoperability, we explored a number of approaches to cyberinfrastructure readiness assessment, addressing both readiness of existing resources and readiness of processes that enable cross-domain communication and information exchange across disciplinary boundaries. Our initial assessment considers basic infrastructure components required to enable cross-domain interoperability in the geosciences. These components, and the evaluation metrics

  13. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems that support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely-used Internet Protocol suite used to define Virtual Private Networks, or "tunnels", to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. 
These single vendor solutions may inadvertently lock
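
    The configuration burden described above can be pictured as a set-intersection problem: a tunnel is only possible if both endpoints support at least one common proposal. The parameter sets below are illustrative examples, not any vendor's actual defaults:

```python
# Each endpoint advertises the IPsec proposals it supports as
# (encryption, integrity, Diffie-Hellman group) triples.
def common_proposals(a, b):
    """Return the proposals both endpoints support, sorted for determinism."""
    return sorted(set(a) & set(b))

vendor_a = {("aes256", "sha256", "group14"),
            ("aes128", "sha1", "group5"),
            ("3des", "sha1", "group2")}
vendor_b = {("aes256", "sha256", "group14"),
            ("aes128", "sha256", "group14")}

print(common_proposals(vendor_a, vendor_b))
```

    An empty intersection means no tunnel can be established without reconfiguration, which is exactly the kind of manual, expertise-heavy work an interoperability profile like Lemnos aims to eliminate by standardizing the parameter choices up front.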

  14. Assessing Storm Vulnerabilities and Resilience Strategies: A Scenario-Method for Engaging Stakeholders of Public/Private Maritime Infrastructure

    Science.gov (United States)

    Becker, A.; Burroughs, R.

    2014-12-01

    This presentation discusses a new method to assess vulnerability and resilience strategies for stakeholders of coastal-dependent transportation infrastructure, such as seaports. Much coastal infrastructure faces increasing risk of extreme events resulting from sea level rise and tropical storms. As seen after Hurricane Sandy, natural disasters result in economic costs, damage to the environment, and negative consequences for residents' quality of life. In the coming decades, tough decisions will need to be made about investment measures to protect critical infrastructure. Coastal communities will need to weigh the costs and benefits of a new storm barrier, for example, against those of retrofitting, elevating, or simply doing nothing. These decisions require understanding the priorities and concerns of stakeholders. For ports, these include shippers, insurers, tenants, and ultimate consumers of the port cargo on a local and global scale, all of whom have a stake in addressing port vulnerabilities. Decision-makers in exposed coastal areas need tools to understand stakeholders' concerns and perceptions of potential resilience strategies. For ports, they need answers to: 1) How will stakeholders be affected? 2) What strategies could be implemented to build resilience? 3) How effectively would the strategies mitigate stakeholder concerns? 4) What level of time and investment would strategies require? 5) Which stakeholders could/should take responsibility? Our stakeholder-based method provides answers to questions 1-3 and forms the basis for further work to address 4 and 5. Together with an expert group, we developed a pilot study for stakeholders of Rhode Island's critical energy port, the Port of Providence. Our method uses a plausible extreme storm scenario with localized visualizations and a portfolio of potential resilience strategies. 
We tailor a multi-criteria decision analysis tool and, through a series of workshops, we use the storm scenario, resilience strategies

  15. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    International Nuclear Information System (INIS)

    Tuominen, Janne; Rasi, Teemu; Mattila, Jouni; Siuko, Mikko; Esque, Salvador; Hamilton, David

    2013-01-01

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations
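
    The per-channel priorities mentioned in the abstract (high messaging rate with deterministic timing vs. high per-message reliability) map naturally onto DDS Quality of Service policies. The policy names below mirror the OMG DDS specification (RELIABILITY, HISTORY, DEADLINE), but the mapping rules are our illustrative assumptions, not DTP2's actual configuration:

```python
from dataclasses import dataclass

@dataclass
class QosProfile:
    reliability: str   # "RELIABLE" or "BEST_EFFORT"
    history: str       # "KEEP_LAST" or "KEEP_ALL"
    deadline_ms: int   # max allowed period between samples (0 = no deadline)

def choose_qos(high_rate: bool, must_not_lose: bool) -> QosProfile:
    """Pick a QoS profile from two channel characteristics (sketch)."""
    if high_rate and not must_not_lose:
        # e.g. servo feedback: the freshest sample matters, old ones do not
        return QosProfile("BEST_EFFORT", "KEEP_LAST", deadline_ms=10)
    if must_not_lose:
        # e.g. task commands: every message must be delivered
        return QosProfile("RELIABLE", "KEEP_ALL", deadline_ms=0)
    return QosProfile("RELIABLE", "KEEP_LAST", deadline_ms=0)

print(choose_qos(high_rate=True, must_not_lose=False))
```

    In an actual DDS deployment these profiles would be attached to DataWriters and DataReaders per topic; the point of the sketch is that DDS lets each channel's priority be expressed declaratively rather than coded into the transport.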

  16. Towards an advanced e-Infrastructure for Civil Protection applications: Research Strategies and Innovation Guidelines

    Science.gov (United States)

    Mazzetti, P.; Nativi, S.; Verlato, M.; Angelini, V.

    2009-04-01

    In the context of the EU co-funded project CYCLOPS (http://www.cyclops-project.eu) the problem of designing an advanced e-Infrastructure for Civil Protection (CP) applications has been addressed. As a preliminary step, some studies of European CP systems and operational applications were performed in order to define their specific system requirements. At a higher level it was verified that CP applications are usually conceived to map CP business processes involving different levels of processing, including data access, data processing, and output visualization. At their core they usually run one or more Earth Science models for information extraction. The traditional approach based on the development of monolithic applications presents some limitations related to flexibility (e.g. the possibility of running the same models with different input data sources, or different models with the same data sources) and scalability (e.g. launching several runs for different scenarios, or implementing more accurate and computing-demanding models). Flexibility can be addressed by adopting a modular design based on a SOA and standard services and models, such as OWS and ISO for geospatial services. Distributed computing and storage solutions could improve scalability. Based on such considerations, an architectural framework has been defined. It consists of a Web Service layer providing advanced services for CP applications (e.g. standard geospatial data sharing and processing services) working on the underlying Grid platform. This framework has been tested through the development of prototypes as proofs of concept. These theoretical studies and proofs of concept demonstrated that although Grid and geospatial technologies would be able to provide significant benefits to CP applications in terms of scalability and flexibility, current platforms are designed taking into account requirements different from CP. In particular CP applications have strict requirements in terms of: a) Real

  17. Smart Grid Interoperability Maturity Model

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.
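
    To make the idea of measuring status and prioritizing gaps concrete, here is a hypothetical scoring sketch. The category names, numeric levels, and the "weakest link" aggregation rule are all illustrative assumptions, not the GWAC framework's actual criteria:

```python
def maturity_level(scores):
    """Overall maturity capped by the weakest category (illustrative rule)."""
    return min(scores.values())

def gaps(scores, target):
    """Categories falling short of the target level, for prioritizing effort."""
    return sorted(k for k, v in scores.items() if v < target)

# Hypothetical self-assessment for one organization, levels 1-5.
scores = {"technical": 4, "informational": 3, "organizational": 2}

print(maturity_level(scores))   # overall level
print(gaps(scores, target=4))   # where improvement effort should focus
```

    A real IMM would define observable criteria for each level of each category; the sketch only shows how such scores, once assigned, support the gap analysis and prioritization the abstract describes.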

  18. Enhancements of the "eHabitat

    Science.gov (United States)

    Santoro, M.; Dubois, G.; Schulz, M.; Skøien, J. O.; Nativi, S.; Peedell, S.; Boldrini, E.

    2012-04-01

    The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBAs) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond the simple sharing of data, as it encourages the connectivity of models (the GEOSS Model Web), an approach easing the handling of often complex multi-disciplinary questions such as understanding the impact of environmental and climatological factors on ecosystems and habitats. In the context of GEOSS Architecture Implementation Pilot - Phase 3 (AIP-3), the EC-funded EuroGEOSS and GENESIS projects have developed and successfully demonstrated the "eHabitat" use scenario dealing with the Climate Change and Biodiversity domains. Based on the EuroGEOSS multidisciplinary brokering infrastructure and on the DOPA (Digital Observatory for Protected Areas, see http://dopa.jrc.ec.europa.eu/), this scenario demonstrated how a GEOSS-based interoperability infrastructure can help decision makers assess and possibly forecast the irreplaceability of a given protected area, an essential indicator for assessing the criticality of the threats this protected area is exposed to. The "eHabitat" use scenario was advanced in the GEOSS Sprint to Plenary activity; the advanced scenario will include the "EuroGEOSS Data Access Broker" and a new version of the eHabitat model in order to support the use of uncertain data. The multidisciplinary interoperability infrastructure which is used to demonstrate the "eHabitat" use scenario is composed of the following main components: a) A Discovery Broker: this component is able to discover resources from a plethora of different and heterogeneous geospatial services, presenting them on a single and
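
    The mediation role a Discovery Broker plays, fanning one query out to heterogeneous catalogs and normalizing their records into a single result list, can be sketched as follows. The catalog record layouts and field names here are invented for illustration and do not reproduce the EuroGEOSS broker's actual interfaces:

```python
# A toy broker: each catalog exposes its own search callable and its own
# record schema; the broker hides both behind one normalized result format.
def broker_search(query, catalogs):
    results = []
    for cat in catalogs:
        for rec in cat["search"](query):
            results.append({
                "title": rec.get(cat["title_key"], ""),
                "source": cat["name"],
            })
    return results

# Two mock catalogs with different native schemas (no network access here).
csw_cat = {"name": "CSW", "title_key": "dc:title",
           "search": lambda q: [{"dc:title": f"{q} habitat model"}]}
opensearch_cat = {"name": "OpenSearch", "title_key": "title",
                  "search": lambda q: [{"title": f"{q} protected areas"}]}

for r in broker_search("eHabitat", [csw_cat, opensearch_cat]):
    print(r["source"], "->", r["title"])
```

    The real broker additionally speaks each catalog's protocol (CSW, OpenSearch, OAI-PMH, etc.) and exposes the merged view through a single standard interface, but the normalization step shown is the essence of the mediation approach.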

  19. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  20. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    Science.gov (United States)

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. In particular, computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
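
The core of a service-oriented device architecture is that devices publish named services which consumers discover and bind to at runtime, without compile-time coupling. The sketch below illustrates only that pattern; the registry, service names, and classes are hypothetical and are not the IEEE 11073 SDC API (real SDC discovery is decentralized, not a central registry).

```python
# Minimal SOMDA-style sketch: devices register services, consumers discover
# and invoke them dynamically. All names here are invented for illustration.

class ServiceRegistry:
    """Central lookup of device services (a simplification of SDC discovery)."""
    def __init__(self):
        self._services = {}

    def register(self, device_id, service_name, handler):
        self._services[(device_id, service_name)] = handler

    def discover(self, service_name):
        """Return (device_id, handler) pairs offering the named service."""
        return [(dev, h) for (dev, name), h in self._services.items()
                if name == service_name]


registry = ServiceRegistry()

# A pulse oximeter exposes a metric-read service.
registry.register("oximeter-1", "read_spo2", lambda: 97)

# A consumer (e.g. an assistive system) binds at runtime.
providers = registry.discover("read_spo2")
device_id, read = providers[0]
print(device_id, read())
```

Because binding happens through the registry, a device from a different manufacturer offering the same service name is interchangeable from the consumer's point of view.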

  1. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, will have the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, up to policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, to build a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 expert interviews, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in Health, is the right moment to act with the aim of achieving an interoperable and open framework. The Health area should benefit from the initiatives started at the GGF in terms of global architecture and services definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that existing important results of the standardization efforts in this area are not taken up simply because they are not always known.

  2. The Microbial Resource Research Infrastructure MIRRI: Strength through Coordination

    Directory of Open Access Journals (Sweden)

    Erko Stackebrandt

    2015-11-01

    Full Text Available Microbial resources have been recognized as essential raw materials for the advancement of health and later for biotechnology, agriculture, food technology and for research in the life sciences, as their enormous abundance and diversity offer an unparalleled source of unexplored solutions. Microbial domain biological resource centres (mBRCs) provide live cultures and associated data to foster and support the development of basic and applied science in countries worldwide and especially in Europe, where the density of highly advanced mBRCs is high. The not-for-profit and distributed project MIRRI (Microbial Resource Research Infrastructure) aims to coordinate access to hitherto individually managed resources by developing a pan-European platform which takes the interoperability and accessibility of resources and data to a higher level. Providing a wealth of additional information and linking to datasets such as literature, environmental data, sequences and chemistry will enable researchers to select organisms suitable for their research and enable innovative solutions to be developed. The current independent policies and managed processes will be adapted by partner mBRCs to harmonize holdings, services, training, and accession policy and to share expertise. The infrastructure will improve access to enhanced quality microorganisms in an appropriate legal framework and to resource-associated data in a more interoperable way.

  3. An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects

    Science.gov (United States)

    Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.

    2012-12-01

    The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which those data and models were collected and used, and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations.
Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data

  4. Linked data for transaction based enterprise interoperability

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the '80s, interoperability currently relies heavily on XML-based standards. Although these have had great impact, issues remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  5. Towards a single seismological service infrastructure in Europe

    Science.gov (United States)

    Spinuso, A.; Trani, L.; Frobert, L.; Van Eck, T.

    2012-04-01

    In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating their archives towards a Service Oriented Architecture (SOA). This process pragmatically follows the technological trends and available solutions, aiming to effectively improve all data stewardship activities. These advancements are possible thanks to the cooperation and the follow-ups of several EC infrastructural projects that, by looking at general-purpose techniques, combine their developments envisioning a multidisciplinary platform for earth observation as the final common objective (EPOS, the European Plate Observing System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets, like seismic waveforms, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects. Here we will address initiatives in two of those projects. The NERA (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) project will implement a Common Services Architecture based on OGC service APIs, in order to provide Resource-Oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe).
VERCE aims to enable seismologists to exploit the wealth of seismic data

  6. Bandwidth Analysis of Smart Meter Network Infrastructure

    DEFF Research Database (Denmark)

    Balachandran, Kardi; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2014-01-01

    Advanced Metering Infrastructure (AMI) is a network infrastructure in the Smart Grid which links electricity customers to the utility company. This network enables smart services by making it possible for the utility company to get an overview of their customers' power consumption and also control...... devices in their customers' households, e.g. heat pumps. With these smart services, utility companies can do load balancing on the grid by shifting load using resources the customers have. The problem investigated in this paper is what bandwidth requirements can be expected when implementing such network...... to utilize smart meters and which existing broadband network technologies can facilitate this smart meter service. Initially, scenarios for smart meter infrastructure are identified. The paper defines abstraction models which cover the AMI scenarios. When the scenario has been identified a general overview...
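
A bandwidth requirement analysis of the kind described above reduces, in its simplest form, to arithmetic over message size, reporting interval and meter count. The sketch below is a back-of-envelope version with invented figures (payload size, protocol overhead, interval), not numbers from the paper.

```python
# Back-of-envelope average upstream bandwidth for periodic AMI meter readings.
# All figures are illustrative assumptions, not values from the study.

def required_bps(meters, payload_bytes, overhead_bytes, interval_s):
    """Average upstream bit rate for periodic readings from many meters."""
    bits_per_report = (payload_bytes + overhead_bytes) * 8
    return meters * bits_per_report / interval_s

# 10,000 meters, 100-byte reading + 60 bytes protocol overhead, every 15 min
bps = required_bps(10_000, 100, 60, 15 * 60)
print(f"{bps / 1000:.1f} kbit/s")
```

Such an average hides burstiness (e.g. synchronized reporting or on-demand reads), which is why scenario-based abstraction models as in the paper are needed on top of the raw arithmetic.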

  7. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Rasi, Teemu; Mattila, Jouni [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Siuko, Mikko [VTT, Technical Research Centre of Finland, Tampere (Finland); Esque, Salvador [F4E, Fusion for Energy, Torres Diagonal Litoral B3, Josep Pla2, 08019, Barcelona (Spain); Hamilton, David [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.
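
The decoupling that DDS provides can be illustrated with a toy topic-based publish/subscribe model: publishers and subscribers know only named topics, there is no central broker object, and a history depth stands in for one QoS policy. This is a sketch of the pattern only, with invented classes; it is not the DDS API or wire protocol.

```python
# Minimal topic-based publish/subscribe sketch in the spirit of OMG DDS.
# Only one QoS policy is modeled: a KEEP_LAST-style history depth that lets
# late-joining subscribers receive retained samples.

from collections import deque

class Topic:
    def __init__(self, name, history_depth=1):
        self.name = name
        self.history = deque(maxlen=history_depth)  # retained samples
        self.subscribers = []

class Participant:
    """A subsystem that can publish to or subscribe on topics."""
    def subscribe(self, topic, callback):
        topic.subscribers.append(callback)
        for sample in topic.history:      # replay history to late joiners
            callback(sample)

    def publish(self, topic, sample):
        topic.history.append(sample)
        for callback in topic.subscribers:
            callback(sample)

telemetry = Topic("manipulator/joint_state", history_depth=5)
received = []
Participant().subscribe(telemetry, received.append)
Participant().publish(telemetry, {"joint": 2, "angle_deg": 41.5})
print(received)
```

Because subsystems couple only to topic names and data types, a publisher written on one platform can be replaced without touching its subscribers, which is the interoperability property the DTP2 integration relies on.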

  8. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which so far has not been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology over the content materials. The ontology is further utilized as a search tool to match users' queries with available courses. It is concluded that LMS interoperability in the Semantic Web is feasible.
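
The ontology-backed search described above can be sketched with a tiny subsumption hierarchy: a query for a broad concept also matches courses tagged with narrower ones. The concepts, relations and course codes below are invented examples, not Moodle structures.

```python
# Toy ontology-based course search: topics are linked by subclass-of
# relations, so querying a broad concept also retrieves courses tagged with
# its narrower sub-concepts. All data here is hypothetical.

SUBCLASS_OF = {                      # child concept -> parent concept
    "linear_algebra": "mathematics",
    "calculus": "mathematics",
    "databases": "computer_science",
}

COURSES = {
    "MATH101": {"calculus"},
    "CS200": {"databases"},
}

def broader(concept):
    """Yield the concept and all of its ancestors in the hierarchy."""
    while concept is not None:
        yield concept
        concept = SUBCLASS_OF.get(concept)

def search(query):
    """Courses whose topics match the query directly or via subsumption."""
    return sorted(code for code, topics in COURSES.items()
                  if any(query in broader(t) for t in topics))

print(search("mathematics"))   # matches MATH101 via calculus -> mathematics
```

A plain keyword search over course titles would miss this match, which is the practical gain the semantic layer provides.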

  9. An Emergent Micro-Services Approach to Digital Curation Infrastructure

    OpenAIRE

    Abrams, Stephen; Kunze, John; Loy, David

    2010-01-01

    In order better to meet the needs of its diverse University of California (UC) constituencies, the California Digital Library UC Curation Center is re-envisioning its approach to digital curation infrastructure by devolving function into a set of granular, independent, but interoperable micro-services. Since each of these services is small and self-contained, they are more easily developed, deployed, maintained, and enhanced; at the same time, complex curation function can emerge from the str...

  10. Impact of coalition interoperability on PKI

    Science.gov (United States)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  11. Uganda's National Transmission Backbone Infrastructure Project: Technical Challenges and the Way Forward

    Science.gov (United States)

    Bulega, T.; Kyeyune, A.; Onek, P.; Sseguya, R.; Mbabazi, D.; Katwiremu, E.

    2011-10-01

    Several publications have identified technical challenges facing Uganda's National Transmission Backbone Infrastructure project. This research addresses the technical limitations of the project, evaluates its goals, and compares the results against the technical capability of the backbone. The findings of the study indicate a bandwidth deficit, which will be addressed by using dense wavelength division multiplexing (DWDM) repeaters and by leasing bandwidth from private companies. Microwave links for redundancy, a Network Operation Center for operation and maintenance, and deployment of Worldwide Interoperability for Microwave Access (WiMAX) as a last-mile solution are also suggested.

  12. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

    Full Text Available Objective: to show how standards-based approaches for content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but are also of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221 ''Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale'', which was adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements (based on advanced terminological principles) were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given the fact that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of multilinguality and multimodality based on open standards makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content, as well as the methods, tools and services applied, can be subject to new kinds of certification schemes which should also be based on standards.
Conclusions: Content must be totally reliable in some

  13. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help achieve enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  14. Maturity Model for Advancing Smart Grid Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability is about the ability of devices and systems to connect and work together properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.
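
One way to operationalize a capability-maturity-style assessment is to rate each interoperability category and let the weakest category bound the overall level, since one poor area blocks integration. The categories, ratings, and the "min" rule below are invented for illustration; the actual GWAC model defines its own categories and criteria.

```python
# Toy maturity assessment: per-category ratings 1-5, overall level limited by
# the weakest category. Categories and ratings are hypothetical examples.

RATINGS = {
    "configuration": 3,
    "operation": 4,
    "security": 2,
    "documentation": 3,
}

def maturity_level(ratings):
    """Overall level bounded by the weakest category (min rule)."""
    return min(ratings.values())

def gaps(ratings, target):
    """Categories below the target level, i.e. where to invest next."""
    return sorted(c for c, r in ratings.items() if r < target)

print(maturity_level(RATINGS), gaps(RATINGS, 3))
```

The useful output of such a model is less the single number than the gap list, which turns the assessment into a prioritized improvement plan.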

  15. FLTSATCOM interoperability applications

    Science.gov (United States)

    Woolford, Lynn

    A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  16. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  17. Improving the quality of EMI Releases by leveraging the EMI Testing Infrastructure

    International Nuclear Information System (INIS)

    Aiftimiei, C; Ceccanti, A; Dongiovanni, D; Giacomini, F; Meglio, A Di

    2012-01-01

    What is an EMI Release? What is its life cycle? How is its quality assured through a continuous integration and large scale acceptance testing? These are the main questions that this article will answer, by presenting the EMI release management process with emphasis on the role played by the Testing Infrastructure in improving the quality of the middleware provided by the project. The European Middleware Initiative (EMI) is a close collaboration of four major European technology providers: ARC, gLite, UNICORE and dCache. Its main objective is to deliver a consolidated set of components for deployment in EGI (as part of the Unified Middleware Distribution, UMD), PRACE and other DCIs. The harmonized set of EMI components thus enables the interoperability and integration between Grids. EMI aims at creating an effective environment that satisfies the requirements of the scientific communities relying on it. The EMI distribution is organized in periodic major releases whose development and maintenance follow a 5-phase yearly cycle: i) requirements collection and analysis; ii) development and test planning; iii) software development, testing and certification; iv) release certification and validation and v) release and maintenance. In this article we present in detail the implementation of operational and infrastructural resources supporting the certification and validation phase of the release. The main goal of this phase is to harmonize into a single release the strongly inter-dependent products coming from various development teams through parallel certification paths. To achieve this goal the continuous integration and large scale acceptance testing performed on the EMI Testing Infrastructure plays a key role. The purpose of this infrastructure is to provide a system where both the production and the release candidate product versions are deployed. On this system inter-component testing by different product team testers can concurrently take place. The Testing

  18. Assessing the Robustness of Green Infrastructure under Stochastic Design Storms and Climate Change Scenarios

    Science.gov (United States)

    Chui, T. F. M.; Yang, Y.

    2017-12-01

    Green infrastructures (GI) have been widely used to mitigate flood risk, improve surface water quality, and restore predevelopment hydrologic regimes. Commonly used GI include bioretention systems, porous pavements and green roofs. They are normally sized to fulfil different design criteria (e.g. providing certain storage depths, limiting peak surface flow rates) that are formulated for current climate conditions. While GI commonly have long lifespans, the sensitivity of their performance to climate change is unclear. This study first proposes a method to formulate suitable design criteria to meet different management interests (e.g. different levels of first flush reduction and peak flow reduction). Then typical designs of GI are proposed. In addition, a high-resolution stochastic design storm generator using copulas and a random cascade model is developed and calibrated using recorded rainfall time series. A few climate change scenarios are then generated by varying the duration and depth of design storms and changing the parameters of the calibrated storm generator. Finally, the performance of the typical GI designs under the randomly synthesized design storms is assessed using numerical modeling. The robustness of the designs is obtained by comparing their performance in the future scenarios to the current one. Overall, this study examines the robustness of current GI design criteria under uncertain future climate conditions, demonstrating whether they should be modified to account for climate change.
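
The random-cascade idea behind such storm generators can be sketched simply: a storm total is disaggregated by repeatedly splitting each interval in two with a random weight, preserving the total depth while producing intermittent fine-scale structure. The split distribution and parameters below are illustrative assumptions, not those of the calibrated generator in the study.

```python
# Sketch of a multiplicative random cascade for design-storm disaggregation.
# A total depth is split into 2**levels intervals; each split conserves mass.
# The Beta(2, 2) splitting weight is an illustrative choice.

import random

def cascade(total_depth, levels, rng):
    """Disaggregate a storm depth into 2**levels sub-interval depths."""
    depths = [total_depth]
    for _ in range(levels):
        nxt = []
        for d in depths:
            w = rng.betavariate(2, 2)   # symmetric random splitting weight
            nxt.extend([d * w, d * (1 - w)])
        depths = nxt
    return depths

rng = random.Random(42)
series = cascade(40.0, 4, rng)        # a 40 mm storm into 16 intervals
print(len(series), round(sum(series), 6))
```

Calibration in practice means choosing the splitting distribution so that the synthetic series reproduces observed statistics (e.g. intermittency and extremes) of the recorded rainfall.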

  19. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things

    Directory of Open Access Journals (Sweden)

    Alexander Kotsev

    2016-03-01

    Full Text Available The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.

  20. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things.

    Science.gov (United States)

    Kotsev, Alexander; Schade, Sven; Craglia, Massimo; Gerboles, Michel; Spinelle, Laurent; Signorini, Marco

    2016-03-18

    The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.
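
Interoperable data management for sensor platforms like the one described above hinges on publishing readings as self-describing observation records rather than bare numbers. The sketch below is loosely inspired by OGC observation models; the field names and the sensor identifier are illustrative, not a standard encoding or the AirSensEUR format.

```python
# Sketch of encoding an air-quality reading as a self-describing JSON
# observation record. Field names are illustrative assumptions.

import json
from datetime import datetime, timezone

def make_observation(sensor_id, pollutant, value, unit):
    """Bundle a raw reading with the metadata a consumer needs to reuse it."""
    return {
        "sensor": sensor_id,
        "observedProperty": pollutant,
        "result": {"value": value, "unit": unit},
        "phenomenonTime": datetime.now(timezone.utc).isoformat(),
    }

obs = make_observation("airsenseur-07", "NO2", 18.4, "ug/m3")
payload = json.dumps(obs)
print(payload)
```

Because the unit and observed property travel with each value, downstream services can validate and convert data without out-of-band agreements with every sensor operator.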

  1. Cloud Environment Automation: from infrastructure deployment to application monitoring

    Science.gov (United States)

    Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.

    2017-10-01

    The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for infrastructure maintainers. This paper presents the research activity, carried out during the Open City Platform (OCP) research project [1], aimed at designing and developing an automatic tool for cloud-based IaaS deployment. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It intends to research, develop and test new technological solutions that are open, interoperable and usable on demand in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.

  2. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  3. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  4. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  5. Standards to open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

    Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability, as the way to accomplish data exchange and service collaboration, requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.
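
    The open standards referred to above commonly include the OAI-PMH harvesting protocol. As a hedged illustration (the endpoint URL and set name below are hypothetical, and OAI-PMH is named here as a typical example rather than confirmed content of the paper), a ListRecords request is just a base URL plus query parameters:

```python
from urllib.parse import urlencode

def build_oai_request(base_url, verb="ListRecords", metadata_prefix="oai_dc", set_spec=None):
    """Build an OAI-PMH harvesting request as a simple GET URL."""
    params = {"verb": verb, "metadataPrefix": metadata_prefix}
    if set_spec is not None:
        params["set"] = set_spec  # optional selective-harvesting argument
    return base_url + "?" + urlencode(params)

# Hypothetical repository endpoint and set name:
url = build_oai_request("https://repository.example.org/oai", set_spec="theses")
```

    A harvester would issue this GET request repeatedly, following the protocol's resumption tokens, to pull Dublin Core records out of the repository.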

  6. An interoperable research data infrastructure to support climate service development

    Science.gov (United States)

    De Filippis, Tiziana; Rocchi, Leandro; Rapisardi, Elena

    2018-02-01

    Accessibility, availability, re-use and re-distribution of scientific data are prerequisites to build climate services across Europe. From this perspective the Institute of Biometeorology of the National Research Council (IBIMET-CNR), aiming at contributing to the sharing and integration of research data, has developed a research data infrastructure to support the scientific activities conducted in several national and international research projects. The proposed architecture uses open-source tools to ensure sustainability in the development and deployment of Web applications with geographic features and data analysis functionalities. The spatial data infrastructure components are organized in a typical client-server architecture and interact throughout the workflow, from the data provider's download process to the representation of results to end users. The availability of structured raw data as customized information paves the way for building climate service purveyors to support adaptation, mitigation and risk management at different scales. This work is a bottom-up collaborative initiative between different IBIMET-CNR research units (e.g. geomatics and information and communication technology - ICT; agricultural sustainability; international cooperation in least developed countries - LDCs) that embrace the same approach for sharing and re-use of research data and informatics solutions based on co-design, co-development and co-evaluation among different actors to support the production and application of climate services. During the development phase of Web applications, different users (internal and external) were involved in the whole process so as to better define user needs and suggest the implementation of specific custom functionalities. Indeed, the services are addressed to researchers, academics, public institutions and agencies - practitioners who can access data and findings from recent research in the field of applied meteorology and climatology.

  7. Comprehensive scenario management of sustainable spatial planning and urban water services.

    Science.gov (United States)

    Baron, Silja; Hoek, Jannis; Kaufmann Alves, Inka; Herz, Sabine

    2016-01-01

    Adaptations of existing central water supply and wastewater disposal systems to demographic, climatic and socioeconomic changes require a profound knowledge about changing influencing factors. The paper presents a scenario management approach for the identification of future developments of drivers influencing water infrastructures. This method is designed within a research project with the objective of developing an innovative software-based optimisation and decision support system for long-term transformations of existing infrastructures of water supply, wastewater and energy in rural areas. Drivers of water infrastructures comprise engineering and spatial factors and these are predicted by different methods and techniques. The calculated developments of the drivers are illustrated for a model municipality. The developed scenario-manager enables the generation of comprehensive scenarios by combining different drivers. The scenarios are integrated into the optimisation model as input parameters. Furthermore, the result of the optimisation process - an optimal transformation strategy for water infrastructures - can have impacts on the existing fee system. General adaptation possibilities of the present fee system are presented.

  8. A technological infrastructure to sustain Internetworked Enterprises

    Science.gov (United States)

    La Mattina, Ernesto; Savarino, Vincenzo; Vicari, Claudia; Storelli, Davide; Bianchini, Devis

    In the Web 3.0 scenario, where information and services are connected by means of their semantics, organizations can improve their competitive advantage by publishing their business and service descriptions. In this scenario, Semantic Peer to Peer (P2P) can play a key role in defining dynamic and highly reconfigurable infrastructures. Organizations can share knowledge and services, using this infrastructure to move towards value networks, an emerging organizational model characterized by fluid boundaries and complex relationships. This chapter collects and defines the technological requirements and architecture of a modular, multi-layer P2P infrastructure for SOA-based applications. This technological infrastructure, based on the combination of Semantic Web and P2P technologies, is intended to sustain Internetworked Enterprise configurations, defining a distributed registry and enabling more expressive queries and efficient routing mechanisms. The following sections focus on the overall architecture, while describing the layers that form it.

  9. An infrastructure for ontology-based information systems in biomedicine: RICORDO case study.

    Science.gov (United States)

    Wimalaratne, Sarala M; Grenon, Pierre; Hoehndorf, Robert; Gkoutos, Georgios V; de Bono, Bernard

    2012-02-01

    The article presents an infrastructure for supporting the semantic interoperability of biomedical resources based on the management (storing and inference-based querying) of their ontology-based annotations. This infrastructure consists of: (i) a repository to store and query ontology-based annotations; (ii) a knowledge base server with an inference engine to support the storage of and reasoning over ontologies used in the annotation of resources; (iii) a set of applications and services allowing interaction with the integrated repository and knowledge base. The infrastructure is being prototyped, developed and evaluated by the RICORDO project in support of the knowledge management of biomedical resources, including physiology and pharmacology models and associated clinical data. The RICORDO toolkit and its source code are freely available from http://ricordo.eu/relevant-resources. sarala@ebi.ac.uk.

  10. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily life. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  11. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily life. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
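
    The SensorThings tasking extension mentioned above models device control as Task entities bound to previously registered TaskingCapabilities. A rough sketch of assembling such a task payload, with the entity shape simplified and the ID and parameters purely hypothetical:

```python
import json

def make_task(capability_id, parameters):
    """Assemble a Task payload in the style of OGC SensorThings API
    Part 2 (Tasking Core): a Task binds concrete taskingParameters to a
    previously registered TaskingCapability (shape simplified here)."""
    return {
        "taskingParameters": parameters,
        "TaskingCapability": {"@iot.id": capability_id},
    }

# Hypothetical example: ask a connected lamp to switch on at 50% brightness.
payload = make_task(42, {"state": "on", "level": 50})
body = json.dumps(payload)
# A live deployment would POST this body to the service's Tasks collection.
```

    The uniform interface is the point: the same JSON shape addresses a lamp, a valve or a pump, with the device-specific vocabulary confined to the taskingParameters object.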

  12. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, historically grown IT systems are mostly application islands. They produce good results as long as the system's requirements and surroundings do not change and no interface to other systems is needed. With the ever-increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the need of the hour, presupposing the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA are examined with regard to their practicability. It is shown that SOA in particular produces a surge of interoperability that could rightly be referred to as an IT evolution.
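
    Of the IO enablers named above, ETL is the most concrete. A toy extract-transform-load pass between two "application islands" can make the pattern tangible (all field names, records and the target schema are hypothetical):

```python
def extract(source):
    """Extract: read raw rows from the source system (a list stands in
    for a legacy application island)."""
    return list(source)

def transform(rows):
    """Transform: normalize field names and types into the shared schema."""
    return [{"customer_id": int(r["id"]), "name": r["name"].strip().title()}
            for r in rows]

def load(rows, target):
    """Load: append the harmonized rows into the target store."""
    target.extend(rows)
    return len(rows)

legacy = [{"id": "7", "name": "  smith, anna "},
          {"id": "8", "name": "NGUYEN, bao"}]
warehouse = []
load(transform(extract(legacy)), warehouse)
```

    EAI and SOA differ mainly in where this mapping logic lives: in a central broker, or behind service interfaces that each island publishes itself.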

  13. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). The EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  14. Scenario-based resilience assessment framework for critical infrastructure systems: Case study for seismic resilience of seaports

    International Nuclear Information System (INIS)

    Shafieezadeh, Abdollah; Ivey Burden, Lindsay

    2014-01-01

    A number of metrics have been proposed and numerically implemented in the past to assess the overall performance of large systems during natural disasters and their recovery in the aftermath of such events. Among these performance measures, resilience is a reliable metric. This paper proposes a probabilistic framework for scenario-based resilience assessment of infrastructure systems. The method accounts for uncertainties in the process, including the correlation of earthquake intensity measures, fragility assessment of structural components, estimation of repair requirements, the repair process, and finally the service demands. The proposed method is applied to a hypothetical seaport terminal, and the system-level performance of the seaport is assessed using various performance metrics. Results of this analysis show that medium to large seismic events may significantly disrupt the operation of seaports right after the event, and the recovery process may take months. The proposed framework will enable port stakeholders to systematically assess the most likely performance of the system during expected future earthquake events. - Highlights: • A scenario-based framework for seismic resilience assessment of systems is presented. • Seismic resilience of a hypothetical seaport with realistic settings is studied. • Berth availability is found to govern seaport functionality following earthquakes
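
    The kind of scenario-based probabilistic assessment described above can be sketched as a small Monte Carlo simulation: sample which components are damaged from fragility probabilities, accumulate repair time, and score resilience over a recovery horizon. All probabilities, repair times and the serial-repair assumption below are illustrative, not taken from the paper:

```python
import random

def simulate_resilience(p_damage, repair_days, horizon_days, trials=10000, seed=1):
    """Monte Carlo sketch: each trial samples which components are damaged
    from their fragility probabilities, sums their repair times (repairs
    assumed serial), and scores resilience as the mean fraction of the
    recovery horizon spent fully operational."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        downtime = sum(days for p, days in zip(p_damage, repair_days)
                       if rng.random() < p)
        total += 1.0 - min(downtime, horizon_days) / horizon_days
    return total / trials

# Three berths with increasing fragility and repair requirements.
r = simulate_resilience(p_damage=[0.2, 0.5, 0.8],
                        repair_days=[30, 60, 90],
                        horizon_days=365)
```

    A full framework would replace the Bernoulli draws with correlated intensity measures and multi-state fragility curves, but the sampling structure is the same.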

  15. An interoperable research data infrastructure to support climate service development

    Directory of Open Access Journals (Sweden)

    T. De Filippis

    2018-02-01

    Full Text Available Accessibility, availability, re-use and re-distribution of scientific data are prerequisites to build climate services across Europe. From this perspective the Institute of Biometeorology of the National Research Council (IBIMET-CNR), aiming at contributing to the sharing and integration of research data, has developed a research data infrastructure to support the scientific activities conducted in several national and international research projects. The proposed architecture uses open-source tools to ensure sustainability in the development and deployment of Web applications with geographic features and data analysis functionalities. The spatial data infrastructure components are organized in a typical client–server architecture and interact throughout the workflow, from the data provider's download process to the representation of results to end users. The availability of structured raw data as customized information paves the way for building climate service purveyors to support adaptation, mitigation and risk management at different scales. This work is a bottom-up collaborative initiative between different IBIMET-CNR research units (e.g. geomatics and information and communication technology – ICT; agricultural sustainability; international cooperation in least developed countries – LDCs) that embrace the same approach for sharing and re-use of research data and informatics solutions based on co-design, co-development and co-evaluation among different actors to support the production and application of climate services. During the development phase of Web applications, different users (internal and external) were involved in the whole process so as to better define user needs and suggest the implementation of specific custom functionalities. Indeed, the services are addressed to researchers, academics, public institutions and agencies – practitioners who can access data and findings from recent research in the field of applied meteorology and climatology.

  16. IoT interoperability : a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  17. Global evaluation of nuclear infrastructure utilization scenarios (GENIUS)

    International Nuclear Information System (INIS)

    Dunzik-Gougar, Mary Lou; Juchau, Christopher A.; Pasamehmetoglu, Kemal; Wilson, Paul P.H.; Oliver, Kyle M.; Turinsky, Paul J.; Abdel-Khalik, Hany S.; Hays, Ross; Stover, Tracy E.

    2007-01-01

    A new and unique fuel cycle systems code has been developed. The need for this analysis tool was established via the methodical development of technical functions and requirements, followed by an evaluation of existing fuel cycle codes. As demonstrated by analysis of GNEP-type scenarios, the GENIUS code discretely tracks nuclear material from the beginning to the end of the fuel cycle and among any number of independent regions. Users can define scenarios starting with any or all existing reactors and fuel cycle facilities, or with an ideal futuristic arrangement. Development and preliminary application of GENIUS capabilities in uncertainty analysis/propagation and multi-parameter optimization have also been accomplished. (authors)

  18. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The profiles of the Integrating the Healthcare Enterprise (IHE) initiative make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security is so far an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance the IHE profiles have achieved so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  19. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. The interoperability chain thereby includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach, mastering ontology coordination challenges.

  20. Underground infrastructure damage for a Chicago scenario

    Energy Technology Data Exchange (ETDEWEB)

    Dey, Thomas N [Los Alamos National Laboratory]; Bos, Randall J [Los Alamos National Laboratory]

    2011-01-25

    Estimating the effects of an urban IND (improvised nuclear device) on underground structures and underground utilities is a challenging task. Nuclear effects tests performed at the Nevada Test Site (NTS) during the era of nuclear weapons testing provide much information on how underground military structures respond. Transferring this knowledge to answer questions about the urban civilian environment is needed to help plan responses to IND scenarios. Explosions just above the ground surface can couple only a small fraction of the blast energy into an underground shock, and the various forms of nuclear radiation have limited penetration into the ground. While the shock transmitted into the ground carries only a small fraction of the blast energy, peak stresses are generally higher, and peak ground displacement is lower, than in the air blast. While underground military structures are often designed to resist stresses substantially higher than those due to the overlying rocks and soils (overburden), civilian structures such as subways and tunnels would generally only need to resist overburden conditions with a suitable safety factor. Just as we expect the buildings themselves to channel and shield air blast above ground, basements and other underground openings, as well as changes of geology, will channel and shield the underground shock wave. While a weaker shock is expected in an urban environment, small displacements on very close-by faults and, more likely, soils being displaced past building foundations where utility lines enter could readily damage or disable these services. Immediately near an explosion, the blast can 'liquefy' a saturated soil, creating a quicksand-like condition for a period of time. We extrapolate the nuclear effects experience to a Chicago-based scenario. We consider the TARP (Tunnel and Reservoir Project) and subway system and the underground lifeline (electric, gas, water, etc.) system and provide guidance for planning this scenario.

  1. Geographically Based Hydrogen Consumer Demand and Infrastructure Analysis: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Melendez, M.; Milbrandt, A.

    2006-10-01

    In FY 2004 and 2005, NREL developed a proposed minimal infrastructure to support nationwide deployment of hydrogen vehicles, offering infrastructure scenarios that facilitate interstate travel. This report identifies key metropolitan areas and regions on which to focus infrastructure efforts during the early hydrogen transition.

  2. Mars base buildup scenarios

    International Nuclear Information System (INIS)

    Blacic, J.D.

    1985-01-01

    Two surface base buildup scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions, and scientific investigation is the main driver and rationale. In the second scenario, early development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is delayed at first but, once begun, develops rapidly, aided by the presence of a permanently manned orbital station

  3. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface, so users can leverage the Google Refine application as a friendly user interface while linking vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies, describing each SAMOS vocabulary term (data parameter and quality control flag) as an RDF resource page and thereby enabling enhanced discoverability and retrieval of SAMOS data through parameter-based searches; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies, in collaboration with ODIP participating organizations, to build a generalized data model used to populate a SPARQL endpoint providing expressive querying over data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners (described more fully in a companion poster); and (5) making published Linked Data
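
    A vocabulary matching service of the kind the first student project describes can be sketched as a tiny reconciliation routine in the spirit of the Google Refine (OpenRefine) reconciliation API: given a query term, rank controlled-vocabulary entries by string similarity. The vocabulary terms and URIs below are illustrative, not the real SAMOS or NVS entries:

```python
from difflib import SequenceMatcher

# Toy controlled vocabulary: local term -> shared concept URI
# (terms and URIs are illustrative, not real vocabulary entries).
VOCAB = {
    "sea surface temperature": "http://vocab.example.org/P01/TEMP",
    "air temperature": "http://vocab.example.org/P01/ATEMP",
    "wind speed": "http://vocab.example.org/P01/WSPD",
}

def reconcile(query, threshold=0.8):
    """Rank vocabulary entries against a query term, OpenRefine-style:
    each candidate carries a similarity score and a boolean 'match'
    flag for confident hits."""
    candidates = []
    for term, uri in VOCAB.items():
        score = SequenceMatcher(None, query.lower(), term).ratio()
        candidates.append({"id": uri, "name": term,
                           "score": round(score, 2),
                           "match": score >= threshold})
    return sorted(candidates, key=lambda c: c["score"], reverse=True)

best = reconcile("Sea Surface Temp.")[0]
```

    A production service would wrap this in the reconciliation API's JSON request/response envelope and use curated synonym lists rather than raw string similarity, but the candidate-scoring shape is the same.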

  4. Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges

    Science.gov (United States)

    Ryan, B. J.

    2015-12-01

    Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and to approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations

  5. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership". Reporting period: Sept 2016 to 20 Sept 2017. The project promotes patient safety and healthcare efficiency through interoperable medical technologies, and played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  6. The strategy for the development of information society in Serbia by 2020: Information security and critical infrastructure

    Directory of Open Access Journals (Sweden)

    Danijela D. Protić

    2012-10-01

    Full Text Available The development of technology has changed the world economy and induced new political trends. The European Union (EU) and many non-EU member states apply strategies of information society development that raise the level of information security (IS). The Serbian Government (Government) has adopted the Strategy for Information Society in Serbia by 2020 (Strategy), and pointed to the challenges for the development of a modern Serbian information society. This paper presents an overview of the open-ended questions about IS, critical infrastructures and the protection of critical infrastructures. Based on publicly available data, some critical national infrastructures are listed. As a possible solution to the problem of IS, the Public Key Infrastructure (PKI)-based Information Security Integrated Information System (ISIIS) is presented. The ISIIS provides modularity and interoperability of critical infrastructures both in Serbia and neighboring countries.
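
    The chain-of-trust logic underlying a PKI such as the proposed ISIIS can be modelled in a deliberately simplified form: no real cryptography, only the issuer-walk from an end-entity certificate up to a trusted root. All names below are illustrative:

```python
# Toy certificate store: each entry records only who signed it. Real PKI
# adds signatures, validity periods and revocation checks; this sketch
# keeps only the issuer-walk logic. All names are illustrative.
CERTS = {
    "eservice.example.rs": {"issuer": "intermediate-ca"},
    "intermediate-ca": {"issuer": "root-ca"},
    "root-ca": {"issuer": "root-ca"},  # self-signed trust anchor
}

def chain_to_root(subject, trusted_roots, max_depth=5):
    """Follow issuer links until a self-signed certificate is reached;
    accept the chain only if that root is explicitly trusted."""
    chain = [subject]
    for _ in range(max_depth):
        issuer = CERTS[chain[-1]]["issuer"]
        if issuer == chain[-1]:  # self-signed: end of the chain
            return chain if issuer in trusted_roots else None
        chain.append(issuer)
    return None  # chain too long or cyclic

chain = chain_to_root("eservice.example.rs", trusted_roots={"root-ca"})
```

    Interoperability between national infrastructures then reduces, in this model, to agreeing on which root CAs each side places in its trusted set.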

  7. Building a Disciplinary, World‐Wide Data Infrastructure

    Directory of Open Access Journals (Sweden)

    Françoise Genova

    2017-04-01

    Full Text Available Sharing scientific data with the objective of making it discoverable, accessible, reusable, and interoperable requires work and presents challenges being faced at the disciplinary level to define in particular how the data should be formatted and described. This paper represents the Proceedings of a session held at SciDataCon 2016 (Denver, 12–13 September 2016). It explores the way a range of disciplines, namely materials science, crystallography, astronomy, earth sciences, humanities and linguistics, get organized at the international level to address those challenges. The disciplinary culture with respect to data sharing, science drivers, organization, lessons learnt and the elements of the data infrastructure which are or could be shared with others are briefly described. Commonalities and differences are assessed. Common key elements for success are identified: data sharing should be science driven; defining the disciplinary part of the interdisciplinary standards is mandatory but challenging; sharing of applications should accompany data sharing. Incentives such as journal and funding agency requirements are also similar. For all, social aspects are more challenging than technological ones. Governance is more diverse, often specific to the discipline organization. Being problem‐driven is also a key factor of success for building bridges to enable interdisciplinary research. Several international data organizations such as CODATA, RDA and WDS can facilitate the establishment of disciplinary interoperability frameworks. As a spin‐off of the session, a RDA Disciplinary Interoperability Interest Group is proposed to bring together representatives across disciplines to better organize and drive the discussion for prioritizing, harmonizing and efficiently articulating disciplinary needs.

  8. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT / eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – the methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  10. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the interoperability challenge, there is a lack of empirical work about the overall situation of actual challenges. We conduct...

  11. Policy Model of Sustainable Infrastructure Development (Case Study : Bandarlampung City, Indonesia)

    Science.gov (United States)

    Persada, C.; Sitorus, S. R. P.; Marimin; Djakapermana, R. D.

    2018-03-01

    Infrastructure development affects not only the economic aspect but also the social and environmental ones, which are the main dimensions of sustainable development. The many aspects and actors involved in urban infrastructure development require a comprehensive and integrated policy towards sustainability. Therefore, it is necessary to formulate an infrastructure development policy that considers the various dimensions of sustainable development. The main objective of this research is to formulate a policy for sustainable infrastructure development. In this research, urban infrastructure covers transportation, water systems (drinking water, storm water, wastewater), green open spaces and solid waste. The research was conducted in Bandarlampung City. The study uses comprehensive modelling, namely Multi Dimensional Scaling (MDS) with Rapid Appraisal of Infrastructure (Rapinfra), the Analytic Network Process (ANP) and a system dynamics model. The findings of the MDS analysis showed that the infrastructure sustainability status of Bandarlampung City is less sustainable. The ANP analysis produced the 8 most influential indicators in the development of sustainable infrastructure. The system dynamics model offered 4 scenarios for a sustainable urban infrastructure policy. The best scenario was implemented through 3 policies: integrated infrastructure management, population control, and local economy development.
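
    The appraisal logic can be sketched as follows. The dimension scores and the 0-100 category thresholds below are hypothetical stand-ins, not the paper's actual Rapinfra results; they only illustrate how a "less sustainable" status is read off an ordination index.

```python
# Hypothetical per-dimension sustainability indices on a 0-100 scale.
dimension_scores = {
    "ecological": 48.2,
    "economic": 52.7,
    "social": 44.9,
    "infrastructure": 46.5,
    "institutional": 41.3,
}

def status(index: float) -> str:
    """Map a 0-100 index to a qualitative sustainability status
    (thresholds follow the convention of Rapfish-style appraisals)."""
    if index < 25:
        return "unsustainable"
    if index < 50:
        return "less sustainable"
    if index < 75:
        return "sufficiently sustainable"
    return "sustainable"

overall = sum(dimension_scores.values()) / len(dimension_scores)
print(f"overall index = {overall:.2f} -> {status(overall)}")
```

    With these invented scores the overall index lands just below 50, which is how an aggregate "less sustainable" verdict like the paper's arises even when individual dimensions straddle the threshold.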

  12. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    Science.gov (United States)

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper comprise the characterization and examination of the potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper represents an in-depth analysis regarding the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research and charting of existing experience in the field, and on sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper tries to answer the following complementary inquiries: 1. Scrutiny of the potential approaches which could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analysis of the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting of the main success factors in the interoperability field that critically influence the development and implementation of eHealth projects in an efficient manner. The provided insights and identified success factors could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system, especially in the countries

  13. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Directory of Open Access Journals (Sweden)

    Brígida Teixeira

    2018-02-01

    Full Text Available Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm that arises from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of the distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems that are capable of dealing with the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that allows interoperability between heterogeneous energy and power simulation systems through the use of ontologies, allowing the simulation of scenarios with a high degree of complexity through the cooperation of the individual capacities of each system. A case study based on real data is presented in order to demonstrate the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid of a real university campus, from the perspective of the network manager and also of its consumers/producers, in a projection for a typical day of the winter of 2050.

  14. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Science.gov (United States)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    encoding. However, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning. By using these identifiers in data, references to specific information items can be made unique and unambiguous. It is important that these interoperability solutions are not developed in isolation - for Europe only. This has been identified from the beginning, and therefore, international standards have been taken into account and been widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases, it is difficult to choose the appropriate international organization or standardisation body (e.g. where there are several organizations overlapping in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only a beginning of the effort to make environmental data interoperable. Their actual implementation by data providers across Europe, as well as the rapid development in the earth sciences (e.g. from new simulation models, scientific advances, etc.) and ICT technology will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone. 
It is therefore crucial to closely involve relevant (where possible, umbrella) organisations in the

  15. Government Services Information Infrastructure Management

    Energy Technology Data Exchange (ETDEWEB)

    Cavallini, J.S.; Aiken, R.J.

    1995-04-01

    The Government Services Information Infrastructure (GSII) is that portion of the NII used to link Government and its services; it enables virtual agency concepts, protects privacy, and supports emergency preparedness needs. The GSII comprises the supporting telecommunications technologies, the network and information services infrastructure, and the applications that use these. The GSII is an enlightened attempt by the Clinton/Gore Administration to form a virtual government crossing agency boundaries to interoperate more closely with industry and with the public to greatly improve the delivery of government services. The GSII and other private sector efforts will have a significant impact on the design, development, and deployment of the NII, even if only through the procurement of such services. The Federal Government must adopt new mechanisms and new paradigms for the management of the GSII, including improved acquisition and operation of GSII components, in order to maximize benefits. Government requirements and applications will continue to evolve. Government services and users should form affinity groups that more accurately and effectively define these common requirements, drive the adoption and use of industry standards, and provide a significant technology marketplace.

  16. Technical Infrastructure of the COSMOS Portal

    Directory of Open Access Journals (Sweden)

    Nikolaos Doulamis

    2008-12-01

    Full Text Available This paper presents the main operations and technologies implemented in the framework of the EU-funded COSMOS project. COSMOS introduces an advanced web repository which allows teachers and students to search, retrieve and access educational content and to re-use educational material for creating learning activities through a specifically designed web interface incorporating innovative technological solutions. The repository is based on an IEEE LOM representation of the content, which supports educational scenarios and learning activities as well. The architecture also supports tools for describing and managing digital content rights, which are interoperably represented using the Creative Commons Rights Expression Language (ccREL).
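
    A minimal sketch of what such a record might look like, and how a repository could read it: the element names below follow the IEEE LOM schema, while the title, keyword and license URI are invented examples rather than actual COSMOS content.

```python
import xml.etree.ElementTree as ET

# Hypothetical IEEE LOM record with a Creative Commons rights statement.
LOM_NS = "http://ltsc.ieee.org/xsd/LOM"
record = f"""
<lom xmlns="{LOM_NS}">
  <general>
    <title><string language="en">Observing Variable Stars</string></title>
    <keyword><string language="en">astronomy</string></keyword>
  </general>
  <rights>
    <description>
      <string>http://creativecommons.org/licenses/by-nc-sa/3.0/</string>
    </description>
  </rights>
</lom>
"""

root = ET.fromstring(record)
ns = {"lom": LOM_NS}
# Extract the fields a search interface would index.
title = root.findtext("lom:general/lom:title/lom:string", namespaces=ns)
license_uri = root.findtext("lom:rights/lom:description/lom:string",
                            namespaces=ns)
print(title, "-", license_uri)
```

    Keeping the rights statement inside the same LOM record is what lets search results carry their re-use conditions along with the content description.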

  17. Enabling European Archaeological Research: The ARIADNE E-Infrastructure

    Directory of Open Access Journals (Sweden)

    Nicola Aloia

    2017-03-01

    Full Text Available Research e-infrastructures, digital archives and data services have become important pillars of scientific enterprise that in recent decades has become ever more collaborative, distributed and data-intensive. The archaeological research community has been an early adopter of digital tools for data acquisition, organisation, analysis and presentation of research results of individual projects. However, the provision of e-infrastructure and services for data sharing, discovery, access and re-use has lagged behind. This situation is being addressed by ARIADNE: the Advanced Research Infrastructure for Archaeological Dataset Networking in Europe. This EU-funded network has developed an e-infrastructure that enables data providers to register and provide access to their resources (datasets, collections) through the ARIADNE data portal, facilitating discovery, access and other services across the integrated resources. This article describes the current landscape of data repositories and services for archaeologists in Europe, and the issues that make interoperability between them difficult to realise. The results of the ARIADNE surveys on users' expectations and requirements are also presented. The main section of the article describes the architecture of the e-infrastructure, core services (data registration, discovery and access) and various other extant or experimental services. The on-going evaluation of the data integration and services is also discussed. Finally, the article summarises lessons learned, and outlines the prospects for the wider engagement of the archaeological research community in sharing data through ARIADNE.

  18. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at NATO level. In fact, interoperability and more specifically standardization, has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in a continuous change, the need to face the new threats and the current involvement in challenging operations in Afghanistan and elsewhere alongside with the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect Interoperability Integration within NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance’s full spectrum of missions.

  19. The EPOS e-Infrastructure

    Science.gov (United States)

    Jeffery, Keith; Bailo, Daniele

    2014-05-01

    The European Plate Observing System (EPOS) is integrating geoscientific information concerning earth movements in Europe. We are approaching the end of the PP (Preparatory Project) phase and in October 2014 expect to continue with the full project within ESFRI (European Strategy Forum on Research Infrastructures). The key aspects of EPOS concern providing services to allow homogeneous access by end-users over heterogeneous data, software, facilities, equipment and services. The e-infrastructure of EPOS is the heart of the project since it integrates the work on organisational, legal, economic and scientific aspects. Following the creation of an inventory of relevant organisations, persons, facilities, equipment, services, datasets and software (RIDE), the scale of integration required became apparent. The EPOS e-infrastructure architecture has been developed systematically based on recorded primary (user) requirements and secondary (interoperation with other systems) requirements through Strawman, Woodman and Ironman phases, with the specification - and developed confirmatory prototypes - becoming more precise and progressively moving from paper to implemented system. The EPOS architecture is based on global core services (Integrated Core Services - ICS) which access thematic nodes (domain-specific Europe-wide collections, called Thematic Core Services - TCS), national nodes and specific institutional nodes. The key aspect is the metadata catalog. In one dimension this is described in 3 levels: (1) discovery metadata using well-known and commonly used standards such as DC (Dublin Core) to enable users (via an intelligent user interface) to search for objects within the EPOS environment relevant to their needs; (2) contextual metadata providing the context of the object described in the catalog to enable a user or the system to determine the relevance of the discovered object(s) to their requirement - the context includes projects, funding, organisations
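
    Level-1 discovery over such a catalog can be sketched with Dublin Core element names. The records and the search routine below are invented illustrations of the principle, not actual EPOS catalog content or its API.

```python
# Toy discovery-metadata catalog using Dublin Core element names.
catalog = [
    {"dc:title": "GPS displacement time series, Apennines",
     "dc:subject": ["geodesy", "earth movements"],
     "dc:creator": "Example Observatory"},
    {"dc:title": "Broadband seismic waveforms 2013",
     "dc:subject": ["seismology"],
     "dc:creator": "Example Network"},
]

def discover(keyword: str):
    """Return records whose title or subject terms mention the keyword."""
    kw = keyword.lower()
    return [r for r in catalog
            if kw in r["dc:title"].lower()
            or any(kw in s.lower() for s in r["dc:subject"])]

hits = discover("seismology")
print([r["dc:title"] for r in hits])
```

    The point of the layered design is that this cheap keyword-level search runs over small discovery records only; contextual metadata is consulted afterwards, once candidate objects have been found.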

  20. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of a standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.
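
    The metadata-driven validation idea, checking an instance document against rules derived from registered metadata, can be sketched as follows. The rule list and the XML fragment are hypothetical stand-ins, not the actual 188 registered ASTM CCR metadata elements.

```python
import xml.etree.ElementTree as ET

# Required element paths, standing in for rules generated from a
# metadata registry (invented for illustration).
required_paths = ["Body/Problems", "Body/Medications", "Actors"]

# A hypothetical, incomplete instance document.
doc = ET.fromstring("""
<ContinuityOfCareRecord>
  <Body>
    <Problems/>
    <Medications/>
  </Body>
</ContinuityOfCareRecord>
""")

def validate(root, paths):
    """Return the required paths missing from the instance document."""
    return [p for p in paths if root.find(p) is None]

missing = validate(doc, required_paths)
print("missing elements:", missing)   # the Actors section is absent
```

    Because the rules are generated from the registry rather than hard-coded, the same validator can check both the standard model and any registered extension of it.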

  1. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    Science.gov (United States)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. The framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make the integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success as far as improved efficiency and level of automation are concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
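
    The role of the bridge ontology can be sketched as follows. The vocabularies, concept mappings and term names are invented illustrations of the idea that two datasets should only report a change when their bridge-level concepts differ, not merely their source vocabularies.

```python
# Hypothetical mapping from domain terms to shared bridge-ontology concepts.
bridge = {
    "topo:BuildingFootprint": "Built-up",
    "landuse:Residential":    "Built-up",
    "topo:PaddyField":        "Agricultural",
    "landuse:Farmland":       "Agricultural",
}

def semantic_change(term_t1: str, term_t2: str) -> bool:
    """Report a change only when the shared bridge concepts differ."""
    return bridge[term_t1] != bridge[term_t2]

# Different source vocabularies but the same concept -> no change:
print(semantic_change("topo:PaddyField", "landuse:Farmland"))         # False
# Concepts differ -> candidate land-use change:
print(semantic_change("landuse:Farmland", "topo:BuildingFootprint"))  # True
```

    Without the bridge level, a naive string comparison of the two classifications would flag every parcel as changed whenever the two surveys use different vocabularies.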

  2. Author identities an interoperability problem solved by a collaborative solution

    Science.gov (United States)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is crowded and the right choice is getting more and more complicated. Even though there are more than 15 different systems available, some are still under development and were proposed to launch by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute, on the scale of a scientific site including a university with a student education programme, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the rather frequent change of positions during a scientist's life. The required system needed to already contain the potential of preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author ID marketplace revealed a high risk of additional workload for the researchers themselves or the administration, because individuals need to register an ID for themselves or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they have high-quality catalogs with person identities already available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to reach a success rate of 60% of all scientists at the first matching. An additional advantage is that librarians can finalize the identity system in a kind of background process.
The Kiel Data Management Infrastructure initiated a web service

  3. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing, and interoperability can enable management of different poses or modes, one of which may be stair climbing.

  4. Public Key Infrastructure (PKI) Interoperability: A Security Services Approach to Support Transfer of Trust

    National Research Council Canada - National Science Library

    Hansen, Anthony

    1999-01-01

    Public key infrastructure (PKI) technology is at a primitive stage characterized by deployment of PKIs that are engineered to support the provision of security services within individual enterprises, and are not able to support...

  5. Investigation of Automated Terminal Interoperability Test

    OpenAIRE

    Brammer, Niklas

    2008-01-01

    In order to develop and secure the functionality of its cellular communications systems, Ericsson deals with numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Therefore Ericsson co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability developmental testing (IoDT) laboratories in Linköping to test their developmental products and prototypes in o...

  6. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  7. Toward an Interoperability Architecture

    National Research Council Canada - National Science Library

    Buddenberg, Rex

    2001-01-01

    .... The continued burgeoning of the Internet constitutes an existence proof. But a common networking base is insufficient to reach a goal of cross-system interoperability - the large information system...

  8. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  9. Hydrogen for buses in London: A scenario analysis of changes over time in refuelling infrastructure costs

    International Nuclear Information System (INIS)

    Shayegan, S.; Pearson, P.J.G.; Hart, D.

    2009-01-01

    The lack of a hydrogen refuelling infrastructure is one of the major obstacles to the introduction of hydrogen vehicles to the road transport market. To help overcome this hurdle, a likely transitional solution is to introduce hydrogen for niche applications such as buses or other types of fleet vehicle for which fuel demand is predictable and localised. This paper analyses the costs of different hydrogen production-delivery pathways via a case study of buses in London. Scenario analysis over time (2007-2025) is used to investigate potential changes in the cost of hydrogen as a result of technology development, growing demand for hydrogen and changes in energy prices (gas and electricity). It is found that factors related to hydrogen demand have the greatest effect on the unit cost of hydrogen, while for the whole of the analysis period on-site SMR (steam methane reforming) remains the least-cost production-delivery pathway. (author)
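
    Why demand dominates the unit cost can be illustrated with a simple levelised-cost calculation: the annualised capital and fixed operating costs of a refuelling station are spread over the kilograms delivered. All figures below are hypothetical and are not taken from the paper.

```python
def unit_cost(capex, lifetime_years, rate, opex_per_year,
              feedstock_cost_per_kg, demand_kg_per_year):
    """Levelised cost per kg of hydrogen for one refuelling pathway."""
    # Capital recovery factor annualises the up-front investment.
    crf = (rate * (1 + rate) ** lifetime_years
           / ((1 + rate) ** lifetime_years - 1))
    fixed = capex * crf + opex_per_year
    return fixed / demand_kg_per_year + feedstock_cost_per_kg

# Same hypothetical station, two demand levels (kg H2 per year).
low_demand  = unit_cost(3.0e6, 15, 0.08, 150_000, 2.0, 100_000)
high_demand = unit_cost(3.0e6, 15, 0.08, 150_000, 2.0, 400_000)
print(f"{low_demand:.2f} vs {high_demand:.2f} per kg")  # demand up, cost down
```

    With these invented numbers, quadrupling demand roughly halves the unit cost, which is the mechanism behind the paper's finding that demand-related factors dominate.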

  10. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite the improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they enjoy the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency
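A minimal sketch of the mediation pattern an EMERGEL-style common ontology enables: each system maps its local vocabulary to shared concepts rather than directly to every partner's terms, so adding a new EMS requires one mapping, not one per partner. The vocabularies and concept IDs below are invented.

```python
# Invented local vocabularies mapped through a shared concept layer,
# in the spirit of EMERGEL/DISASTER (not the project's actual mappings).

DE_TO_SHARED = {"Loeschfahrzeug": "emergel:FireEngine",
                "Rettungswagen": "emergel:Ambulance"}
ES_FROM_SHARED = {"emergel:FireEngine": "camión de bomberos",
                  "emergel:Ambulance": "ambulancia"}

def mediate(term, to_shared, from_shared):
    """Translate a local term via the shared ontology concept."""
    concept = to_shared[term]        # lift into the common ontology
    return from_shared[concept]      # lower into the target vocabulary

print(mediate("Rettungswagen", DE_TO_SHARED, ES_FROM_SHARED))  # → ambulancia
```

With n systems, the shared layer needs n mappings instead of n·(n-1) pairwise translation tables, which is the usual argument for a hub ontology.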

  11. Evaluating the Organizational Interoperability Maturity Level in ICT Research Center

    Directory of Open Access Journals (Sweden)

    Manijeh Haghighinasab

    2011-03-01

Full Text Available Interoperability refers to the ability to provide services to and accept services from other systems or devices. Collaborative enterprises face additional challenges in interoperating seamlessly within a networked organization. The major task here is to assess the maturity level of the interoperating organizations. For this purpose, maturity models for the enterprise were reviewed with respect to vendors' reliability and their advantages versus disadvantages. An interoperability maturity model was derived from the ATHENA project (a European Integrated Project, 2005); this model, named EIMM, was applied at the Iran Information and Communication Institute, a leading telecommunication organization. 115 questionnaires were distributed among the staff of four departments (Information Technology, Communication Technology, Security, and Strategic Studies) covering six areas of concern: Enterprise Modeling, Business Strategy Process, Organization and Competences, Products and Services, Systems and Technology, and Legal Environment, Security and Trust, at five maturity levels: Performed, Modeled, Integrated, Interoperable and Optimizing. The findings showed different levels of maturity across the Institute. To reach the Interoperable level, appropriate practices are proposed for promotion to the higher levels.
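One plausible way to aggregate an EIMM-style assessment is a weakest-link rating: each area of concern gets a level 1-5 and the overall rating is the least mature area, since interoperability fails at the weakest link. The scores below are invented, not the Institute's results, and the aggregation rule is my reading, not the paper's.

```python
# Hypothetical EIMM-style scoring; all numbers are invented.
LEVELS = ["Performed", "Modeled", "Integrated", "Interoperable", "Optimizing"]

area_scores = {"Enterprise Modeling": 3, "Business Strategy Process": 2,
               "Organization and Competences": 3, "Products and Services": 4,
               "Systems and Technology": 3,
               "Legal Environment, Security and Trust": 2}

overall = min(area_scores.values())              # weakest-link rating
weakest = [a for a, s in area_scores.items() if s == overall]
print("overall maturity:", LEVELS[overall - 1])
print("areas to promote first:", weakest)
```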

  12. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that we need effective interoperability between IT products and services to build a truly digital society. The Digital Agenda can only be effective if all of its elements and applications are interoperable and based on open standards and platforms. In this context, I propose in this article a specific architecture for developing the Romanian National Interoperability Framework.

  13. A logical approach to semantic interoperability in healthcare.

    Science.gov (United States)

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  14. Modeling Hydrogen Refueling Infrastructure to Support Passenger Vehicles †

    Directory of Open Access Journals (Sweden)

    Matteo Muratori

    2018-05-01

    Full Text Available The year 2014 marked hydrogen fuel cell electric vehicles (FCEVs first becoming commercially available in California, where significant investments are being made to promote the adoption of alternative transportation fuels. A refueling infrastructure network that guarantees adequate coverage and expands in line with vehicle sales is required for FCEVs to be successfully adopted by private customers. In this paper, we provide an overview of modelling methodologies used to project hydrogen refueling infrastructure requirements to support FCEV adoption, and we describe, in detail, the National Renewable Energy Laboratory’s scenario evaluation and regionalization analysis (SERA model. As an example, we use SERA to explore two alternative scenarios of FCEV adoption: one in which FCEV deployment is limited to California and several major cities in the United States; and one in which FCEVs reach widespread adoption, becoming a major option as passenger vehicles across the entire country. Such scenarios can provide guidance and insights for efforts required to deploy the infrastructure supporting transition toward different levels of hydrogen use as a transportation fuel for passenger vehicles in the United States.

  15. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. 
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has
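A toy illustration of the model-driven workflow the abstract describes (model in, middleware out): a UML-like class description is fed through a template to emit data-access classes. The two-class model and the emitted Python API are simplified stand-ins for the UML input and the Codegen component's actual output, which the paper says is application-server middleware.

```python
# Invented two-class "domain model" standing in for a UML Class Diagram.
MODEL = {"Gene": ["symbol", "taxon"], "Protein": ["name", "uniprotId"]}

TEMPLATE = '''class {name}:
    def __init__(self, {args}):
{assigns}
'''

def generate(model):
    """Emit one class per model entry, Codegen-style."""
    out = []
    for name, attrs in model.items():
        assigns = "\n".join(f"        self.{a} = {a}" for a in attrs)
        out.append(TEMPLATE.format(name=name, args=", ".join(attrs),
                                   assigns=assigns))
    return "\n".join(out)

code = generate(MODEL)
namespace = {}
exec(code, namespace)                 # "deploy" the generated classes
g = namespace["Gene"]("TP53", "human")
print(g.symbol)                       # → TP53
```

The point mirrored here is that the API shape is uniform across all classes because it is derived mechanically from the model, which is what gives the caCORE APIs their "consistent syntax" across dozens of classes.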

  16. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments for correctly using those models and better meeting the aforementioned challenges are offered.

  17. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    Science.gov (United States)

    2014-12-01

multijurisdictional, interoperability is a key factor for success. Responses to 9/11, the Oso mudslides in Washington, the Boston Marathon bombing...Continuum 2. Functional Interoperability As demonstrated by the 9/11 attacks, the Oso mudslide in Washington, the Boston Marathon bombing, and other large

  18. Participatory Infrastructuring of Community Energy

    DEFF Research Database (Denmark)

    Capaccioli, Andrea; Poderi, Giacomo; Bettega, Mela

    2016-01-01

Thanks to renewable energies, the decentralized energy system model is becoming more relevant in the production and distribution of energy. This scenario is important for achieving a successful energy transition. This paper presents a reflection on the ongoing experience of infrastructuring a...

  19. Collaborative Access Control For Critical Infrastructures

    Science.gov (United States)

    Baina, Amine; El Kalam, Anas Abou; Deswarte, Yves; Kaaniche, Mohamed

    A critical infrastructure (CI) can fail with various degrees of severity due to physical and logical vulnerabilities. Since many interdependencies exist between CIs, failures can have dramatic consequences on the entire infrastructure. This paper focuses on threats that affect information and communication systems that constitute the critical information infrastructure (CII). A new collaborative access control framework called PolyOrBAC is proposed to address security problems that are specific to CIIs. The framework offers each organization participating in a CII the ability to collaborate with other organizations while maintaining control of its resources and internal security policy. The approach is demonstrated on a practical scenario involving the electrical power grid.
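A simplified sketch of an OrBAC-style rule check in the spirit of PolyOrBAC: each organization keeps its own abstract policy, and cross-organization access is granted only under a negotiated "contract" context, so internal policies are never exposed to partners. The organizations, roles, and contexts are invented, and the real framework mediates access through web services rather than a shared table.

```python
# Invented policy: (organization, role, activity, view) -> required context.
POLICY = {
    ("GridOp", "operator",    "read", "load-data"): "normal",
    ("GridOp", "partner-DSO", "read", "load-data"): "contract-42",
}

def permitted(org, role, activity, view, context):
    """Grant access only if the rule exists and the context matches."""
    return POLICY.get((org, role, activity, view)) == context

# Internal staff access under the normal context:
print(permitted("GridOp", "operator", "read", "load-data", "normal"))
# A partner organization only gets in under the negotiated contract context,
# and only for the activity the contract covers:
print(permitted("GridOp", "partner-DSO", "read",  "load-data", "contract-42"))
print(permitted("GridOp", "partner-DSO", "write", "load-data", "contract-42"))
```

The last call fails because no rule grants the partner a write activity, which is the "maintaining control of its resources" property the abstract emphasizes.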

  20. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    CSIR Research Space (South Africa)

    Khan, ZC

    2013-06-01

    Full Text Available A purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  1. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  2. The Influence of Information Systems Interoperability on Economic Activity in Poland

    Directory of Open Access Journals (Sweden)

    Ganczar Małgorzata

    2017-12-01

Full Text Available In this text, I discuss the capabilities and challenges of information systems interoperability. The anticipated and expected result of interoperability is to improve the provision of public utility services to citizens and companies, by facilitating such services on the basis of a "single window" principle and by reducing the costs incurred by public administrations, companies, and citizens through more efficient provision of public utility services. In the article, the conceptual framework of interoperability is elaborated upon. Moreover, information systems and public registers for entrepreneurs in Poland serve as examples of whether interoperability may be applied and, if so, whether it fulfils its targets with respect to e-Government services for entrepreneurs.

  3. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    Science.gov (United States)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  4. Participatory evaluation of regional light rail scenarios: A Flemish case on sustainable mobility and land-use

    International Nuclear Information System (INIS)

    Vermote, Levi; Macharis, Cathy; Hollevoet, Joachim; Putman, Koen

    2014-01-01

Highlights: • We propose three light rail scenarios, each covering a specific landscape structure to curtail private vehicle-driven urban sprawl in the Flemish rhombus. • We used the participatory multi-actor multi-criteria analysis (MAMCA) to assess the social, economic and environmental impact of alternative light rail scenarios. • We discuss catalyst measures to address the identified drawbacks of the proposed scenarios. - Abstract: Rail transit is generally acknowledged as an alternative transport mode contributing towards sustainable mobility. In addition to minimising negative externalities, rail transit offers sustainable land-use opportunities to integrate transport and spatial planning. The objective of this paper is to determine the impact of integrative light rail scenarios and their ability to curtail private vehicle-driven urban sprawl in the Flemish rhombus. The paper proposes three light rail scenarios: an infrastructural scenario, a tramification scenario, and a spatial rail scenario, each covering a specific landscape structure to reorganise the dispersed spatial environment in Flanders in the long term. We used the participatory multi-actor multi-criteria analysis (MAMCA), which incorporates the objectives of all involved stakeholders, to assess the impact of the scenarios. The infrastructural scenario gained the most support among the involved stakeholders, on the grounds of improved multimodality, enhanced user amenities, reduced implementation costs, moderated greenhouse gas emissions and mitigated infrastructural barrier effects. Despite the merits of the infrastructural scenario in terms of stakeholder objectives, few possibilities are included to elaborate upon sustainable land-use development. In response to the low performance on this assessment criterion, catalyst measures are discussed to support the implementation

  5. Special topic interoperability and EHR: Combining openEHR, SNOMED, IHE, and continua as approaches to interoperability on national ehealth

    DEFF Research Database (Denmark)

    Bestek, M.; Stanimirovi, D.

    2017-01-01

    into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. Methods: The paper represents an in-depth analysis regarding...... the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research...... could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field...

  6. Simulating economic effects of disruptions in the telecommunications infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Cox, Roger Gary; Barton, Dianne Catherine; Reinert, Rhonda K.; Eidson, Eric D.; Schoenwald, David Alan

    2004-01-01

CommAspen is a new agent-based model for simulating the interdependent effects of market decisions and disruptions in the telecommunications infrastructure on other critical infrastructures in the U.S. economy, such as banking and finance and electric power. CommAspen extends and modifies the capabilities of Aspen-EE, an agent-based model previously developed by Sandia National Laboratories to analyze the interdependencies between the electric power system and other critical infrastructures. CommAspen has been tested on a series of scenarios in which the communications network is disrupted due to congestion and outages. Analysis of the scenario results indicates that the communications networks simulated by the model behave as their counterparts do in the real world. Results also show that the model could be used to analyze the economic impact of communications congestion and outages.
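A minimal agent-based sketch of the idea: economic agents complete a transaction only when the communications path between them is available, so degraded link availability shows up directly as lost economic activity. Agent counts, step counts, and link probabilities are invented; CommAspen itself models far richer market behaviour.

```python
import random

def simulate(n_agents, steps, link_up_prob, seed=0):
    """Count completed transactions when comms links are only
    intermittently available (toy model, invented parameters)."""
    rng = random.Random(seed)          # fixed seed -> reproducible runs
    completed = 0
    for _ in range(steps):
        # pick a trading pair (identities don't matter in this toy model)
        buyer, seller = rng.sample(range(n_agents), 2)
        if rng.random() < link_up_prob:    # comms path up this step?
            completed += 1
    return completed

normal = simulate(50, 10_000, link_up_prob=0.99)   # healthy network
outage = simulate(50, 10_000, link_up_prob=0.60)   # congested/disrupted
print("lost transactions:", normal - outage)
```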

  7. Scenarios for remote gas production

    International Nuclear Information System (INIS)

    Tangen, Grethe; Molnvik, Mona J.

    2009-01-01

The amount of natural gas accessible via proven production technology and existing infrastructure is declining. Therefore, smaller and less accessible gas fields are being considered for commercial exploitation. The research project Enabling production of remote gas builds knowledge and technology aimed at developing competitive remote gas production based on floating LNG and chemical gas conversion. In this project, scenarios are used as a basis for directing research on topics that affect the overall design and operation of such plants. Selected research areas are safety, environment, power supply, operability and control. The paper summarises the scenario building process as a common effort among research institutes and industry. Further, it documents four scenarios for the production of remote gas and outlines how the scenarios are applied to establish research strategies and adequate plans in a multidisciplinary project. To ensure the relevance of the scenarios, it is important to adapt the building process to the problem at hand, and the scenarios should be developed with the extensive participation of key personnel.

  8. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability against which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  9. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  10. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA, which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  11. A data infrastructure for the assessment of health care performance: lessons from the BRIDGE-health project.

    Science.gov (United States)

    Bernal-Delgado, Enrique; Estupiñán-Romero, Francisco

    2018-01-01

The integration of different administrative data sources from a number of European countries has been shown to be useful in the assessment of unwarranted variations in health care performance. This essay describes the procedures used to set up a data infrastructure (e.g., data access and exchange, definition of the minimum common wealth of data required, and the development of the relational logic data model) and the methods used to produce trustworthy healthcare performance measurements (e.g., standardisation of ontologies and quality assurance analysis). The paper ends by providing some hints on how to use these lessons in an eventual European infrastructure for public health research and monitoring. Although the relational data infrastructure developed has proven accurate, effective in comparing health system performance across different countries, and efficient enough to deal with hundreds of millions of episodes, the logic data model might not be responsive enough if the European infrastructure aims to include electronic health records and carry out multi-cohort, multi-intervention comparative effectiveness research. The deployment of a distributed infrastructure based on semantic interoperability, where individual data remain in-country and open-access scripts for data management and analysis travel around the hubs composing the infrastructure, might be a sensible way forward.
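The "scripts travel, data stay" pattern proposed in the closing sentence can be sketched as follows: the same analysis script runs locally at each country hub and only aggregate counts cross the border, so individual records never leave the hub. The hub records and the aggregate statistic are invented for illustration.

```python
# Invented per-country record sets; in reality these stay inside each hub.
HUBS = {
    "ES": [{"age": 70, "readmitted": True}, {"age": 64, "readmitted": False}],
    "DK": [{"age": 58, "readmitted": False}, {"age": 81, "readmitted": True},
           {"age": 77, "readmitted": True}],
}

def local_script(records):
    """Runs inside the hub; only aggregates cross the border."""
    return {"n": len(records),
            "readmissions": sum(r["readmitted"] for r in records)}

aggregates = {hub: local_script(data) for hub, data in HUBS.items()}
total = sum(a["readmissions"] for a in aggregates.values())
rate = total / sum(a["n"] for a in aggregates.values())
print(f"pooled readmission rate: {rate:.2f}")   # computed from aggregates only
```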

  12. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  13. COOPEUS - connecting research infrastructures in environmental sciences

    Science.gov (United States)

    Koop-Jakobsen, Ketil; Waldmann, Christoph; Huber, Robert

    2015-04-01

The COOPEUS project was initiated in 2012, bringing together 10 research infrastructures (RIs) in environmental sciences from the EU and US in order to improve the discovery, access, and use of environmental information and data across scientific disciplines and across geographical borders. The COOPEUS mission is to facilitate readily accessible research infrastructure data to advance our understanding of Earth systems through an international community-driven effort, by: bringing together both user communities and top-down directives to address evolving societal and scientific needs; removing technical, scientific, cultural and geopolitical barriers to data use; and coordinating the flow, integrity and preservation of information. A survey of data availability was conducted among the COOPEUS research infrastructures for the purpose of discovering impediments to open international and cross-disciplinary sharing of environmental data. The survey showed that the majority of data offered by the COOPEUS research infrastructures are available via the internet (>90%), but accessibility to these data differs significantly among research infrastructures; only 45% offer open access to their data, whereas the remaining infrastructures restrict access, e.g. they do not release raw or sensitive data, demand user registration, or require permission prior to the release of data. These rules and regulations are often installed as a form of standard practice, whereas formal data policies are lacking in 40% of the infrastructures, primarily in the EU. In order to improve this situation, COOPEUS has established a common data-sharing policy agreed upon by all the COOPEUS research infrastructures. To investigate the existing opportunities for improving interoperability among environmental research infrastructures, COOPEUS explored the opportunities of the GEOSS common infrastructure (GCI) by holding a hands-on workshop. Through exercises directly registering resources

  14. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K [ORNL; Sheldon, Frederick T. [University of Idaho

    2015-01-01

Cyber physical computing infrastructures typically consist of a number of interconnected sites. Their operation depends critically on both cyber and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses from the NESCOR Working Group study. From the Section 5 electric sector representative failure scenarios, we extracted four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability). These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber physical infrastructure network with respect to CIA.
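A toy attacker/defender matrix game illustrating the game-theoretic core of the ABGT approach: the payoffs are invented, and pure-strategy Nash equilibria are found by enumerating mutual best responses (the paper's simulations are far richer, with multiple agents and dynamic play).

```python
# payoff[d][a] = (defender_utility, attacker_utility); all values invented.
payoff = [
    [(2, 1), (0, 2)],   # defender hardens integrity controls
    [(1, 0), (3, 1)],   # defender hardens availability controls
]

def pure_nash(payoff):
    """Enumerate cells where neither player can gain by deviating alone."""
    eq = []
    for d in range(len(payoff)):
        for a in range(len(payoff[0])):
            best_d = all(payoff[d][a][0] >= payoff[d2][a][0]
                         for d2 in range(len(payoff)))
            best_a = all(payoff[d][a][1] >= payoff[d][a2][1]
                         for a2 in range(len(payoff[0])))
            if best_d and best_a:
                eq.append((d, a))
    return eq

print(pure_nash(payoff))   # → [(1, 1)]
```

Here the single equilibrium has the defender hardening availability while the attacker targets it anyway; changing a payoff cell shifts or removes the equilibrium, which is the kind of sensitivity such an analysis probes.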

  15. An Ontological Solution to Support Interoperability in the Textile Industry

    Science.gov (United States)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  16. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    Science.gov (United States)

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
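For readers unfamiliar with the reported statistics: an odds ratio compares the odds of the outcome (interoperable systems) with and without a factor. The 2x2 counts below are invented, not the 2015 assessment data; the paper's AORs additionally adjust for covariates via multivariable logistic regression, but the building block is the same odds comparison.

```python
def odds_ratio(a, b, c, d):
    """a, b: outcome yes/no with the factor present;
    c, d: outcome yes/no with the factor absent."""
    return (a / b) / (c / d)

# Hypothetical counts: LHDs with vs without leadership support.
or_leadership = odds_ratio(a=90, b=30, c=60, d=72)
print(f"odds ratio: {or_leadership:.2f}")   # → 3.60
```

An odds ratio above 1 (like the reported AOR of 3.54 for leadership support) means the factor is associated with higher odds of interoperability; it does not by itself establish causation.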

  17. Geographic Hotspots of Critical National Infrastructure.

    Science.gov (United States)

    Thacker, Scott; Barr, Stuart; Pant, Raghav; Hall, Jim W; Alderson, David

    2017-12-01

    Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location. © 2017 Society for Risk Analysis.
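    The kernel density step described above turns spatially discrete criticality values into a continuous surface. A minimal sketch of a weighted Gaussian KDE, in one dimension for brevity (the study itself works over 2-D geographic space, and the numbers here are invented):

```python
import math

def gaussian_kde(points, weights, x, bandwidth=1.0):
    """Weighted Gaussian kernel density estimate at position x.

    points  -- asset locations (1-D here for brevity)
    weights -- criticality values, e.g. number of users disrupted
    """
    total = 0.0
    for p, w in zip(points, weights):
        u = (x - p) / bandwidth
        total += w * math.exp(-0.5 * u * u)
    return total / (bandwidth * math.sqrt(2 * math.pi))

# Two clustered high-criticality assets produce a local density peak
# (a "hotspot") relative to an isolated low-criticality asset.
assets, users = [0.0, 0.5, 10.0], [1000, 1200, 50]
print(gaussian_kde(assets, users, 0.25) > gaussian_kde(assets, users, 10.0))  # True
```

    Statistical significance of hotspots would then be assessed against the density expected under a suitable null model, which this sketch omits.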

  18. Scientific Digital Libraries, Interoperability, and Ontologies

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  19. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.

  20. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  1. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    … be limited. Fourth, data protection "by design" would be distinguished from data protection "by default". Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers' and processors' duties, on supervisory authorities and on sanctions would be introduced. … Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions of direct relevance for the project and Work Package 5 will be analysed here.

  2. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software. …

  3. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

    Scholl , Hans ,; Kubicek , Herbert; Cimander , Ralf

    2011-01-01

    Part 4: Architecture, Security and Interoperability; International audience; Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  4. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    Generic rights expression language allowing interoperability across different computing environments including resource usage of different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, including areas free of standards that necessitate choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.

  5. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
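    The contrast this abstract draws between a controlled vocabulary and an ontology can be sketched as follows; the terms and relations are invented for illustration and are not actual PDS4 classes:

```python
# A flat controlled vocabulary gives systems shared terms only. A minimal
# ontology fragment additionally records typed relationships between terms,
# supplying the semantic context the abstract describes.

controlled_vocabulary = {"Instrument", "Observation", "Target"}

# Triples of (subject, relation, object) -- e.g. "an Observation is made
# *by* an Instrument and is *of* a Target".
ontology = {
    ("Observation", "made_by", "Instrument"),
    ("Observation", "of", "Target"),
}

def related_terms(term):
    """Terms semantically linked to `term` -- not expressible with a bare
    vocabulary, which carries no relations at all."""
    return {s if o == term else o
            for s, _, o in ontology if term in (s, o)}

print(sorted(related_terms("Observation")))  # ['Instrument', 'Target']
```

    A consumer holding only the vocabulary can validate terms; one holding the ontology can also reason about how the terms relate.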

  6. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  7. The DFG Viewer for Interoperability in Germany

    Directory of Open Access Journals (Sweden)

    Ralf Goebel

    2010-02-01

    This article deals with the DFG Viewer for Interoperability, a free and open-source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First, the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background. It introduces the data formats and standards used, briefly illustrates how the viewer works, and includes a few examples.

  8. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    Science.gov (United States)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (presumably, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of the two interoperating systems. Thus, it establishes links between the research of interoperability of systems and intelligent software agents, as one of the systems' digital identities.
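    The sense-perceive-decide-act chain defined above can be sketched as a pipeline over an incoming message. Every stage here is a toy stand-in for illustration, not part of the paper's formal definition:

```python
import json

def sense(raw_bytes):            # receive a stimulus from the environment
    return raw_bytes.decode("utf-8")

def perceive(text):              # assign structure/meaning to the stimulus
    return json.loads(text)

def decide(message):             # make an informed decision about it
    return "ack" if message.get("type") == "request" else "ignore"

def act(decision):               # articulate a meaningful response
    return json.dumps({"response": decision})

stimulus = b'{"type": "request", "payload": 42}'
print(act(decide(perceive(sense(stimulus)))))  # {"response": "ack"}
```

    The point of the anthropomorphic view is that each stage can fail independently: a system may sense a message yet fail to perceive it (unknown format), or perceive it yet be unable to decide (unknown semantics).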

  9. Securing military information systems on public infrastructure

    CSIR Research Space (South Africa)

    Botha, P

    2015-03-01

    … to set up in time for scenarios which require real-time information. This may force communications to utilise public infrastructure. Securing communications for military mobile and web-based systems over public networks poses a greater challenge compared…

  10. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to the interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics…

  11. Testing Realistic Disaster Scenarios for Space Weather: The Economic Impacts of Electricity Transmission Infrastructure Failure in the UK

    Science.gov (United States)

    Gibbs, M.; Oughton, E. J.; Hapgood, M. A.

    2017-12-01

    The socio-economic impacts of space weather have been under-researched, despite this threat featuring on the UK's National Risk Register. In this paper, a range of Realistic Disaster Scenarios due to failure in electricity transmission infrastructure are tested. We use regional Geomagnetically Induced Current (GIC) studies to identify areas in the UK high-voltage power system deemed to be high-risk. The potential level of disruption arising from a large geomagnetic disturbance in these "hot spots" on local economic activity is explored. Electricity is a necessary factor of production without which businesses cannot operate, so even short-term power loss can cause significant loss of value. We utilise a spatially disaggregated approach that focuses on quantifying employment disruption by industrial sector, and relating this to direct Gross Value Added loss. We then aggregate this direct loss into a set of shocks to undertake macroeconomic modelling of different scenarios, to obtain the total economic impact which includes both direct and indirect supply chain disruption effects. These results are reported for a range of temporal periods, with the minimum increment being a one-hour blackout. This work furthers our understanding of the economic impacts of space weather, and can inform future reviews of the UK's National Risk Register. The key contribution of the paper is that the results can be used in the future cost-benefit analysis of investment in space weather forecasting.
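    The direct-loss step of the approach above amounts to summing, over industrial sectors, disrupted employment times value added per worker for the duration of the outage. A back-of-the-envelope sketch with invented figures:

```python
# Toy version of a spatially disaggregated direct loss estimate: direct
# Gross Value Added (GVA) loss from an outage is the sum, over sectors, of
# disrupted workers times GVA per worker-hour times outage duration.
# All figures below are invented for illustration.

def direct_gva_loss(sectors, outage_hours):
    """sectors: list of (workers_disrupted, gva_per_worker_hour)."""
    return sum(w * g for w, g in sectors) * outage_hours

sectors = [
    (12_000, 35.0),   # manufacturing
    (30_000, 28.0),   # services
    (2_500, 55.0),    # finance
]
print(direct_gva_loss(sectors, outage_hours=1))  # 1397500.0 for a one-hour blackout
```

    The paper's total impact additionally captures indirect supply chain effects, which require macroeconomic modelling rather than this simple summation.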

  12. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  13. Global Land Transport Infrastructure Requirements

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-06-01

    Over the next four decades, global passenger and freight travel is expected to double over 2010 levels. In order to accommodate this growth, it is expected that the world will need to add nearly 25 million paved road lane-kilometres and 335 000 rail track kilometres. In addition, it is expected that between 45 000 square kilometres and 77 000 square kilometres of new parking spaces will be added to accommodate vehicle stock growth. These land transport infrastructure additions, when combined with operations, maintenance and repairs, are expected to cost as much as USD 45 trillion by 2050. This publication reports on the International Energy Agency’s (IEA) analysis of infrastructure requirements to support projected road and rail travel through 2050, using the IEA Mobility Model. It considers land transport infrastructure additions to support travel growth to 2050. It also considers potential savings if countries pursue “avoid and shift” policies: in this scenario, cumulative global land transport infrastructure spending could decrease as much as USD 20 trillion by 2050 over baseline projections.

  14. Developing multinational radioactive waste repositories: Infrastructural framework and scenarios of cooperation

    International Nuclear Information System (INIS)

    2004-10-01

    Currently the management of radioactive wastes centres on national strategies for collection, treatment, interim storage and disposal. This tendency to focus exclusively on national strategies reflects the fact that radioactive waste is a sensitive political issue, making cooperation among countries difficult. It is consistent with the accepted principle that a country that enjoys the benefit of nuclear energy, or the utilization of nuclear technology, should also take full responsibility for managing the generated radioactive waste. However, there are countries whose radioactive waste volumes do not easily justify a national repository, and/or countries that do not have the resources or favourable natural conditions for waste disposal to dedicate to a national repository project, or that would prefer to collaborate in shared initiatives because of their economic advantages. In such cases it may be appropriate for these countries to engage in a multinational collaborative effort to ensure that they have access to a common repository, so that they can fulfil their responsibilities for managing their wastes safely. In response to requests from several Member States expressing an interest in multinational disposal options, the IAEA produced in 1998 a TECDOC outlining the important factors to be taken into account in the process of realizing such options. These factors include, for example, technical (safety), institutional (legal, safeguards), economic (financial), socio-political (public acceptance) and ethical considerations. The present report reviews the work done in the previous study, taking into account developments since its publication as well as current activities in the field of multinational repositories. The report attempts to define the concepts involved in the creation of multinational repositories, to explore the likely scenarios, to examine the conditions for successful implementation, and to point out the benefits and challenges inherent to…

  15. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine.

    Science.gov (United States)

    King, H Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2010-05-07

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.
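    A fixed-layout state message is one way such a network data specification might be realized, so that heterogeneous master and slave systems can exchange state without sharing code. The field layout below is an assumption for illustration, not the actual Interoperable Telerobotics Protocol wire format:

```python
# Hypothetical teleoperation state message: a sequence number plus a 3-D
# slave position, packed in network byte order. Any implementation that
# agrees on this layout can interoperate, regardless of platform.

import struct

FMT = "!I3d"  # uint32 sequence number, three float64 coordinates, big-endian

def pack_state(seq, x, y, z):
    return struct.pack(FMT, seq, x, y, z)

def unpack_state(payload):
    return struct.unpack(FMT, payload)

msg = pack_state(7, 0.10, -0.25, 0.03)
print(len(msg), unpack_state(msg))  # 28 (7, 0.1, -0.25, 0.03)
```

    Fixing the byte order in the format string is what makes the message portable across architectures; host-order packing would not be.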

  16. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    Achieving interoperability is very important in a website-based system. The use of databases based on MySQL, SQL Server or Oracle is already very common in website-based systems. However, using such databases cannot guarantee that the interoperability of the system is achieved. Apart from data security, the system is also quite difficult to implement. One solution for achieving interoperability in a website-based system is…

  17. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    … on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data.

  18. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  19. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of…

  20. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve McKeever

    2015-05-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years have seen the rise of very cheap digital storage, both on and off site. With the advent of the 'Internet of Things', people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  1. The role of markup for enabling interoperability in health informatics.

    Science.gov (United States)

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage, both on and off site. With the advent of the Internet of Things, people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early-stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.
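    The syntactic level of interoperability that markup enables can be sketched with a minimal XML round trip: two systems that agree on an element layout can exchange a reading without sharing any code. The element names below are invented for this sketch, not drawn from any health-informatics standard:

```python
import xml.etree.ElementTree as ET

def serialize(patient_id, metric, value, unit):
    # Producer side: emit a reading as markup.
    obs = ET.Element("observation", {"patient": patient_id})
    ET.SubElement(obs, "metric").text = metric
    ET.SubElement(obs, "value", {"unit": unit}).text = str(value)
    return ET.tostring(obs, encoding="unicode")

def parse(xml_text):
    # Consumer side: recover the reading from the agreed layout.
    obs = ET.fromstring(xml_text)
    return {
        "patient": obs.get("patient"),
        "metric": obs.findtext("metric"),
        "value": float(obs.find("value").text),
        "unit": obs.find("value").get("unit"),
    }

doc = serialize("p-001", "heart_rate", 72, "bpm")
print(parse(doc)["value"])  # 72.0
```

    This is syntactic interoperability only; semantic interoperability additionally requires that both parties attach the same meaning to "heart_rate" and "bpm", which is where shared terminologies come in.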

  2. A Spatial Data Infrastructure for Environmental Noise Data in Europe.

    Science.gov (United States)

    Abramic, Andrej; Kotsev, Alexander; Cetl, Vlado; Kephalopoulos, Stylianos; Paviotti, Marco

    2017-07-06

    Access to high quality data is essential in order to better understand the environmental and health impact of noise in an increasingly urbanised world. This paper analyses how recent developments of spatial data infrastructures in Europe can significantly improve the utilization of data and streamline reporting on a pan-European scale. The Infrastructure for Spatial Information in the European Community (INSPIRE), and Environmental Noise Directive (END) described in this manuscript provide principles for data management that, once applied, would lead to a better understanding of the state of environmental noise. Furthermore, shared, harmonised and easily discoverable environmental spatial data, required by the INSPIRE, would also support the data collection needed for the assessment and development of strategic noise maps. Action plans designed by the EU Member States to reduce noise and mitigate related effects can be shared to the public through already established nodes of the European spatial data infrastructure. Finally, data flows regarding reporting on the state of environment and END implementation to the European level can benefit by applying a decentralised e-reporting service oriented infrastructure. This would allow reported data to be maintained, frequently updated and enable pooling of information from/to other relevant and interrelated domains such as air quality, transportation, human health, population, marine environment or biodiversity. We describe those processes and provide a use case in which noise data from two neighbouring European countries are mapped to common data specifications, defined by INSPIRE, thus ensuring interoperability and harmonisation.
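    The harmonisation this abstract describes, mapping national datasets to a common data specification, can be sketched as simple field-level transforms. The schemas on both sides are invented for illustration, not actual INSPIRE or END fields:

```python
# Two national noise datasets with different field names are mapped onto one
# common schema, so downstream consumers need to understand only one layout.

COMMON_FIELDS = ("location", "lden_db", "source_type")

def from_country_a(rec):
    return {"location": rec["site"], "lden_db": rec["Lden"],
            "source_type": rec["src"]}

def from_country_b(rec):
    return {"location": rec["station_name"], "lden_db": rec["noise_level_db"],
            "source_type": rec["category"]}

harmonised = [
    from_country_a({"site": "A12 motorway", "Lden": 68.0, "src": "road"}),
    from_country_b({"station_name": "Hbf yard", "noise_level_db": 71.5,
                    "category": "rail"}),
]
# Every record now satisfies the same schema, regardless of origin.
print(all(tuple(r) == COMMON_FIELDS for r in harmonised))  # True
```

    In the real infrastructure the common schema is fixed by the INSPIRE data specifications and the mapping is done once per national dataset, which is what makes cross-border noise maps comparable.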

  3. BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel

    Science.gov (United States)

    Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco

    2017-10-01

    Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by the incentives of Public Administrations and European Directives that aim to improve efficiency and to enable better management of the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information, via interoperability, for FEM analysis.

  4. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service promoting a standards-centric approach to the interoperability challenges that organizations face today, and those challenges are many. One approach, mandating the use of specific standards, has been reasonably successful. But scientific communities, like all others, ultimately want their solutions to be widely accepted and used, and to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that can take a long time to come to fruition, and the results therefore sometimes fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward-looking and should adopt the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, that are built on fundamental "open" principles, and that align with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new: the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted.

  5. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  6. Risk Management Considerations for Interoperable Acquisition

    National Research Council Canada - National Science Library

    Meyers, B. C

    2006-01-01

    .... The state of risk management practice -- the specification of standards and the methodologies to implement them -- is addressed and examined with respect to the needs of system-of-systems interoperability...

  7. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    The first in a series of studies that focus on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  8. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as the sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules can be used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
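
    The contextualising rules described in this abstract can be pictured as simple mappings from metadata elements to context states. The sketch below is a minimal, hypothetical illustration of that idea; the field names, thresholds and context labels are invented for this example and are not taken from the paper's actual model.

    ```python
    # Hedged sketch: rule-based inference of the four context types (sensing,
    # node, network, organisational) from WSN metadata. All keys and thresholds
    # here are illustrative assumptions, not the paper's published rules.

    def infer_contexts(metadata):
        """Map raw metadata elements to the four context types named above."""
        contexts = {}
        # Sensing context: derived from measurement metadata
        if metadata.get("sampling_rate_hz", 0) < 0.1:
            contexts["sensing"] = "low-frequency"
        else:
            contexts["sensing"] = "high-frequency"
        # Node context: battery level drives node status
        contexts["node"] = "degraded" if metadata.get("battery_pct", 100) < 20 else "nominal"
        # Network context: packet loss as a proxy for link quality
        contexts["network"] = "unstable" if metadata.get("packet_loss_pct", 0) > 10 else "stable"
        # Organisational context: who operates the deployment
        contexts["organisational"] = metadata.get("operator", "unknown")
        return contexts

    meta = {"sampling_rate_hz": 0.05, "battery_pct": 15, "packet_loss_pct": 2, "operator": "agency-A"}
    print(infer_contexts(meta))
    ```

    A bridging rule, in this picture, would then translate a context state produced by one WSN into the vocabulary expected by another, which is where the dynamic-interoperability decision making comes in.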

  9. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage conditions such as heart disease and diabetes by sending health parameters like blood pressure, heart rate, glucose level, temperature, weight and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here, different medical devices are connected to the patient for monitoring, each manufactured by a different vendor, and each requiring its own installation and network design. This causes design complexity and network overhead when moving patients for diagnostic examinations, a problem that can be solved by interoperability among devices. ISO/IEEE 11073 is an international standard that provides an interoperable hospital information system solution for medical devices. One such integrated environment requiring the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.

  10. SERENITY in e-Business and Smart Item Scenarios

    Science.gov (United States)

    Benameur, Azzedine; Khoury, Paul El; Seguran, Magali; Sinha, Smriti Kumar

    SERENITY artefacts, such as S&D Classes, Patterns, Implementations and Executable Components for Security & Dependability (S&D), in addition to the Serenity Runtime Framework (SRF), were discussed in previous chapters. Here we discuss, through two scenarios, how to integrate these artefacts with applications in the SERENITY approach. The e-Business scenario is a standard loan origination process in a bank. The Smart Item scenario is an ambient intelligence case study in which we take advantage of Smart Items to provide an electronic healthcare infrastructure for remote healthcare assistance. In both cases, we detail how the prototype implementations of the scenarios select the proper executable components through the Serenity Runtime Framework, and then demonstrate how these executable components of the S&D Patterns are deployed.

  11. Building climate change into infrastructure codes and standards

    International Nuclear Information System (INIS)

    Auld, H.; Klaasen, J.; Morris, R.; Fernandez, S.; MacIver, D.; Bernstein, D.

    2009-01-01

    Building codes and standards, and the climatic design values embedded within these legal to semi-legal documents, have profound safety, health and economic implications for Canada's infrastructure systems. The climatic design values that have been used for the design of almost all of today's more than $5.5 trillion in infrastructure are based on historical climate data and assume that the extremes of the past will represent future conditions. Since new infrastructure based on codes and standards will be built to survive for decades to come, it is critically important that existing climatic design information be as accurate and up-to-date as possible, that the changing climate be monitored to detect and highlight vulnerabilities of existing infrastructure, that forensic studies of climate-related failures be undertaken, and that codes and standards processes incorporate future climates and extremes as much as possible. Uncertainties in the current climate change models and their scenarios challenge our ability to project future extremes regionally and locally. Improvements to the spatial and temporal resolution of these climate change scenarios, along with improved methodologies to treat model biases and localize results, will allow future codes and standards to better reflect the extremes and weathering conditions expected over the lifespan of structures. In the meantime, other information and code processes can be used to incorporate changing climate conditions into upcoming infrastructure codes and standards, to “bridge” the model uncertainty gap and to complement existing projections. This presentation will outline some of the varied information and processes that will be used to incorporate climate change adaptation into the next development cycle of the National Building Code of Canada and numerous other national CSA infrastructure standards. (author)

  12. The National Flood Interoperability Experiment: Bridging Research and Operations

    Science.gov (United States)

    Salas, F. R.

    2015-12-01

    The National Weather Service's new National Water Center, located on the University of Alabama campus in Tuscaloosa, will become the nation's hub for comprehensive water resources forecasting. In conjunction with its federal partners the US Geological Survey, Army Corps of Engineers and Federal Emergency Management Agency, the National Weather Service will operationally support both short-term flood prediction and long-term seasonal forecasting of water resource conditions. By summer 2016, the National Water Center will begin evaluating four streamflow data products at the scale of the NHDPlus river reaches (approximately 2.67 million). In preparation for the release of these products, from September 2014 to August 2015, the National Weather Service partnered with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. to support the National Flood Interoperability Experiment, which included a seven-week in-residence Summer Institute in Tuscaloosa for university students interested in learning about operational hydrology and flood forecasting. As part of the experiment, 15-hour forecasts from the operational High Resolution Rapid Refresh atmospheric model were used to drive a three-kilometer Noah-MP land surface model loosely coupled to a RAPID river routing model operating on the NHDPlus dataset. This workflow was run every three hours during the Summer Institute, and the results were made available to participants pursuing a range of research topics focused on flood forecasting (e.g. reservoir operations, ensemble forecasting, probabilistic flood inundation mapping, rainfall product evaluation, etc.). Although the National Flood Interoperability Experiment was finite in length, it provided a platform through which the academic community could engage federal agencies, and vice versa, to narrow the gap between research and operations and to demonstrate how state-of-the-art research infrastructure, models, services and datasets could be utilized

  13. Visual Development Environment for Semantically Interoperable Smart Cities Applications

    OpenAIRE

    Roukounaki , Aikaterini; Soldatos , John; Petrolo , Riccardo; Loscri , Valeria; Mitton , Nathalie; Serrano , Martin

    2015-01-01

    International audience; This paper presents an IoT architecture for the semantic interoperability of diverse IoT systems and applications in smart cities. The architecture virtualizes diverse IoT systems and ensures their modelling and representation according to common standards-based IoT ontologies. Furthermore, based on this architecture, the paper introduces a first-of-a-kind visual development environment which eases the development of semantically interoperable applications in smart cit...

  14. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  15. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  16. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ... efforts and/or through modifications to the Commission's technical rules or other regulatory measures. The... regulatory measures. \\1\\ The Commission has a longstanding interest in promoting the interoperability of... standards for Long-Term Evolution (LTE) wireless broadband technology are developed by the 3rd Generation...

  17. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    Science.gov (United States)

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  18. Infrastructural consequences of the use of various energy sources in the Netherlands

    International Nuclear Information System (INIS)

    Ham, P.J. van der; Hoffman, R.M.; Reckman, E.; Wegenwijs, F.W.

    1984-01-01

    In the framework of the Public Discussion Energy Policy in the Netherlands, two rather divergent energy scenarios have been proposed: the Industrial Recuperation Scenario and the Energy Saving Scenario. In this report, city and country planning aspects of energy scenarios are considered, using the above-mentioned scenarios as a frame of reference. Infrastructural consequences of energy options like coal, uranium, wind, and combined heat-electricity generation, especially those of coal and nuclear power, are discussed. A comparative evaluation is made of various siting plans for nuclear plants. (G.J.P.)

  19. Spatial policy, planning and infrastructure investment: Lessons from ...

    African Journals Online (AJOL)

    Dr Louis J. Waldeck, Manager, Urban Dynamics Laboratory, CSIR Built ... funded Integrated Planning and Development Modelling (IPDM) project, the article ... areas ought to be grounded in robust and rigorous analysis and scenario evaluation. ... Partnership Infrastructure Grants ... in water supply and regional bulk.

  20. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results on the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.

  1. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  2. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross-border trading between operators. To that effect, the European association EASEE-gas is drawing up standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status of issues such as barriers to gas trade in Europe, the rules and procedures under preparation by EASEE-gas, and the schedule for implementation of these rules by operators. This article gathers five presentations on this topic given at the gas conference

  3. Benchmarking infrastructure for mutation text mining.

    Science.gov (United States)

    Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo

    2014-02-25

    Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
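
    The infrastructure described above computes performance metrics with SPARQL queries over RDF annotations. The plain-Python sketch below shows the same precision/recall/F1 computation for mutation mentions, as one way to picture what those queries calculate; the document IDs and mutation strings are invented for illustration.

    ```python
    # Hedged sketch: precision, recall and F1 over sets of (doc_id, mutation)
    # annotation pairs, mirroring the metrics the SPARQL queries would compute.
    # The gold and predicted annotations below are made-up examples.

    def prf1(gold, predicted):
        """Compare a predicted annotation set against a gold standard."""
        tp = len(gold & predicted)  # true positives: present in both sets
        precision = tp / len(predicted) if predicted else 0.0
        recall = tp / len(gold) if gold else 0.0
        f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
        return precision, recall, f1

    gold = {("doc1", "E545K"), ("doc1", "H1047R"), ("doc2", "V600E")}
    pred = {("doc1", "E545K"), ("doc2", "V600E"), ("doc2", "T790M")}
    p, r, f = prf1(gold, pred)
    print(f"P={p:.2f} R={r:.2f} F1={f:.2f}")  # → P=0.67 R=0.67 F1=0.67
    ```

    Representing the annotations in RDF rather than as Python sets is what lets the benchmarking infrastructure express this comparison declaratively, so that, as the abstract notes, no programming is needed to analyze a system's results.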

  4. Benchmarking infrastructure for mutation text mining

    Science.gov (United States)

    2014-01-01

    Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600

  5. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  6. The future of gas infrastructures in Eurasia

    International Nuclear Information System (INIS)

    Klaassen, Ger; McDonald, Alan; Jimin Zhao

    2001-01-01

    The IIASA-WEC study Global Energy Perspectives emphasized trends toward cleaner, more flexible, and more convenient final energy forms, delivered chiefly by energy grids, and noted potential energy infrastructure deficiencies in Eurasia. We compare planned interregional gas pipelines and LNG terminals in Eurasia with the study's projected trade flows for 2020. We focus on the study's three high-growth scenarios and single middle course scenario. The comparison indicates that high gas consumption in a scenario need not imply high gas trade. For the former Soviet Union, a robust strategy across all six scenarios is to implement existing plans and proposals for expanding gas export capacity. For Eastern Europe, significant import capacity expansions beyond current plans and proposals are needed in all but the middle course scenario. Western European plans and proposals need to be increased only in two high gas consumption scenarios. Planned and proposed capacities for the Middle East (exports) and centrally planned Asia (imports) most closely match a high gas trade scenario, but are otherwise excessive. Paradoxically, for the Pacific OECD, more short-term import capacity is needed in scenarios with low gas consumption than in high-consumption scenarios. For Southeast Asia, proposed import capacities are significantly higher than scenario trade projections. (Author)

  7. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    Science.gov (United States)

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem addressed is whether the International Organization for Standardization (ISO) 9000 principles relate to the interoperability problems encountered in implementing EHR systems. The purpose of the nonexperimental quantitative research…

  8. An infrastructure with a unified control plane to integrate IP into optical metro networks to provide flexible and intelligent bandwidth on demand for cloud computing

    Science.gov (United States)

    Yang, Wei; Hall, Trevor

    2012-12-01

    The Internet is entering an era of cloud computing that promises more cost-effective, eco-friendly and reliable services to consumer and business users, and the nature of Internet traffic will undergo a fundamental transformation. Consequently, the current Internet will no longer suffice for serving cloud traffic in metro areas. This work proposes an infrastructure with a unified control plane that integrates simple packet aggregation technology with optical express, through the interoperation of IP routers and electrical traffic controllers in optical metro networks. The proposed infrastructure provides flexible, intelligent, and eco-friendly bandwidth on demand for cloud computing in metro areas.

  9. A Guide to Understanding Emerging Interoperability Technologies

    National Research Council Canada - National Science Library

    Bollinger, Terry

    2000-01-01

    .... Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem...

  10. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems and the provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering. · The development of a STEP-based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems, extending inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work: · The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies. · The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface

  11. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    Science.gov (United States)

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  12. Scenario-based approach to risk analysis in support of cyber security

    Energy Technology Data Exchange (ETDEWEB)

    Gertman, D. I.; Folkers, R.; Roberts, J. [Idaho National Laboratory, Roberts and Folkers Associates, LLC, Idaho Falls, ID 83404 (United States)

    2006-07-01

    The US infrastructure is continually challenged by hostile nation states and others who would do us harm. Cyber vulnerabilities and weaknesses are potential targets and are the result of years of construction and technological improvement in a world less concerned with security than is currently the case. As a result, cyber attack presents a class of challenges for which we are just beginning to prepare. What has been done in the nuclear, chemical and energy sectors as a means of anticipating and preparing for randomly occurring accidents and off-normal events is to develop scenarios as a means by which to prioritize and quantify risk and to take action. However, the number of scenarios risk analysts can develop is almost limitless. How do we ascertain which scenario has the greatest merit? One of the more important contributions of probabilistic risk analysis (PRA) has been to quantify the initiating event probability associated with various classes of accidents; and to quantify the occurrence of various conditions, i.e., end-states, as a function of these important accident sequences. Typically, various classes of conditions are represented by scenarios and are quantified in terms of cut sets and binned into end states. For example, the nuclear industry has a well-defined set of initiating events that are studied in assessing risk. The maturation of risk analysis for cyber security from accounting for barriers or looking at conditions statically to one of ascertaining the probability associated with certain events is, in part, dependent upon the adoption of a scenario-based approach. For example, scenarios take into account threats to personnel and public safety; economic damage, and compromises to major operational and safety functions. Scenarios reflect system, equipment, and component configurations as well as key human-system interactions related to event detection, diagnosis, mitigation and restoration of systems. As part of a cyber attack directed toward
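
    The PRA-style quantification sketched in this abstract, multiplying an initiating-event frequency down the branches of an event tree and binning the resulting sequences into end states, can be illustrated in a few lines. The initiating event, branch probabilities and end-state names below are hypothetical, chosen only to show the arithmetic, and are not drawn from any real analysis.

    ```python
    # Hedged sketch: event-tree quantification for a hypothetical cyber intrusion
    # initiating event. All frequencies and probabilities are illustrative.

    def scenario_frequency(initiating_freq, branch_probs):
        """Frequency of one accident sequence: IE frequency times its branch probabilities."""
        freq = initiating_freq
        for p in branch_probs:
            freq *= p
        return freq

    ie_freq = 0.5          # intrusions per year (illustrative)
    p_detect_fail = 0.1    # probability that detection fails
    p_mitigate_fail = 0.2  # probability mitigation fails, given detection succeeded

    # Each sequence is binned into an end state; together they partition the IE frequency.
    end_states = {
        "safe_state":      scenario_frequency(ie_freq, [1 - p_detect_fail, 1 - p_mitigate_fail]),
        "degraded_ops":    scenario_frequency(ie_freq, [1 - p_detect_fail, p_mitigate_fail]),
        "core_compromise": scenario_frequency(ie_freq, [p_detect_fail]),
    }

    # Rank end states by frequency to decide which scenarios merit the most attention.
    for name, freq in sorted(end_states.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {freq:.3f}/yr")
    ```

    Ranking end states this way is one answer to the question posed above, of how to ascertain which of a nearly limitless set of scenarios has the greatest merit.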

  13. Scenario-based approach to risk analysis in support of cyber security

    International Nuclear Information System (INIS)

    Gertman, D. I.; Folkers, R.; Roberts, J.

    2006-01-01

    The US infrastructure is continually challenged by hostile nation states and others who would do us harm. Cyber vulnerabilities and weaknesses are potential targets and are the result of years of construction and technological improvement in a world less concerned with security than is currently the case. As a result, cyber attack presents a class of challenges for which we are just beginning to prepare. What has been done in the nuclear, chemical and energy sectors as a means of anticipating and preparing for randomly occurring accidents and off-normal events is to develop scenarios as a means by which to prioritize and quantify risk and to take action. However, the number of scenarios risk analysts can develop is almost limitless. How do we ascertain which scenario has the greatest merit? One of the more important contributions of probabilistic risk analysis (PRA) has been to quantify the initiating event probability associated with various classes of accidents; and to quantify the occurrence of various conditions, i.e., end-states, as a function of these important accident sequences. Typically, various classes of conditions are represented by scenarios and are quantified in terms of cut sets and binned into end states. For example, the nuclear industry has a well-defined set of initiating events that are studied in assessing risk. The maturation of risk analysis for cyber security from accounting for barriers or looking at conditions statically to one of ascertaining the probability associated with certain events is, in part, dependent upon the adoption of a scenario-based approach. For example, scenarios take into account threats to personnel and public safety; economic damage, and compromises to major operational and safety functions. Scenarios reflect system, equipment, and component configurations as well as key human-system interactions related to event detection, diagnosis, mitigation and restoration of systems. As part of a cyber attack directed toward
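
    The event-tree logic described above (an initiating event whose sequences are quantified and binned into end states) can be illustrated in a few lines. This is a toy sketch with invented names and numbers, not the authors' model; `quantify_scenarios` and the 0.1/yr initiating frequency are assumptions for illustration only.

```python
# Toy event-tree quantification: an initiating-event frequency is propagated
# through branch probabilities, and each sequence is binned into an end state.
# All names and numbers are illustrative, not taken from the paper.

def quantify_scenarios(init_freq, sequences):
    """Return {end_state: frequency} from (branch_probability, end_state) pairs."""
    totals = {}
    for prob, end_state in sequences:
        totals[end_state] = totals.get(end_state, 0.0) + init_freq * prob
    return totals

# Hypothetical cyber-attack initiating event at 0.1/yr with three sequences:
sequences = [
    (0.90, "detected_and_mitigated"),
    (0.08, "degraded_operation"),
    (0.02, "safety_function_compromised"),
]
result = quantify_scenarios(0.1, sequences)
print(result)
```

    Summing the end-state frequencies recovers the initiating-event frequency, which is a useful sanity check on any such binning.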

  14. Distributed Data Management on the Petascale using Heterogeneous Grid Infrastructures with DQ2

    CERN Document Server

    Branco, M; Salgado, P; Lassnig, M

    2008-01-01

    We describe Don Quijote 2 (DQ2), a new approach to the management of large scientific datasets by a dedicated middleware. This middleware is designed to handle the data organisation and data movement on the petascale for the High-Energy Physics Experiment ATLAS at CERN. DQ2 is able to maintain a well-defined quality of service in a scalable way, guarantees data consistency for the collaboration and bridges the gap between EGEE, OSG and NorduGrid infrastructures to enable true interoperability. DQ2 is specifically designed to support the access and management of large scientific datasets produced by the ATLAS experiment using heterogeneous Grid infrastructures. The DQ2 middleware manages those datasets with global services, local site services and end-user interfaces. The global services, or central catalogues, are responsible for the mapping of individual files onto DQ2 datasets. The local site services are responsible for tracking files available on-site, managing data movement and guaranteeing consistency of...
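
    The two-level catalogue structure described above (central catalogues mapping files onto datasets, site services tracking local replicas) can be sketched in miniature. The class names, dataset name and file identifiers below are invented for illustration; this is not the actual DQ2 schema.

```python
# Miniature sketch of a two-level data-management catalogue: a global
# file-to-dataset mapping plus a per-site replica table. Illustrative only.

class CentralCatalogue:
    """Global service: maps individual files onto named datasets."""
    def __init__(self):
        self.datasets = {}                     # dataset name -> set of file IDs

    def add_files(self, dataset, files):
        self.datasets.setdefault(dataset, set()).update(files)

    def files_in(self, dataset):
        return sorted(self.datasets.get(dataset, set()))

class SiteCatalogue:
    """Local site service: tracks which files are already available on-site."""
    def __init__(self):
        self.local_files = set()

    def missing_for(self, central, dataset):
        """Files still to be transferred before the dataset is complete here."""
        return [f for f in central.files_in(dataset) if f not in self.local_files]

central = CentralCatalogue()
central.add_files("mc.sample.dataset", ["file-1", "file-2", "file-3"])
site = SiteCatalogue()
site.local_files.add("file-2")
print(site.missing_for(central, "mc.sample.dataset"))  # ['file-1', 'file-3']
```

    Data movement then reduces to computing, per site, the difference between the global dataset definition and the local replica table.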

  15. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Science.gov (United States)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.

  16. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, helping to overcome past communication barriers and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  17. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
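
    The verb-noun pattern described above can be made concrete with a small dispatcher: the HTTP method selects the action and the URL path names the resource. The endpoint paths and the model name below are hypothetical; this is a generic sketch of the REST convention, not any specific agency's API.

```python
# Minimal verb-noun dispatcher: the HTTP method (verb) selects the action and
# the URL path (noun) names the resource. In-memory store, illustrative only.

runs = {}  # run id -> run record

def handle(method, path, body=None):
    parts = [p for p in path.strip("/").split("/") if p]
    if method == "POST" and parts[-1] == "runs":        # POST /models/x/runs
        run_id = str(len(runs) + 1)
        runs[run_id] = {"inputs": body, "status": "queued"}
        return 201, {"id": run_id}
    if method == "GET" and len(parts) >= 2 and parts[-2] == "runs":
        run = runs.get(parts[-1])                       # GET /models/x/runs/1
        return (200, run) if run else (404, None)
    return 405, None

print(handle("POST", "/models/examplemodel/runs", {"timestep": 60}))
print(handle("GET", "/models/examplemodel/runs/1"))
```

    In a real deployment the same verb-noun contract would be served over HTTP and described by a self-documenting Open API definition, as the abstract notes.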

  18. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    Science.gov (United States)

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  19. The Next Generation Information Infrastructure for International Trade

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Gal, Uri; Bjørn-Andersen, Niels

    2011-01-01

    Regulators and actors in international trade are facing a difficult challenge of increasing control and security while at the same time lowering the administrative burden for traders. As a tentative response, the European Commission has introduced the concept of “trusted traders”: certified traders that are in control of their business. Trusted traders are entitled to trade facilitations, faster border crossing, and fewer physical inspections. To enable the use of trusted traders, changes are required to the information infrastructure (II) of international trade. This article complements existing works on e-Government interoperability by a theoretically driven approach with theoretical development of the II concept and how II can be modified as additional focus. Following the principles of IS design research, this paper presents a design proposition for the II of international trade, using theories of II development and change...

  20. Assessing the risk posed by natural hazards to infrastructures

    Science.gov (United States)

    Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2017-03-01

    This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. 
depending on the required restoration effort) and the number of users of
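
    The indicator-based, semi-quantitative aggregation described above can be sketched as follows. The 1-5 ranking scale mirrors the paper's relative-scale idea, but the aggregation rule (a simple normalised average multiplied by frequency and consequence) and all numbers are assumptions for illustration, not the authors' actual formula.

```python
# Hedged sketch of indicator-based risk screening: vulnerability indicators
# are ranked 1-5, normalised, and combined with a hazard frequency and a
# consequence score. Weighting scheme and numbers are illustrative.

def vulnerability_score(indicators):
    """Average the 1-5 indicator ranks and normalise to the 0-1 range."""
    mean_rank = sum(indicators.values()) / len(indicators)
    return (mean_rank - 1) / 4

def risk_estimate(hazard_freq_per_year, vulnerability, consequence):
    """Screening-level risk: frequency x vulnerability x consequence."""
    return hazard_freq_per_year * vulnerability * consequence

vuln = vulnerability_score({
    "robustness_and_buffer": 4,
    "level_of_protection": 3,
    "maintenance_and_renewal": 2,
    "operational_procedures": 3,
    "coupling_complexity": 4,
})
print(vuln)                             # normalised vulnerability
print(risk_estimate(0.01, vuln, 0.7))   # screening-level estimate
```

    Scenarios whose screening estimate exceeds a chosen threshold would then be passed on to the detailed third-level analysis.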

  1. Modelling the South African fruit export infrastructure: A case study

    Directory of Open Access Journals (Sweden)

    FG Ortmann

    2006-06-01

    A description is provided of work performed as part of the fruit logistics infrastructure project commissioned by the South African Deciduous Fruit Producers’ Trust and coordinated by the South African Council for Scientific and Industrial Research, as described in [Van Dyk FE & Maspero E, 2004, An analysis of the South African fruit logistics infrastructure, ORiON, 20(1), pp. 55–72]. After a brief introduction to the problem, two models (a single-commodity graph-theoretic model and a multi-commodity mathematical programming model) are derived for determining the maximal weekly flow or throughput of fresh fruit through the South African national export infrastructure. These models are solved for two extreme seasonal export scenarios, and the solutions show that no export infrastructure expansion is required in the near future: observed bottlenecks are not fundamental to the infrastructure and its capacities, but are rather due to sub-optimal management and utilisation of the existing infrastructure.
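
    The single-commodity model referred to above is a maximum-flow computation over the export network. A compact sketch of such a computation (standard Edmonds-Karp augmenting paths) is shown below; the node names and weekly capacities are invented for illustration and are not the study's data.

```python
from collections import deque

# Standard Edmonds-Karp maximum flow on a tiny illustrative export network.
# cap is a nested dict of residual capacities: cap[u][v] = remaining capacity.

def max_flow(cap, source, sink):
    flow = 0
    while True:
        parent = {source: None}                  # BFS for an augmenting path
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow                          # no augmenting path left
        path, v = [], sink                       # walk back to the source
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] for u, v in path)   # bottleneck capacity
        for u, v in path:                        # update the residual network
            cap[u][v] -= push
            cap.setdefault(v, {}).setdefault(u, 0)
            cap[v][u] += push
        flow += push

# Invented weekly pallet capacities, farm -> packhouses -> port:
network = {
    "farm": {"packhouse_A": 60, "packhouse_B": 40},
    "packhouse_A": {"port": 50},
    "packhouse_B": {"port": 45},
    "port": {},
}
weekly_throughput = max_flow(network, "farm", "port")
print(weekly_throughput)  # 90
```

    On real data, the arcs saturated in the optimal flow identify exactly the kind of bottlenecks the study attributes to sub-optimal utilisation rather than to the infrastructure itself.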

  2. Epimenides: Interoperability Reasoning for Digital Preservation

    NARCIS (Netherlands)

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

    This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can model also converters and emulators, and the adopted modelling approach enables the automatic reasoning

  3. Visions, Scenarios and Action Plans Towards Next Generation Tanzania Power System

    Directory of Open Access Journals (Sweden)

    Alex Kyaruzi

    2012-10-01

    This paper presents strategic visions, scenarios and action plans for enhancing the Tanzanian power system towards a next-generation smart power grid. It first introduces the present Tanzanian power grid and the challenges ahead in terms of generation capacity, financial aspects, technical and non-technical losses, revenue loss, high tariffs, aging infrastructure, environmental impact and the interconnection with the neighboring countries. Then, the current initiatives undertaken by the Tanzanian government in response to the present challenges and the expected roles of the smart grid in overcoming these challenges in the future with respect to the scenarios presented are discussed. The developed scenarios, along with visions and recommended action plans towards the future Tanzanian power system, can be exploited at all governmental levels to achieve public policy goals and help develop business opportunities by motivating domestic and international investments in modernizing the nation’s electric power infrastructure. In return, it should help build the green energy economy.

  4. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Science.gov (United States)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have argued that data interoperability issues arise at four levels: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing schematic and semantic interoperability issues of geodata have been studied extensively in the last decade. Ontologies exist in different types (e.g. top-level, domain and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers maintain their own local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared with the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location
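
    The semantic-mediation step discussed above (mapping local ontology terms to a common ontology) can be sketched minimally. All vocabulary below is invented for illustration; it is not drawn from NAMD, SWEET or any real local ontology.

```python
# Minimal semantic mediation: terms from two local application ontologies are
# mapped to a shared common-ontology concept so that multi-source records can
# be harmonised. All terms are invented for illustration.

common_mapping = {
    ("surveyA", "granite_rock"): "common:Granite",
    ("surveyB", "granit"):       "common:Granite",
    ("surveyA", "river_sand"):   "common:Sand",
}

def mediate(source, local_term):
    """Translate a local term to the common ontology; None if unmapped."""
    return common_mapping.get((source, local_term))

records = [("surveyA", "granite_rock"), ("surveyB", "granit")]
harmonised = {mediate(source, term) for source, term in records}
print(harmonised)  # both local terms resolve to the same common concept
```

    The pragmatic problem the abstract raises is visible even here: the mapping table discards who produced the term, when and where, which is exactly the local context that mediation tends to eliminate.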

  5. Towards an IT infrastructure for compliance management by data interoperability : The changing role of authorities

    OpenAIRE

    Hofman, W.J.; Bastiaansen, H.J.M.

    2013-01-01

    Following 9/11, the Customs-Trade Partnership Against Terrorism (C-TPAT) was launched as an initiative to increase container security. Through the Entry Summary Declaration (ENS), authorities require shipping lines to submit data in a timely manner to the first port of call in the EC. However, an ENS contains insufficient data for proper risk analysis. This paper presents an IT infrastructure to capture so-called upstream data that allows customs to match delivery data with container data. It proposes Semantic Web technol...

  6. Design and Implement AN Interoperable Internet of Things Application Based on AN Extended Ogc Sensorthings Api Standard

    Science.gov (United States)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked in multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to address the heterogeneity issue is to follow open and interoperable standard. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports comprehensive conceptual model and query functionalities. The first version of SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices could also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims on defining the tasking capability profile and integrates with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. 
Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices.
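
    To make the idea of a lightweight JSON-based description concrete, the sketch below shows what such a "Tasking Capability Description" might look like for a connected lamp, together with a check of a requested task against it. The field names are invented for illustration; they are not the standardised extended-SensorThings schema.

```python
# Hypothetical JSON-style tasking capability description for a connected lamp,
# and a validator that checks a requested task against the declared parameters.
# Field names are illustrative, not the actual extended-SensorThings schema.

tasking_capability = {
    "name": "SetLampState",
    "description": "Switch the lamp on/off and set its brightness",
    "taskingParameters": [
        {"name": "power", "type": "Boolean"},
        {"name": "brightness", "type": "Integer", "range": [0, 100]},
    ],
}

def validate_task(capability, task):
    """Accept a task only if every parameter it uses is declared."""
    declared = {p["name"] for p in capability["taskingParameters"]}
    return set(task) <= declared

print(validate_task(tasking_capability, {"power": True, "brightness": 80}))  # True
print(validate_task(tasking_capability, {"color": "red"}))                   # False
```

    The point of such a description is that a hub or client needs only the declaration, not the manufacturer's proprietary protocol, to know which tasks a device accepts.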

  7. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between the ICD-11 (International Classification of Diseases, 11th revision, Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10, on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed by the compositional grammar of SCT, together with queries on the axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.

  8. CERN Infrastructure Evolution

    CERN Document Server

    Bell, Tim

    2012-01-01

    The CERN Computer Centre is reviewing strategies for optimizing the use of its existing infrastructure in the future, in the likely scenario that any extension will be remote from CERN, and in light of the way other large facilities are operated today. Over the past six months, CERN has been investigating modern and widely used tools and procedures for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote computer centres. This presentation will give details on the project’s motivations, current status and areas for future investigation.

  9. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required by safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest, as well as advanced physical modelling techniques, to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time, a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA), which relies on the projections of the Gutenberg-Richter (G-R) equation. The problems with the validity of G-R projections, caused by incomplete or entirely absent data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in design decisions for critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited number of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  10. The development of a cislunar space infrastructure

    Science.gov (United States)

    Buck, C. A.; Johnson, A. S.; Mcglinchey, J. M.; Ryan, K. D.

    1989-01-01

    The primary objective of this Advanced Mission Design Program is to define the general characteristics and phased evolution of a near-Earth space infrastructure. The envisioned foundation includes a permanently manned, self-sustaining base on the lunar surface, a space station at the Libration Point between earth and the moon (L1), and a transportation system that anchors these elements to the Low Earth Orbit (LEO) station. The implementation of this conceptual design was carried out with the idea that the infrastructure is an important step in a larger plan to expand man's capabilities in space science and technology. Such expansion depends on low cost, reliable, and frequent access to space for those who wish to use the multiple benefits of this environment. The presence of a cislunar space infrastructure would greatly facilitate the staging of future planetary missions, as well as the full exploration of the lunar potential for science and industry. The rationale for, and a proposed detailed scenario in support of, the cislunar space infrastructure are discussed.

  11. Cyber Security Insider Threats :: Government’s Role in Protecting India’s Critical Infrastructure Sectors

    OpenAIRE

    Vohra, Pulkit

    2014-01-01

    This research identifies the problem of insider threats in the critical infrastructure sectors of India. It is structured to answer the research question: "Why insider threats should be the primary concern for Indian government to protect its critical infrastructure sectors.” It defines the critical infrastructure sectors and portrays the cyber security scenario of India. Also, through the research study, it identifies the lack of awareness and non-seriousness of employees in the critical sec...

  12. Interoperability in practice: case study of the Slovenian independence war of 1991

    Directory of Open Access Journals (Sweden)

    Vladimir Prebilič

    2015-08-01

    The paper will examine the theory of the interoperability of armed forces through the case of the Slovenian Independence War of 1991. Although defense system interoperability is a well-established concept, there are many obstacles to its implementation. Some defense systems do not deliberately support the idea of interoperability. One such example is the total defense system in SFR Yugoslavia, which comprised two defense components: the Yugoslav People’s Army (YPA) and territorial defense structures organized by the federal republic. The question of interoperability is highly relevant since the war was fought between the YPA and the defense forces of the newly proclaimed independent state, Slovenia, which were partners in the total defense concept. Due to the clear asymmetry, interoperability offered a great advantage in the independence war. The Slovenian defense forces were combined into three structures: the former militia as an internal security element, the territorial defense as a military component, and the national protection forces as a "civil" defense element. Although each structure had its own command and organizational structure, during the Slovenian War they were combined into a well-structured and organized defense element that achieved victory against a much stronger, better equipped, and better supported army.

  13. Semantic and syntactic interoperability in online processing of big Earth observation data.

    Science.gov (United States)

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements to efficiently extract valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in a centralised storage as 3D spatio-temporal data cubes. This also allows non-experts to extract valuable information from EO data because data management, low-level interactions or specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
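
    The wrapping idea described above (a formal "world model" exposed behind one standardised process interface) can be sketched as follows. The interface shape and the deforestation rule are invented for illustration; this is not an actual OGC WPS implementation.

```python
# Toy sketch: a "world model" (a formal rule about spatio-temporal entities)
# registered behind one generic execute() interface, so every client calls
# every model the same way. Illustrative only, not an actual OGC WPS.

def deforestation_model(pixel_series):
    """World model: 'forest that becomes non-forest is a loss event'."""
    return [i for i in range(1, len(pixel_series))
            if pixel_series[i - 1] == "forest" and pixel_series[i] != "forest"]

PROCESSES = {"deforestation": deforestation_model}

def execute(process_id, inputs):
    """Generic endpoint: one calling convention for every registered model."""
    return {"process": process_id, "result": PROCESSES[process_id](inputs)}

print(execute("deforestation", ["forest", "forest", "bare", "forest", "bare"]))
```

    Registering a new world model adds domain semantics without changing the syntactic contract, which is the separation of concerns the abstract argues for.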

  14. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. On the Use of Geographic Information in Humanities Research Infrastructure: A Case Study on Cultural Heritage

    Directory of Open Access Journals (Sweden)

    Albina Mościcka

    2018-03-01

    As an invaluable source of knowledge about the past, cultural heritage may be an important element of the humanities research infrastructure, along with other elements, such as spatial references. Therefore, this paper attempts to answer the questions concerning the ways in which spatial information can contribute to the development of this infrastructure and the aspects of storytelling based on cultural resources that can be supported by such infrastructure. The objective of the methodology used was to combine the aspects that refer to spatial information and cultural items into a single, common issue, and to describe them in a formalized way with the use of the Unified Modeling Language (UML). As a result, the study presents a proposal of the Humanities Infrastructure Architecture based on spatially-oriented movable cultural items, taking into account their use in the context of interoperability, along with the concept of creating spatial databases that would include movable monuments. The authors also demonstrate that the ISO 19100 series of geographical information standards may be a source of interesting conceptual solutions that may be used in the process of the standardization of geographical information recorded in the descriptions of cultural heritage items in the form of metadata and data structure descriptions.

  17. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Science.gov (United States)

    2011-01-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference January 13, 2011. On December 21, 2010, the Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability Standards will be held on Monday...

  18. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  19. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    Science.gov (United States)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and access tools for using it. Based on an analysis of requirements, we selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form; conformance in terms of semantics thus cannot be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model, and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.
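
    As a minimal illustration of turning a textual semantic constraint into a machine-checkable one (the rule, property names, and sample data below are invented for illustration; the paper expresses such rules in OWL, not Python):

```python
# Illustrative constraint check over MPEG-7-style metadata, stored here as a
# tiny set of (subject, predicate, object) triples. Example rule (invented):
# every VideoSegment description must carry an mpeg7:MediaTime value.

triples = {
    ("seg1", "rdf:type", "mpeg7:VideoSegment"),
    ("seg1", "mpeg7:MediaTime", "T00:00:00-T00:01:30"),
    ("seg2", "rdf:type", "mpeg7:VideoSegment"),  # seg2 lacks a MediaTime
}

def violates_media_time_rule(store):
    """Return subjects typed as VideoSegment that lack an mpeg7:MediaTime."""
    segments = {s for (s, p, o) in store
                if p == "rdf:type" and o == "mpeg7:VideoSegment"}
    timed = {s for (s, p, o) in store if p == "mpeg7:MediaTime"}
    return sorted(segments - timed)
```

    Formalizing the constraint this way is what makes automated conformance evaluation possible, which the textual form of the profile cannot offer.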

  20. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Science.gov (United States)

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  1. A scenario elicitation methodology to identify the drivers of electricity infrastructure cost in South America

    Science.gov (United States)

    Moksnes, Nandi; Taliotis, Constantinos; Broad, Oliver; de Moura, Gustavo; Howells, Mark

    2017-04-01

    Developing a set of scenarios to assess a proposed policy or future development pathways requires a certain level of information, as well as an established socio-economic context. As the future is difficult to predict, great care is needed in defining the selected scenarios; even so, it can be difficult to assess whether the selected scenarios cover the possible solution space. Instead, this paper's methodology develops a large set of scenarios (324) in OSeMOSYS using the SAMBA 2.0 (South America Model Base) model to assess long-term electricity supply scenarios, and applies a scenario-discovery statistical data-mining algorithm, the Patient Rule Induction Method (PRIM). By creating a multidimensional space, regions of high and low cost can be identified, along with their key drivers. The six key drivers are defined a priori at three levels (high, medium, low) or two levels (high, low): 1) demand projected from GDP, population, urbanization and transport, 2) fossil fuel price, 3) climate change impact on hydropower, 4) renewable technology learning rate, 5) discount rate, 6) CO2 emission targets.
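
    The scenario-discovery step can be sketched with a toy version of PRIM's "peeling" phase (a simplified stand-in for the actual PRIM algorithm; the driver names and synthetic data are invented for illustration):

```python
import random

def prim_peel(points, drivers, is_interesting, alpha=0.1, min_points=5):
    """Iteratively shrink a box over the driver space: at each step, remove the
    alpha-fraction slice (from either edge of one driver) that most increases
    the share of "interesting" (e.g. high-cost) scenarios inside the box."""
    box = {d: (min(p[d] for p in points), max(p[d] for p in points)) for d in drivers}
    inside = list(points)

    def density(pts):
        return sum(is_interesting(p) for p in pts) / len(pts) if pts else 0.0

    while len(inside) > min_points:
        best = None
        for d in drivers:
            vals = sorted(p[d] for p in inside)
            k = max(1, int(alpha * len(vals)))
            # Candidate peels: drop the lowest-k or the highest-k values of d.
            for lo, hi in ((vals[k], box[d][1]), (box[d][0], vals[-k - 1])):
                kept = [p for p in inside if lo <= p[d] <= hi]
                if kept and (best is None or density(kept) > best[0]):
                    best = (density(kept), d, (lo, hi), kept)
        if best is None or best[0] <= density(inside):
            break  # no peel improves the density inside the box: stop
        _, d, bounds, inside = best
        box[d] = bounds
    return box, inside

# Synthetic experiment: 200 scenarios over two drivers, where high cost is in
# fact driven only by a high fuel price (threshold 0.7).
random.seed(0)
scenarios = [{"fuel": random.random(), "demand": random.random()} for _ in range(200)]
high_cost = lambda p: p["fuel"] > 0.7
box, kept = prim_peel(scenarios, ["fuel", "demand"], high_cost)
# The recovered box pins down "fuel" as the key driver: its lower bound is
# pushed up toward the true 0.7 threshold, while "demand" stays largely free.
```

    The actual study applies PRIM over six drivers and 324 model runs; the mechanics of identifying high-cost regions and their drivers are the same.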

  2. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2009-05-01

    Full Text Available Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; they thus cannot coexist in isolation but require their data to be shared so as to increase productivity. Such systems' interoperability is normally accomplished through mark-up standards, query languages and web services. The literature contains work related to software system interoperability; however, it presents some difficulties, such as the need to use the same platforms and different programming languages, the use of read-only languages, and deficiencies in the formalisms used for achieving interoperability. This paper presents a critical review of the advances made regarding heterogeneous software systems' interoperability.

  3. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    Science.gov (United States)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows for a quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP Collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making a highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.

  4. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  5. Envri Cluster - a Community-Driven Platform of European Environmental Researcher Infrastructures for Providing Common E-Solutions for Earth Science

    Science.gov (United States)

    Asmi, A.; Sorvari, S.; Kutsch, W. L.; Laj, P.

    2017-12-01

    European long-term environmental research infrastructures (often referred to as ESFRI RIs) are the core facilities providing services for scientists in their quest to understand and predict the complex Earth system and its functioning, which requires long-term efforts to identify environmental changes (trends, thresholds and resilience, interactions and feedbacks). Many of the research infrastructures were originally developed to respond to the needs of their specific research communities; however, it is clear that strong collaboration among research infrastructures is needed to serve trans-boundary research, which requires exploring scientific questions at the intersection of different scientific fields, conducting joint research projects, and developing concepts, devices, and methods that can be used to integrate knowledge. European environmental research infrastructures have already worked together successfully for many years and have established a cluster - the ENVRI cluster - for their collaborative work. The ENVRI cluster acts as a collaborative platform where the RIs can jointly agree on common solutions for their operations, draft strategies and policies, and share best practices and knowledge. The supporting project for the ENVRI cluster, the ENVRIplus project, brings together 21 European research infrastructures and infrastructure networks to work on joint technical solutions, data interoperability, access management, training, strategies and dissemination efforts. The ENVRI cluster acts as a one-stop shop for multidisciplinary RI users, other collaborative initiatives, projects and programmes, and coordinates and implements jointly agreed RI strategies.

  6. Interoperability as a quality label for portable & wearable health monitoring systems.

    Science.gov (United States)

    Chronaki, Catherine E; Chiarugi, Franco

    2005-01-01

    Advances in ICT promising universal access to high quality care, reduction of medical errors, and containment of health care costs, have renewed interest in electronic health records (EHR) standards and resulted in comprehensive EHR adoption programs in many European states. Health cards, and in particular the European health insurance card, present an opportunity for instant cross-border access to emergency health data including allergies, medication, even a reference ECG. At the same time, research and development in miniaturized medical devices and wearable medical sensors promise continuous health monitoring in a comfortable, flexible, and fashionable way. These trends call for the seamless integration of medical devices and intelligent wearables into an active EHR exploiting the vast information available to increase medical knowledge and establish personal wellness profiles. In a mobile connected world with empowered health consumers and fading barriers between health and healthcare, interoperability has a strong impact on consumer trust. As a result, current interoperability initiatives are extending the traditional standardization process to embrace implementation, validation, and conformance testing. In this paper, starting from the OpenECG initiative, which promotes the consistent implementation of interoperability standards in electrocardiography and supports a worldwide community with data sets, open source tools, specifications, and online conformance testing, we discuss EHR interoperability as a quality label for personalized health monitoring systems. Such a quality label would support big players and small enterprises in creating interoperable eHealth products, while opening the way for pervasive healthcare and the take-up of the eHealth market.

  7. XACML profile and implementation for authorization interoperability between OSG and EGEE

    International Nuclear Information System (INIS)

    Garzoglio, G; Altunay, M; Chadwick, K; Hesselroth, T D; Levshina, T; Sfiligoi, I; Alderman, I; Miller, Z; Ananthakrishnan, R; Bester, J; Ciaschini, V; Ferraro, A; Forti, A; Demchenko, Y; Groep, D; Koeroo, O; Hover, J; Packard, J; Joie, C La; Sagehaug, H

    2010-01-01

    The Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) have a common security model, based on Public Key Infrastructure. Grid resources grant access to users because of their membership in a Virtual Organization (VO), rather than on personal identity. Users push VO membership information to resources in the form of identity attributes, thus declaring that resources will be consumed on behalf of a specific group inside the organizational structure of the VO. Resources contact an access policies repository, centralized at each site, to grant the appropriate privileges for that VO group. Before the work in this paper, despite the commonality of the model, OSG and EGEE used different protocols for the communication between resources and the policy repositories. Hence, middleware developed for one Grid could not naturally be deployed on the other Grid, since the authorization module of the middleware would have to be enhanced to support the other Grid's communication protocol. In addition, maintenance and support for different authorization call-out protocols represents a duplication of effort for our relatively small community. To address these issues, OSG and EGEE initiated a joint project on authorization interoperability. The project defined a common communication protocol and attribute identity profile for authorization call-out and provided implementation and integration with major Grid middleware. The activity had resonance with middleware development communities, such as the Globus Toolkit and Condor, who decided to join the collaboration and contribute requirements and software. In this paper, we discuss the main elements of the profile, its implementation, and deployment in EGEE and OSG. We focus in particular on the operations of the authorization infrastructures of both Grids.
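
    The call-out pattern the profile standardizes can be illustrated with a toy policy decision point (PDP). The VO names, group attributes, and account mappings below are invented, and this sketches only the idea, not the XACML wire format or attribute profile itself:

```python
# Toy site-central PDP: a resource pushes a user's VO-membership attributes;
# the PDP answers Permit/Deny plus an obligation telling the resource which
# local account to map the request onto. All policy entries are invented.

SITE_POLICY = {
    # (VO, group attribute) -> local account for that VO group
    ("cms", "/cms/production"): "cmsprod",
    ("cms", "/cms"): "cmsuser",
    ("atlas", "/atlas"): "atlasuser",
}

def pdp_decide(subject_dn, vo, group):
    """Decide one authorization request from its identity attributes."""
    account = SITE_POLICY.get((vo, group))
    if account is None:
        return {"decision": "Deny", "subject": subject_dn}
    # The profile's obligation mechanism: instruct the resource how to act
    # on a Permit, here by naming the account to run under.
    return {"decision": "Permit", "subject": subject_dn,
            "obligations": {"username": account}}
```

    Agreeing on one such request/response contract is what let middleware built for one Grid be deployed on the other without per-Grid authorization modules.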

  8. XACML profile and implementation for authorization interoperability between OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, G.; Alderman, I.; Altunay, M.; Ananthakrishnan, R.; Bester, J.; Chadwick, K.; Ciaschini, V.; Demchenko, Y.; Ferraro, A.; Forti, A.; Groep, D.; /Fermilab /Wisconsin U., Madison /Argonne /INFN, CNAF /Amsterdam U. /NIKHEF, Amsterdam /Brookhaven /SWITCH, Zurich /Bergen Coll. Higher Educ.

    2009-05-01

    The Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) have a common security model, based on Public Key Infrastructure. Grid resources grant access to users because of their membership in a Virtual Organization (VO), rather than on personal identity. Users push VO membership information to resources in the form of identity attributes, thus declaring that resources will be consumed on behalf of a specific group inside the organizational structure of the VO. Resources contact an access policies repository, centralized at each site, to grant the appropriate privileges for that VO group. Before the work in this paper, despite the commonality of the model, OSG and EGEE used different protocols for the communication between resources and the policy repositories. Hence, middleware developed for one Grid could not naturally be deployed on the other Grid, since the authorization module of the middleware would have to be enhanced to support the other Grid's communication protocol. In addition, maintenance and support for different authorization call-out protocols represents a duplication of effort for our relatively small community. To address these issues, OSG and EGEE initiated a joint project on authorization interoperability. The project defined a common communication protocol and attribute identity profile for authorization call-out and provided implementation and integration with major Grid middleware. The activity had resonance with middleware development communities, such as the Globus Toolkit and Condor, who decided to join the collaboration and contribute requirements and software. In this paper, we discuss the main elements of the profile, its implementation, and deployment in EGEE and OSG. We focus in particular on the operations of the authorization infrastructures of both Grids.

  9. CCS Infrastructure Development Scenarios for the Integrated Iberian Peninsula and Morocco Energy System

    NARCIS (Netherlands)

    Kanudia, A.; Berghout, N.A.; Boavida, D.; van den Broek, M.A.

    2013-01-01

    This paper briefly illustrates a method to represent national energy systems and the geographical details of CCS infrastructures in the same technical-economic model. In the MARKAL-TIMES modeling framework a model of Morocco, Portugal and Spain with both spatial and temporal details has been

  10. On the Impact of using Public Network Communication Infrastructure for Voltage Control Coordination in Smart Grid Scenario

    DEFF Research Database (Denmark)

    Shahid, Kamal; Petersen, Lennart; Iov, Florin

    2017-01-01

    voltage controlled distribution system. A cost effective way to connect the ReGen plants to the control center is to consider the existing public network infrastructure. This paper, therefore, illustrates the impact of using the existing public network communication infrastructure for online voltage...

  11. Assessing large-scale wildlife responses to human infrastructure development.

    Science.gov (United States)

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.
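
    A functional response curve of the kind used here can be sketched as a distance-decay function; the linear form and the 1.5 km effect zone below are illustrative assumptions, not the paper's fitted curves:

```python
def fraction_remaining(distance_km, effect_zone_km=1.5):
    """Toy linear response curve: full displacement at the infrastructure,
    no measurable effect beyond the effect-zone distance."""
    return min(distance_km / effect_zone_km, 1.0)

def mean_density_loss(distances_km, effect_zone_km=1.5):
    """Average predicted density reduction over sampled locations, each given
    by its distance to the nearest transportation infrastructure."""
    losses = [1.0 - fraction_remaining(d, effect_zone_km) for d in distances_km]
    return sum(losses) / len(losses)
```

    Applying curves like this over a gridded map of distances to roads is what yields aggregate figures such as the predicted declines for birds and mammals.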

  12. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  13. Improving global data infrastructures for more effective and scalable analysis of Earth and environmental data: the Australian NCI NERDIP Approach

    Science.gov (United States)

    Evans, Ben; Wyborn, Lesley; Druken, Kelsey; Richards, Clare; Trenham, Claire; Wang, Jingbo; Rozas Larraondo, Pablo; Steer, Adam; Smillie, Jon

    2017-04-01

    The National Computational Infrastructure (NCI) facility hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans, and geophysics through to astronomy, bioinformatics, and the social sciences domains. The data are obtained from national and international sources, spanning a wide range of gridded and ungridded (i.e., line surveys, point clouds) data, and raster imagery, as well as diverse coordinate reference projections and resolutions. Rather than managing these data assets as a digital library, whereby users can discover and download files to personal servers (similar to borrowing 'books' from a 'library'), NCI has built an extensive and well-integrated research data platform, the National Environmental Research Data Interoperability Platform (NERDIP, http://nci.org.au/data-collections/nerdip/). The NERDIP architecture enables programmatic access to data via standards-compliant services for high performance data analysis, and provides a flexible cloud-based environment to facilitate the next generation of transdisciplinary scientific research across all data domains. To improve use of modern scalable data infrastructures that are focused on efficient data analysis, the data organisation needs to be carefully managed including performance evaluations of projections and coordinate systems, data encoding standards and formats. A complication is that we have often found multiple domain vocabularies and ontologies are associated with equivalent datasets. It is not practical for individual dataset managers to determine which standards are best to apply to their dataset as this could impact accessibility and interoperability. Instead, they need to work with data custodians across interrelated communities and, in partnership with the data repository, the international scientific community to determine the most useful approach. For the data repository, this approach is essential to enable

  14. EarthCube Cyberinfrastructure: The Importance of and Need for International Strategic Partnerships to Enhance Interconnectivity and Interoperability

    Science.gov (United States)

    Ramamurthy, M. K.; Lehnert, K.; Zanzerkia, E. E.

    2017-12-01

    The United States National Science Foundation's EarthCube program is a community-driven activity aimed at transforming the conduct of geosciences research and education by creating a well-connected cyberinfrastructure for sharing and integrating data and knowledge across all geoscience disciplines in an open, transparent, and inclusive manner, and to accelerate our ability to understand and predict the Earth system. After five years of community engagement, governance, and development activities, EarthCube is now transitioning into an implementation phase. In the first phase of implementing the EarthCube architecture, the project leadership has identified the following architectural components as the top three priorities, focused on technologies, interfaces and interoperability elements that will address: a) Resource Discovery; b) Resource Registry; and c) Resource Distribution and Access. Simultaneously, EarthCube is exploring international partnerships to leverage synergies with other e-infrastructure programs and projects in Europe, Australia, and other regions, and to discuss potential partnerships and mutually beneficial collaborations to increase interoperability of systems for advancing EarthCube's goals in an efficient and effective manner. In this session, we will present the progress of EarthCube on a number of fronts and engage geoscientists and data scientists in the future steps toward the development of EarthCube for advancing research and discovery in the geosciences. The talk will underscore the importance of strategic partnerships with similar eScience projects and programs across the globe.

  15. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    Science.gov (United States)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface for requesting geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is based solely on Free and Open Source Software and relies on the JavaScript API WebGL, which allows interactive, GPU-accelerated rendering of 2D and 3D graphics within the web page canvas without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web
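
    The OGC services named above are driven by key-value-pair requests; the sketch below composes standard WFS 2.0 GetFeature and WMS 1.3.0 GetMap URLs. The endpoint and layer names are placeholders, not the project's actual services:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def wfs_get_feature(base_url, type_name, count=10):
    """Compose a WFS 2.0 GetFeature request URL (KVP encoding)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "count": str(count),
        "outputFormat": "application/json",
    }
    return f"{base_url}?{urlencode(params)}"

def wms_get_map(base_url, layer, bbox, size=(512, 512)):
    """Compose a WMS 1.3.0 GetMap request URL for a geo-referenced map image."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": layer,
        "bbox": ",".join(map(str, bbox)),
        "crs": "EPSG:4326",
        "width": str(size[0]),
        "height": str(size[1]),
        "format": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"
```

    Because these request shapes are standardized, any OGC-compliant client can query the infrastructure node without knowing its internals.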

  16. Sensing Models and Sensor Network Architectures for Transport Infrastructure Monitoring in Smart Cities

    Science.gov (United States)

    Simonis, Ingo

    2015-04-01

    Transport infrastructure monitoring and analysis is one of the focus areas in the context of smart cities. With the growing number of people moving into densely populated urban metro areas, precise tracking of moving people and goods is the basis for profound decision-making and future planning. With the goal of defining optimal extensions and modifications to existing transport infrastructures, multi-modal transport has to be monitored and analysed. This process is performed on the basis of sensor networks that combine a variety of sensor models, types, and deployments within the area of interest. Multi-generation networks, consisting of a number of sensor types and versions, cause further challenges for the integration and processing of sensor observations. These challenges are not getting any smaller with the development of the Internet of Things, which brings promising opportunities but is currently stuck in a kind of protocol war between big industry players from both the hardware and network infrastructure domains. In this paper, we highlight how the OGC suite of standards, with the Sensor Web standards developed by the Sensor Web Enablement Initiative together with the latest developments by the Sensor Web for Internet of Things community, can be applied to the monitoring and improvement of transport infrastructures. Sensor Web standards have been applied in the past to purely technical domains, but now need to be broadened in order to meet new challenges. Only cross-domain approaches will allow the development of satisfactory transport infrastructure solutions that take into account requirements coming from a variety of sectors such as tourism, administration, the transport industry, emergency services, or private individuals. The goal is the development of interoperable components that can be easily integrated within data infrastructures and follow well-defined information models to allow robust processing.

  17. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  18. Supporting NEESPI with Data Services - The SIB-ESS-C e-Infrastructure

    Science.gov (United States)

    Gerlach, R.; Schmullius, C.; Frotscher, K.

    2009-04-01

    Data discovery and retrieval are commonly among the first steps performed in any Earth science study. The way scientific data are searched and accessed has changed significantly over the past two decades. In particular, the development of the World Wide Web and the technologies that evolved along with it shortened the data discovery and data exchange process. On the other hand, the amount of data collected and distributed by Earth scientists has increased exponentially, requiring new concepts for data management and sharing. One such concept to meet this demand is to build up Spatial Data Infrastructures (SDI) or e-Infrastructures. These infrastructures usually contain components for data discovery, allowing users (or other systems) to query a catalogue or registry and retrieve metadata on available data holdings and services. Data access is typically granted using FTP/HTTP protocols or, more advanced, through Web Services. A Service Oriented Architecture (SOA) approach based on standardized services enables users to benefit from interoperability among different systems and to integrate distributed services into their applications. The Siberian Earth System Science Cluster (SIB-ESS-C) being established at the University of Jena (Germany) is such a spatial data infrastructure, following these principles and implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). The prime objective is to provide researchers with a focus on Siberia with the technical means for data discovery, data access, data publication and data analysis. The region of interest covers the entire Asian part of the Russian Federation from the Ural to the Pacific Ocean, including the Ob-, Lena- and Yenissey river catchments. The aim of SIB-ESS-C is to provide a comprehensive set of data products for Earth system science in this region. Although SIB-ESS-C will be equipped with processing capabilities for in-house data generation

  19. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse in an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  20. A Smart City Lighting Case Study on an OpenStack-Powered Infrastructure.

    Science.gov (United States)

    Merlino, Giovanni; Bruneo, Dario; Distefano, Salvatore; Longo, Francesco; Puliafito, Antonio; Al-Anbuky, Adnan

    2015-07-06

    The adoption of embedded systems, mobile devices and other smart devices keeps rising globally, and the scope of their involvement broadens, for instance, in smart city-like scenarios. In light of this, a pressing need emerges to tame such complexity and reuse as much tooling as possible without resorting to vertical ad hoc solutions, while at the same time taking into account valid options with regard to infrastructure management and other more advanced functionalities. Existing solutions mainly focus on core mechanisms and do not allow one to scale by leveraging infrastructure or adapt to a variety of scenarios, especially if actuators are involved in the loop. A new, more flexible, cloud-based approach, able to provide device-focused workflows, is required. In this sense, a widely-used and competitive framework for infrastructure as a service, such as OpenStack, with its breadth in terms of feature coverage and expanded scope, looks to fit the bill, replacing current application-specific approaches with an innovative application-agnostic one. This work thus describes the rationale, efforts and results so far achieved for an integration of IoT paradigms and resource ecosystems with such a kind of cloud-oriented device-centric environment, by focusing on a smart city scenario, namely a park smart lighting example, and featuring data collection, data visualization, event detection and coordinated reaction, as example use cases of such integration.
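
    The "event detection and coordinated reaction" use case can be reduced to a sketch like the following; the lamp identifiers and the lux threshold are invented, and the paper realizes this loop on OpenStack-managed devices rather than in plain Python:

```python
# Minimal sketch of the park smart-lighting reaction step: ambient-light
# readings collected from sensors are mapped to coordinated on/off commands
# for the lamps. Identifiers and the 20-lux threshold are illustrative.

def react_to_readings(readings, lux_threshold=20):
    """Map ambient-light readings (lamp_id -> lux) to lamp commands:
    switch a lamp on when ambient light drops below the threshold."""
    return {lamp: ("on" if lux < lux_threshold else "off")
            for lamp, lux in readings.items()}
```

    In the cloud-based design described above, the same decision logic runs application-agnostically, with the infrastructure layer handling device access.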

  1. A Smart City Lighting Case Study on an OpenStack-Powered Infrastructure

    Science.gov (United States)

    Merlino, Giovanni; Bruneo, Dario; Distefano, Salvatore; Longo, Francesco; Puliafito, Antonio; Al-Anbuky, Adnan

    2015-01-01

    The adoption of embedded systems, mobile devices and other smart devices keeps rising globally, and the scope of their involvement broadens, for instance, in smart city-like scenarios. In light of this, a pressing need emerges to tame such complexity and reuse as much tooling as possible without resorting to vertical ad hoc solutions, while at the same time taking into account valid options with regard to infrastructure management and other more advanced functionalities. Existing solutions mainly focus on core mechanisms and do not allow one to scale by leveraging infrastructure or adapt to a variety of scenarios, especially if actuators are involved in the loop. A new, more flexible, cloud-based approach, able to provide device-focused workflows, is required. In this sense, a widely-used and competitive framework for infrastructure as a service, such as OpenStack, with its breadth in terms of feature coverage and expanded scope, looks to fit the bill, replacing current application-specific approaches with an innovative application-agnostic one. This work thus describes the rationale, efforts and results so far achieved for an integration of IoT paradigms and resource ecosystems with such a kind of cloud-oriented device-centric environment, by focusing on a smart city scenario, namely a park smart lighting example, and featuring data collection, data visualization, event detection and coordinated reaction, as example use cases of such integration. PMID:26153775
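
    The device-centric workflow described above (data collection, event detection, coordinated reaction) can be sketched in miniature. This is an illustrative stand-alone loop, not OpenStack code: the sensor and lamp names, the threshold, and the helper functions are all hypothetical.

    ```python
    import random

    # Hypothetical sketch of the park smart-lighting use case: collect
    # luminosity readings, detect an event (nightfall), and react in a
    # coordinated way by actuating every lamp. Names and the threshold
    # are illustrative, not part of any OpenStack API.
    NIGHT_THRESHOLD_LUX = 20.0

    def read_luminosity(sensor_id: str) -> float:
        """Stand-in for a cloud-mediated sensor read (e.g., via an IoT broker)."""
        return random.uniform(0.0, 100.0)

    def detect_nightfall(readings: list) -> bool:
        """Event detection: median of the park's sensors below the threshold."""
        ordered = sorted(readings)
        return ordered[len(ordered) // 2] < NIGHT_THRESHOLD_LUX

    def coordinated_reaction(lamp_ids: list, on: bool) -> dict:
        """Actuate every lamp in one pass, returning the new state map."""
        return {lamp: on for lamp in lamp_ids}

    readings = [read_luminosity(f"sensor-{i}") for i in range(5)]
    states = coordinated_reaction(["lamp-a", "lamp-b"], detect_nightfall(readings))
    ```

    In the paper's architecture, the sensor reads and lamp actuations would go through the cloud-managed device layer rather than local function calls.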

  2. Determining air quality and greenhouse gas impacts of hydrogen infrastructure and fuel cell vehicles.

    Science.gov (United States)

    Stephens-Romero, Shane; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald; Samuelsen, Scott

    2009-12-01

    Adoption of hydrogen infrastructure and hydrogen fuel cell vehicles (HFCVs) to replace gasoline internal combustion engine (ICE) vehicles has been proposed as a strategy to reduce criteria pollutant and greenhouse gas (GHG) emissions from the transportation sector and transition to fuel independence. However, it is uncertain (1) to what degree the reduction in criteria pollutants will impact urban air quality, and (2) how the reductions in pollutant emissions and concomitant urban air quality impacts compare to ultralow emission gasoline-powered vehicles projected for a future year (e.g., 2060). To address these questions, the present study introduces a "spatially and temporally resolved energy and environment tool" (STREET) to characterize the pollutant and GHG emissions associated with a comprehensive hydrogen supply infrastructure and HFCVs at a high level of geographic and temporal resolution. To demonstrate the utility of STREET, two spatially and temporally resolved scenarios for hydrogen infrastructure are evaluated in a prototypical urban airshed (the South Coast Air Basin of California) using geographic information systems (GIS) data. The well-to-wheels (WTW) GHG emissions are quantified and the air quality is established using a detailed atmospheric chemistry and transport model followed by a comparison to a future gasoline scenario comprised of advanced ICE vehicles. One hydrogen scenario includes more renewable primary energy sources for hydrogen generation and the other includes more fossil fuel sources. The two scenarios encompass a variety of hydrogen generation, distribution, and fueling strategies. 
GHG emissions reductions range from 61 to 68% for both hydrogen scenarios, in parallel with substantial improvements in urban air quality (e.g., reductions of 10 ppb in peak 8-h-averaged ozone and 6 μg/m³ in 24-h-averaged particulate matter concentrations, particularly in regions of the airshed where concentrations are highest for the gasoline scenario).

  3. Infrastructure Systems Interdependencies and Risk Informed Decision Making (RIDM): Impact Scenario Analysis of Infrastructure Risks Induced by Natural, Technological and Intentional Hazards

    Directory of Open Access Journals (Sweden)

    Rudolph Frederick Stapelberg

    2008-10-01

    Full Text Available This paper reviews current research into infrastructure systems interdependencies with regard to safety risks induced by natural, technological and intentional hazards. The paper further considers risk-informed decision-making.

  4. Integrating sea floor observatory data: the EMSO data infrastructure

    Science.gov (United States)

    Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph

    2013-04-01

    interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch or OAI-PMH, EMSO has chosen to implement core Open Geospatial Consortium (OGC) standards, such as the Catalogue Service for the Web (CS-W) and the Sensor Web Enablement (SWE) suite, including the Sensor Observation Service (SOS) and Observations and Measurements (O&M). Further, strong integration efforts are currently under way to harmonize data formats (e.g., NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.
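
    To make the O&M part concrete: an SOS returns observations encoded in O&M XML, which a client parses to recover the measured value and its unit. The document below is a heavily simplified, illustrative fragment, not a real EMSO response; a genuine SOS reply carries the full om: schema and far more metadata.

    ```python
    import xml.etree.ElementTree as ET

    # Minimal, illustrative O&M-style observation (simplified for the sketch;
    # the property name and value are made up).
    OM_NS = "http://www.opengis.net/om/2.0"
    doc = """<om:OM_Observation xmlns:om="http://www.opengis.net/om/2.0">
      <om:observedProperty xlink:href="sea_water_temperature"
           xmlns:xlink="http://www.w3.org/1999/xlink"/>
      <om:result uom="Cel">13.7</om:result>
    </om:OM_Observation>"""

    root = ET.fromstring(doc)
    result = root.find(f"{{{OM_NS}}}result")   # namespace-qualified lookup
    value, unit = float(result.text), result.get("uom")
    ```

    Harmonizing such payloads with NetCDF-based holdings is exactly the kind of format integration effort the abstract describes.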

  5. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Daniel, W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2013-09-01

    This report describes the development and analysis of detailed temporal and spatial scenarios for early market hydrogen fueling infrastructure clustering and fuel cell electric vehicle rollout using the Scenario Evaluation, Regionalization and Analysis (SERA) model. The report provides an overview of the SERA scenario development framework and discusses the approach used to develop the nationwide scenario.

  6. A systems engineering approach for realizing sustainability in infrastructure projects

    Directory of Open Access Journals (Sweden)

    Mohamed Matar

    2017-08-01

    The developed model addresses an identified gap in the current body of knowledge by considering infrastructure projects. Through its ability to simulate different scenarios, the model enables identifying which activities, products, and processes have the greater environmental impact, and hence which areas offer potential for optimization and improvement.

  7. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  8. Interoperability And Value Added To Earth Observation Data

    Science.gov (United States)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
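
    The WMS requests a client like mapshup issues follow the OGC key-value-pair convention and can be sketched with the standard library alone. The endpoint URL and layer name below are placeholders; the parameter names follow WMS 1.3.0.

    ```python
    from urllib.parse import urlencode

    # Build an OGC WMS 1.3.0 GetMap request URL. Endpoint and layer are
    # illustrative placeholders, not a real service.
    def wms_getmap_url(endpoint, layer, bbox, size=(800, 600)):
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "CRS": "EPSG:4326",
            "BBOX": ",".join(str(c) for c in bbox),  # minlat,minlon,maxlat,maxlon in CRS order
            "WIDTH": size[0],
            "HEIGHT": size[1],
            "FORMAT": "image/png",
        }
        return f"{endpoint}?{urlencode(params)}"

    url = wms_getmap_url("https://example.org/wms", "flood_extent",
                         (42.0, 20.0, 48.0, 30.0))
    ```

    A WFS GetFeature or CSW GetRecords request differs only in the parameter set; the same URL-building pattern applies.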

  9. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    Science.gov (United States)

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as well as those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  10. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    Connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However at the same time it introduces new risks towards security and privacy of personal health

  11. Potential for sharing nuclear power infrastructure between countries

    International Nuclear Information System (INIS)

    2006-10-01

    The introduction or expansion of a nuclear power programme in a country, and its successful execution, is largely dependent on the network of national infrastructure, covering a wide range of activities and capabilities. The infrastructure areas include the legal framework, safety and environmental regulatory bodies, international agreements, physical facilities, finance, education, training, human resources and public information and acceptance. The wide extent of infrastructure needs requires an investment that can be too large or onerous for the national economy. The burden of infrastructure can be reduced significantly if a country forms a sharing partnership with other countries. The sharing can be at the regional or multinational level. It can include physical facilities, common programmes and knowledge, which will translate into economic benefits. The sharing can also contribute in a significant manner to the harmonization of codes and standards in general and of the regulatory framework in particular. The opportunities and potential for sharing nuclear power infrastructure are determined by the objectives, strategy and scenario of the national nuclear power programme. A review of individual infrastructure items shows that there are several opportunities for sharing of nuclear power infrastructure between countries if they cooperate with each other. International cooperation and sharing of nuclear power infrastructure are not new. This publication provides criteria and guidance for analyzing and identifying the potential for sharing of nuclear power infrastructure during the stages of the nuclear power project life cycle. The target users are decision makers, advisers and senior managers in utilities, industrial organizations, regulatory bodies and governmental organizations in countries adopting or extending nuclear power programmes. This publication was produced within the IAEA programme directed to increase the capability of Member States to plan and implement nuclear power

  12. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources increase, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, as much in their creation as in their management. The reuse of similar organizational structures is a necessary element in this context. The article analyses experiences of KOS reuse and indicates how new standards bear on this aspect.

  13. STOCHASTIC COLOURED PETRINET BASED HEALTHCARE INFRASTRUCTURE INTERDEPENDENCY MODEL

    Directory of Open Access Journals (Sweden)

    N. Nukavarapu

    2016-06-01

    Full Text Available The Healthcare Critical Infrastructure (HCI) protects all sectors of the society from hazards such as terrorism, infectious disease outbreaks, and natural disasters. HCI plays a significant role in response and recovery across all other sectors in the event of a natural or manmade disaster. However, for its continuity of operations and service delivery HCI is dependent on other interdependent Critical Infrastructures (CI) such as Communications, Electric Supply, Emergency Services, Transportation Systems, and Water Supply System. During a mass casualty event caused by disasters such as floods, a major challenge that arises for the HCI is to respond to the crisis in a timely manner in an uncertain and variable environment. To address this issue the HCI should be disaster prepared, by fully understanding the complexities and interdependencies that exist in a hospital, emergency department or emergency response event. Modelling and simulation of a disaster scenario with these complexities would help in training and provide an opportunity for all the stakeholders to work together in a coordinated response to a disaster. The paper presents interdependencies related to HCI based on a Stochastic Coloured Petri Nets (SCPN) modelling and simulation approach, given a flood scenario as the disaster which disrupts the infrastructure nodes. The entire model is integrated with a Geographic information based decision support system to visualize the dynamic behaviour of the interdependency of the Healthcare and related CI network in a geographically based environment.

  14. Stochastic Coloured Petrinet Based Healthcare Infrastructure Interdependency Model

    Science.gov (United States)

    Nukavarapu, Nivedita; Durbha, Surya

    2016-06-01

    The Healthcare Critical Infrastructure (HCI) protects all sectors of the society from hazards such as terrorism, infectious disease outbreaks, and natural disasters. HCI plays a significant role in response and recovery across all other sectors in the event of a natural or manmade disaster. However, for its continuity of operations and service delivery HCI is dependent on other interdependent Critical Infrastructures (CI) such as Communications, Electric Supply, Emergency Services, Transportation Systems, and Water Supply System. During a mass casualty event caused by disasters such as floods, a major challenge that arises for the HCI is to respond to the crisis in a timely manner in an uncertain and variable environment. To address this issue the HCI should be disaster prepared, by fully understanding the complexities and interdependencies that exist in a hospital, emergency department or emergency response event. Modelling and simulation of a disaster scenario with these complexities would help in training and provide an opportunity for all the stakeholders to work together in a coordinated response to a disaster. The paper presents interdependencies related to HCI based on a Stochastic Coloured Petri Nets (SCPN) modelling and simulation approach, given a flood scenario as the disaster which disrupts the infrastructure nodes. The entire model is integrated with a Geographic information based decision support system to visualize the dynamic behaviour of the interdependency of the Healthcare and related CI network in a geographically based environment.
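
    The stochastic-interdependency idea behind an SCPN model can be illustrated with a tiny simulation: once the flood starts, each supplier infrastructure fails after an exponentially distributed delay, and the hospital loses full service when its first supplier fails. Node names and failure rates below are illustrative placeholders, not parameters from the paper's model.

    ```python
    import random

    random.seed(42)  # reproducible illustration

    # Hypothetical failure rates (per hour) once the flood disrupts nodes,
    # and the hospital's dependency set.
    FAILURE_RATE = {"power": 1 / 4.0, "water": 1 / 6.0}
    DEPENDS_ON = {"hospital": ["power", "water"]}

    def sample_failure_times():
        """Exponential delays play the role of SCPN stochastic transitions."""
        return {node: random.expovariate(rate) for node, rate in FAILURE_RATE.items()}

    def service_lost_at(failures):
        """Full hospital service ends when the first supplier node fails."""
        return min(failures[dep] for dep in DEPENDS_ON["hospital"])

    failures = sample_failure_times()
    t_loss = service_lost_at(failures)
    ```

    A real SCPN adds coloured tokens (e.g., distinguishing patient classes or resource types) and repair transitions; the GIS integration then maps token states onto geographic nodes.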

  15. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, M.

    2015-03-23

    This presentation provides an overview of the Scenario Evaluation and Regionalization Analysis (SERA) model, describes the methodology for developing scenarios for hydrogen infrastructure development, outlines an example "Hydrogen Success" scenario, and discusses detailed scenario metrics for a particular case study region, the Northeast Corridor.

  16. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  17. An Emergent Micro-Services Approach to Digital Curation Infrastructure

    Directory of Open Access Journals (Sweden)

    Stephen Abrams

    2010-07-01

    Full Text Available In order to better meet the needs of its diverse University of California (UC) constituencies, the California Digital Library UC Curation Center is re-envisioning its approach to digital curation infrastructure by devolving function into a set of granular, independent, but interoperable micro-services. Since each of these services is small and self-contained, they are more easily developed, deployed, maintained, and enhanced; at the same time, complex curation function can emerge from the strategic combination of atomistic services. The emergent approach emphasizes the persistence of content rather than of the systems in which that content is managed, so the paradigmatic archival culture is not unduly coupled to any particular technological context. This results in a curation environment that is comprehensive in scope, yet flexible with regard to local policies and practices, and sustainable despite the inevitability of disruptive change in technology and user expectation.

  18. Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.

    Science.gov (United States)

    Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael

    2018-01-01

    The availability of semantically enriched and interoperable clinical information models is crucial for reusing once-collected data across institutions, as envisaged in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. The objective was to design a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. To this end, we analysed successful practices from international projects, published ideas on archetype governance and our own modelling experiences, and modelled BPMN processes. We designed a framework comprising archetype variations, roles and responsibilities, IT support and modelling workflows. Our framework has great potential to make the openEHR modelling efforts manageable. Because practical experience is rare, our work will prospectively be well suited to evaluating the benefits of such structured governance approaches.

  19. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely-used specifications introduced to implement Linked Data, the Semantic Web and specific community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs. 
CKAN: a data management system for data distribution, particularly used by
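
    Of the brokered community specifications, OpenSearch is the simplest to illustrate: discovery works through a URL template whose placeholders a client fills in. The sketch below expands such a template with the standard library only; the endpoint is a placeholder, while the placeholder names come from the OpenSearch 1.1 specification.

    ```python
    # Expand an OpenSearch URL template ({name} = required, {name?} = optional).
    # The endpoint is illustrative; the parameter names are OpenSearch 1.1.
    TEMPLATE = ("https://example.org/opensearch?q={searchTerms}"
                "&start={startIndex?}&count={count?}")

    def expand(template, **values):
        url = template
        for key, val in values.items():
            url = url.replace("{" + key + "}", str(val))
            url = url.replace("{" + key + "?}", str(val))
        return url

    url = expand(TEMPLATE, searchTerms="sea+surface+temperature",
                 startIndex=1, count=10)
    ```

    A broker like GI-cat mediates by translating such a query into the protocol of each brokered provider (CSW, THREDDS, OPeNDAP, etc.) and merging the results.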

  20. An interoperable security framework for connected healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, Changjie

    2011-01-01

    Connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However at the same time it introduces new risks towards security and privacy of personal health

  1. California Plug-In Electric Vehicle Infrastructure Projections: 2017-2025 - Future Infrastructure Needs for Reaching the State's Zero Emission-Vehicle Deployment Goals

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rames, Clement L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bedir, Abdulkadir [California Energy Commission; Crisostomo, Noel [California Energy Commission; Allen, Jennifer [California Energy Commission

    2018-03-27

    This report analyzes plug-in electric vehicle (PEV) infrastructure needs in California from 2017 to 2025 in a scenario where the State's zero-emission vehicle (ZEV) deployment goals are achieved by household vehicles. The statewide infrastructure needs are evaluated by using the Electric Vehicle Infrastructure Projection tool, which incorporates representative statewide travel data from the 2012 California Household Travel Survey. The infrastructure solution presented in this assessment addresses two primary objectives: (1) enabling travel for battery electric vehicles and (2) maximizing the electric vehicle-miles traveled for plug-in hybrid electric vehicles. The analysis is performed at the county level for each year between 2017 and 2025 while considering potential technology improvements. The results from this study present an infrastructure solution that can facilitate market growth for PEVs to reach the State's ZEV goals by 2025. The overall results show a need for 99k-130k destination chargers, including workplaces and public locations, and 9k-25k fast chargers. The results also show a need for dedicated or shared residential charging solutions at multi-family dwellings, which are expected to host about 120k PEVs by 2025. Going beyond the existing literature, this analysis demonstrates the significance of infrastructure reliability and accessibility for the quantification of charger demand.
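
    The kind of quantity reported above can be sketched as a back-of-the-envelope calculation: chargers needed for a PEV fleet under assumed chargers-per-1,000-PEV ratios. The fleet size and ratios below are illustrative placeholders, not the projection tool's actual coefficients.

    ```python
    # Illustrative charger-demand calculation. Ratios are hypothetical
    # chargers per 1,000 PEVs, by charger type; the fleet size is made up.
    def chargers_needed(pev_fleet, per_1000):
        return {kind: round(pev_fleet * ratio / 1000)
                for kind, ratio in per_1000.items()}

    demand = chargers_needed(1_500_000,
                             {"workplace": 60, "public_l2": 25, "dcfc": 10})
    ```

    The real analysis derives such ratios from simulated travel itineraries per county and year, which is why reliability and accessibility assumptions shift the totals.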

  2. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of seismic risk. The earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century (95 years) later, an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  3. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  4. ActivitySim: large-scale agent based activity generation for infrastructure simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gali, Emmanuel [Los Alamos National Laboratory; Eidenbenz, Stephan [Los Alamos National Laboratory; Mniszewski, Sue [Los Alamos National Laboratory; Cuellar, Leticia [Los Alamos National Laboratory; Teuscher, Christof [PORTLAND STATE UNIV

    2008-01-01

    The United States Department of Homeland Security aims to model, simulate, and analyze critical infrastructure and their interdependencies across multiple sectors such as electric power, telecommunications, water distribution, transportation, etc. We introduce ActivitySim, an activity simulator for a population of millions of individual agents, each characterized by a set of demographic attributes based on US census data. ActivitySim generates daily schedules for each agent consisting of a sequence of activities, such as sleeping, shopping, working etc., each scheduled at a geographic location, such as a business or private residence, that is appropriate for the activity type and for the personal situation of the agent. ActivitySim has been developed as part of a larger effort to understand the interdependencies among national infrastructure networks and the demand profiles that emerge from the different activities of individuals in baseline scenarios as well as emergency scenarios, such as hurricane evacuations. We present the scalable software engineering principles underlying ActivitySim, the socio-technical modeling paradigms that drive the activity generation, and proof-of-principle results for a scenario in the Twin Cities, MN area with 2.6 M agents.
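
    The core idea of activity generation can be sketched in a few lines: map an agent's demographic attributes to a daily sequence of (activity, start hour, duration, location type) tuples. The roles, templates, and jitter below are illustrative, not ActivitySim's actual demographic model.

    ```python
    import random

    random.seed(7)  # reproducible illustration

    # Hypothetical role-based templates: (activity, start hour, duration, location).
    ACTIVITY_TEMPLATES = {
        "worker":  [("sleep", 0, 7, "home"), ("work", 9, 8, "workplace"),
                    ("shop", 18, 1, "store"), ("sleep", 22, 2, "home")],
        "student": [("sleep", 0, 8, "home"), ("school", 8, 6, "school"),
                    ("leisure", 16, 3, "park"), ("sleep", 21, 3, "home")],
    }

    def daily_schedule(agent):
        """Pick the template matching the agent's role and jitter start times."""
        base = ACTIVITY_TEMPLATES[agent["role"]]
        return [(act, start + random.choice([0, 1]), dur, loc)
                for act, start, dur, loc in base]

    schedule = daily_schedule({"role": "worker", "age": 34})
    ```

    Aggregating millions of such schedules over real business and residence locations is what yields the infrastructure demand profiles the abstract describes.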

  5. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    Directory of Open Access Journals (Sweden)

    Martin Schulz

    2008-01-01

    Full Text Available Over the last decades a large number of performance tools has been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces including an easy-to-use GUI as well as an interactive command line interface reducing the usage threshold for those tools.

  6. Next generation terminology infrastructure to support interprofessional care planning.

    Science.gov (United States)

    Collins, Sarah; Klinkenberg-Ramirez, Stephanie; Tsivkin, Kira; Mar, Perry L; Iskhakova, Dina; Nandigam, Hari; Samal, Lipika; Rocha, Roberto A

    2017-11-01

    Develop a prototype of an interprofessional terminology and information model infrastructure that can enable care planning applications to facilitate patient-centered care, learn care plan linkages and associations, provide decision support, and enable automated, prospective analytics. The study followed a 3-step approach: (1) process model and clinical scenario development; (2) requirements analysis; and (3) development and validation of information and terminology models. Components of the terminology model include: Health Concerns, Goals, Decisions, Interventions, Assessments, and Evaluations. A terminology infrastructure should: (A) Include discrete care plan concepts; (B) Include sets of profession-specific concerns, decisions, and interventions; (C) Communicate rationales, anticipatory guidance, and guidelines that inform decisions among the care team; (D) Define semantic linkages across clinical events and professions; (E) Define sets of shared patient goals and sub-goals, including patient-stated goals; (F) Capture evaluation toward achievement of goals. These requirements were mapped to the AHRQ Care Coordination Measures Framework. This study used a constrained set of clinician-validated clinical scenarios. Terminology models for goals and decisions are unavailable in SNOMED CT, limiting the ability to evaluate these aspects of the proposed infrastructure. Defining and linking subsets of care planning concepts appears to be feasible, but also essential to model interprofessional care planning for common co-occurring conditions and chronic diseases. We recommend the creation of goal dynamics and decision concepts in SNOMED CT to further enable the necessary models. Systems with flexible terminology management infrastructure may enable intelligent decision support to identify conflicting and aligned concerns, goals, decisions, and interventions in shared care plans, ultimately decreasing documentation effort and cognitive burden for clinicians and …
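    The component model this record names (Health Concerns, Goals, Decisions, Interventions, Assessments, Evaluations) with semantic linkages across them can be sketched as a small typed graph. The concept kinds follow the record; the codes, labels, and relation names below are invented for illustration and are not drawn from the paper or from SNOMED CT.

    ```python
    from dataclasses import dataclass, field

    # Kinds follow the record's terminology model; codes and labels are illustrative.
    KINDS = {"HealthConcern", "Goal", "Decision", "Intervention", "Assessment", "Evaluation"}

    @dataclass(frozen=True)
    class Concept:
        kind: str   # one of KINDS
        code: str   # e.g., a terminology identifier (hypothetical here)
        label: str

    @dataclass
    class CarePlanModel:
        links: list = field(default_factory=list)  # (source, relation, target) triples

        def link(self, source, relation, target):
            self.links.append((source, relation, target))

        def related(self, concept):
            """All concepts linked from `concept`, across professions and events."""
            return [(rel, tgt) for src, rel, tgt in self.links if src == concept]

    concern = Concept("HealthConcern", "38341003", "Hypertension")
    goal = Concept("Goal", "G-001", "Blood pressure below 140/90")
    intervention = Concept("Intervention", "I-001", "Medication review")

    plan = CarePlanModel()
    plan.link(concern, "addressed_by", goal)
    plan.link(goal, "achieved_via", intervention)

    for rel, tgt in plan.related(goal):
        print(rel, tgt.label)
    ```

    A real infrastructure would bind each concept to a maintained terminology rather than ad hoc codes; the point of the sketch is only the explicit semantic linkage between profession-specific concepts.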

  7. MPEG-4 IPMP Extension for Interoperable Protection of Multimedia Content

    Directory of Open Access Journals (Sweden)

    Zeng Wenjun

    2004-01-01

    Full Text Available. To ensure secure content delivery, the Motion Picture Experts Group (MPEG) has dedicated significant effort to digital rights management (DRM) issues. MPEG is now moving from defining only hooks to proprietary systems (e.g., in MPEG-2, MPEG-4 Version 1) to specifying a more encompassing standard in intellectual property management and protection (IPMP). MPEG feels that this is necessary in order to achieve MPEG's most important goal: interoperability. The design of the IPMP Extension framework also considers the complexity of the MPEG-4 standard and the diversity of its applications. This architecture leaves the details of the design of IPMP tools in the hands of application developers, while ensuring maximum flexibility and security. This paper first briefly describes the background of the development of the MPEG-4 IPMP Extension. It then presents an overview of the MPEG-4 IPMP Extension, including its architecture, the flexible protection signaling, and the secure messaging framework for communication between the terminal and the tools. Two sample usage scenarios are also provided to illustrate how an MPEG-4 IPMP Extension compliant system works.

  8. Inter-operator Variability in Defining Uterine Position Using Three-dimensional Ultrasound Imaging

    DEFF Research Database (Denmark)

    Baker, Mariwan; Jensen, Jørgen Arendt; Behrens, Claus F.

    2013-01-01

    In radiotherapy the treatment outcome of gynecological (GYN) cancer patients is crucially related to reproducibility of the actual uterine position. The purpose of this study is to evaluate the inter-operator variability in addressing uterine position using a novel 3-D ultrasound (US) system. The study is initiated by US-scanning of a uterine phantom (CIRS 404, Universal Medical, Norwood, USA) by seven experienced US operators. The phantom represents a female pelvic region, containing a uterus, bladder and rectal landmarks readily definable in the acquired US-scans. The organs are subjected … significantly larger inter-fractional uterine positional displacement, in some cases up to 20 mm, which outweighs the magnitude of current inter-operator variations. Thus, the current US-phantom-study suggests that the inter-operator variability in addressing uterine position is clinically irrelevant.

  9. Biodiversity scenarios neglect future land-use changes.

    Science.gov (United States)

    Titeux, Nicolas; Henle, Klaus; Mihoub, Jean-Baptiste; Regos, Adrián; Geijzendorffer, Ilse R; Cramer, Wolfgang; Verburg, Peter H; Brotons, Lluís

    2016-07-01

    Efficient management of biodiversity requires a forward-looking approach based on scenarios that explore biodiversity changes under future environmental conditions. A number of ecological models have been proposed over the last decades to develop these biodiversity scenarios. Novel modelling approaches with strong theoretical foundation now offer the possibility to integrate key ecological and evolutionary processes that shape species distribution and community structure. Although biodiversity is affected by multiple threats, most studies addressing the effects of future environmental changes on biodiversity focus on a single threat only. We examined the studies published during the last 25 years that developed scenarios to predict future biodiversity changes based on climate, land-use and land-cover change projections. We found that biodiversity scenarios mostly focus on the future impacts of climate change and largely neglect changes in land use and land cover. The emphasis on climate change impacts has increased over time and has now reached a maximum. Yet, the direct destruction and degradation of habitats through land-use and land-cover changes are among the most significant and immediate threats to biodiversity. We argue that the current state of integration between ecological and land system sciences is leading to biased estimation of actual risks and therefore constrains the implementation of forward-looking policy responses to biodiversity decline. We suggest research directions at the crossroads between ecological and environmental sciences to face the challenge of developing interoperable and plausible projections of future environmental changes and to anticipate the full range of their potential impacts on biodiversity. An intergovernmental platform is needed to stimulate such collaborative research efforts and to emphasize the societal and political relevance of taking up this challenge. © 2016 John Wiley & Sons Ltd.

  10. Towards Patient-Centric Telehealth: a Journey into ICT Infrastructures and User Modeling

    DEFF Research Database (Denmark)

    Jørgensen, Daniel Bjerring

    Problem setting: The problem setting for this thesis is the telehealth domain. Telehealth is addressed from two perspectives: ICT infrastructures and personalized telehealth. ICT infrastructures are addressed both on the local level, concerning the systems that are deployed in patients’ homes, and on the national level, concerning the transmission of data in end-to-end infrastructural scenarios. Personalized telehealth concerns the design of telehealth systems that are able to fit the everyday life of their patients. Problem and research questions: The problem setting was formalized in a principal research … events in the CASAS datasets. New ideas for research directions have been spawned by the ICT infrastructure: tools to strengthen and support telehealth patients’ motivation, and identification of patterns, if such exist, indicating correlations between changes to a telehealth patient’s behavioral …

  11. A Smart City Lighting Case Study on an OpenStack-Powered Infrastructure

    Directory of Open Access Journals (Sweden)

    Giovanni Merlino

    2015-07-01

    Full Text Available. The adoption of embedded systems, mobile devices and other smart devices keeps rising globally, and the scope of their involvement broadens, for instance, in smart city-like scenarios. In light of this, a pressing need emerges to tame such complexity and reuse as much tooling as possible without resorting to vertical ad hoc solutions, while at the same time taking into account valid options with regard to infrastructure management and other more advanced functionalities. Existing solutions mainly focus on core mechanisms and do not allow one to scale by leveraging infrastructure or adapt to a variety of scenarios, especially if actuators are involved in the loop. A new, more flexible, cloud-based approach, able to provide device-focused workflows, is required. In this sense, a widely-used and competitive framework for infrastructure as a service, such as OpenStack, with its breadth in terms of feature coverage and expanded scope, looks to fit the bill, replacing current application-specific approaches with an innovative application-agnostic one. This work thus describes the rationale, efforts and results so far achieved for an integration of IoT paradigms and resource ecosystems with such a kind of cloud-oriented device-centric environment, by focusing on a smart city scenario, namely a park smart lighting example, and featuring data collection, data visualization, event detection and coordinated reaction, as example use cases of such integration.

  12. Collection and Processing of Data from Wrist Wearable Devices in Heterogeneous and Multiple-User Scenarios

    Directory of Open Access Journals (Sweden)

    Francisco de Arriba-Pérez

    2016-09-01

    Full Text Available. Over recent years, we have witnessed the development of mobile and wearable technologies to collect data from human vital signs and activities. Nowadays, wrist wearables including sensors (e.g., heart rate, accelerometer, pedometer) that provide valuable data are common in the market. We are working on the analytic exploitation of this kind of data towards the support of learners and teachers in educational contexts. More precisely, sleep and stress indicators are defined to assist teachers and learners in the regulation of their activities. During this development, we have identified interoperability challenges related to the collection and processing of data from wearable devices. Different vendors adopt specific approaches about the way data can be collected from wearables into third-party systems. This hinders such developments as the one that we are carrying out. This paper contributes to identifying key interoperability issues in this kind of scenario and proposes guidelines to solve them. Taking into account these topics, this work is situated in the context of the standardization activities being carried out in the Internet of Things and Machine to Machine domains.

  13. Collection and Processing of Data from Wrist Wearable Devices in Heterogeneous and Multiple-User Scenarios.

    Science.gov (United States)

    de Arriba-Pérez, Francisco; Caeiro-Rodríguez, Manuel; Santos-Gago, Juan M

    2016-09-21

    Over recent years, we have witnessed the development of mobile and wearable technologies to collect data from human vital signs and activities. Nowadays, wrist wearables including sensors (e.g., heart rate, accelerometer, pedometer) that provide valuable data are common in the market. We are working on the analytic exploitation of this kind of data towards the support of learners and teachers in educational contexts. More precisely, sleep and stress indicators are defined to assist teachers and learners in the regulation of their activities. During this development, we have identified interoperability challenges related to the collection and processing of data from wearable devices. Different vendors adopt specific approaches about the way data can be collected from wearables into third-party systems. This hinders such developments as the one that we are carrying out. This paper contributes to identifying key interoperability issues in this kind of scenario and proposes guidelines to solve them. Taking into account these topics, this work is situated in the context of the standardization activities being carried out in the Internet of Things and Machine to Machine domains.
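    The vendor-specific collection formats this record identifies are commonly tackled with a thin normalization layer that maps each vendor's payload into one shared schema. The vendor payloads, field names, and target schema below are invented for illustration; they do not come from the paper or from any real wearable API.

    ```python
    from datetime import datetime, timezone

    # Hypothetical vendor payloads: each vendor exposes heart rate differently.
    VENDOR_A = {"hr_bpm": 72, "ts": 1474444800}                       # epoch seconds
    VENDOR_B = {"heartRate": {"value": 72}, "time": "2016-09-21T08:00:00Z"}

    def normalize(vendor, payload):
        """Map a vendor-specific sample to a shared (timestamp, bpm) record."""
        if vendor == "A":
            ts = datetime.fromtimestamp(payload["ts"], tz=timezone.utc)
            bpm = payload["hr_bpm"]
        elif vendor == "B":
            ts = datetime.strptime(payload["time"], "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
            bpm = payload["heartRate"]["value"]
        else:
            raise ValueError(f"unknown vendor {vendor!r}")
        return {"timestamp": ts.isoformat(), "heart_rate_bpm": bpm}

    a = normalize("A", VENDOR_A)
    b = normalize("B", VENDOR_B)
    # Both vendors' readings now share one schema and one timestamp convention.
    print(a["heart_rate_bpm"] == b["heart_rate_bpm"])
    ```

    Standardization efforts in the IoT/M2M domain aim to make such per-vendor adapters unnecessary; until then, normalizing at ingestion keeps downstream analytics (the sleep and stress indicators here) vendor-independent.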

  14. Adaptation of interoperability standards for cross domain usage

    Science.gov (United States)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural, and historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to extract relevant, readable information applicable to a specific problem from a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions such as specific standards have been worked out for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is to make civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  15. Towards multi-layer interoperability of heterogeneous IoT platforms : the INTER-IoT approach

    NARCIS (Netherlands)

    Fortino, Giancarlo; Savaglio, Claudio; Palau, Carlos E.; de Puga, Jara Suarez; Ghanza, Maria; Paprzycki, Marcin; Montesinos, Miguel; Liotta, Antonio; Llop, Miguel; Gravina, R.; Palau, C.E.; Manso, M.; Liotta, A.; Fortino, G.

    2018-01-01

    Open interoperability delivers on the promise of enabling vendors and developers to interact and interoperate, without interfering with anyone’s ability to compete by delivering a superior product and experience. In the absence of global IoT standards, the INTER-IoT voluntary approach will support …

  16. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among other space agencies. This particular prototype uses the German Space Agency (DLR) to test the ideas for interagency coordination.

  17. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION..., industry representatives, and service providers. [75 FR 28207, May 20, 2010] ...

  18. Permafrost Hazards and Linear Infrastructure

    Science.gov (United States)

    Stanilovskaya, Julia; Sergeev, Dmitry

    2014-05-01

    … climate change. Extra maintenance activity is needed for existing infrastructure to stay operable. Engineers should run climate models under the most pessimistic scenarios when planning new infrastructure projects. That would allow reducing the potential shortcomings related to permafrost thawing.

  19. Latest developments for the IAGOS database: Interoperability and metadata

    Science.gov (United States)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface) which combines model outputs with in situ data for …
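    The NetCDF-CF compliance the record mentions boils down, at the variable level, to requiring the attributes that interoperable portals rely on (`units`, plus `standard_name` or `long_name`). A minimal sketch of such a check follows; the variable names and attribute values are illustrative, not actual IAGOS metadata, and a real check would read them from a NetCDF file rather than from dicts.

    ```python
    # Variables represented as plain dicts of attributes, stdlib only.
    variables = {
        "ozone": {"units": "ppb", "standard_name": "mole_fraction_of_ozone_in_air"},
        "co": {"units": "ppb"},              # missing a name attribute
        "time": {"standard_name": "time"},   # missing units
    }

    def cf_issues(variables):
        """Report variables missing the CF attributes that portals rely on."""
        issues = {}
        for name, attrs in variables.items():
            missing = []
            if "units" not in attrs:
                missing.append("units")
            if "standard_name" not in attrs and "long_name" not in attrs:
                missing.append("standard_name or long_name")
            if missing:
                issues[name] = missing
        return issues

    for var, missing in sorted(cf_issues(variables).items()):
        print(f"{var}: missing {', '.join(missing)}")
    ```

    Running this kind of validation at ingestion is what makes downstream interoperation with portals and with tools like JOIN possible without per-dataset special cases.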

  20. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i. e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Objectives: to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i. e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE …

  1. Data Storage and Management for Global Research Data Infrastructures - Status and Perspectives

    Directory of Open Access Journals (Sweden)

    Erwin Laure

    2013-07-01

    Full Text Available. In the vision of Global Research Data Infrastructures (GRDIs), data storage and management plays a crucial role. A successful GRDI will require a common globally interoperable distributed data system, formed out of data centres, that incorporates emerging technologies and new scientific data activities. The main challenge is to define common certification and auditing frameworks that will allow storage providers and data communities to build a viable partnership based on trust. To achieve this, it is necessary to find a long-term commitment model that will give financial, legal, and organisational guarantees of digital information preservation. In this article we discuss the state of the art in data storage and management for GRDIs and point out future research directions that need to be tackled to implement GRDIs.

  2. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between clinical care and research domains, in this paper a unified methodology and the supporting framework are introduced which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles where each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and their components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and with implementation-dependent content models; hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
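    The federated-registry idea, in which each Common Data Element is a uniquely referenced resource carrying links to other CDEs and to terminology systems, can be sketched with plain (subject, predicate, object) triples. The URIs, codes, and predicate names below are made up for illustration; a real implementation would use an RDF store and the ISO/IEC 11179 metamodel.

    ```python
    # Triples in the spirit of Linked Open Data: two registries describe the
    # "same" data element and bind it to different terminology systems.
    triples = [
        ("http://mdr-a.example/cde/bp", "sameAs", "http://mdr-b.example/cde/blood-pressure"),
        ("http://mdr-a.example/cde/bp", "codedBy", "http://snomed.example/75367002"),
        ("http://mdr-b.example/cde/blood-pressure", "codedBy", "http://loinc.example/85354-9"),
    ]

    def terminology_bindings(cde, triples):
        """Follow sameAs links across registries, collecting terminology codes."""
        seen, frontier, codes = set(), {cde}, set()
        while frontier:
            node = frontier.pop()
            seen.add(node)
            for s, p, o in triples:
                if s == node and p == "sameAs" and o not in seen:
                    frontier.add(o)
                if s == node and p == "codedBy":
                    codes.add(o)
        return codes

    print(sorted(terminology_bindings("http://mdr-a.example/cde/bp", triples)))
    ```

    The traversal is the whole point of the federation: starting from one registry's CDE, semantic links surface bindings maintained elsewhere, without a central repository.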

  3. Extensions to Traditional Spatial Data Infrastructures: Integration of Social Media, Synchronization of Datasets, and Data on the Go in GeoPackages

    Science.gov (United States)

    Simonis, Ingo

    2015-04-01

    Traditional Spatial Data Infrastructures focus on aspects such as description and discovery of geospatial data, integration of these data into processing workflows, and representation of fusion or other data analysis results. Although many interoperability agreements still need to be worked out to achieve a satisfying level of interoperability within large-scale initiatives such as INSPIRE, new technologies, use cases and requirements are constantly emerging from the user community. This paper focuses on three aspects that came up recently: the integration of social media data into SDIs, synchronization aspects between datasets used by field workers in shared resources environments, and the generation and maintenance of data for mixed-mode online/offline situations that can be easily packed, delivered, modified, and synchronized with reference data sets. The work described in this paper results from the latest testbed executed by the Open Geospatial Consortium, OGC. The testbed is part of the interoperability program (IP), which constitutes a significant part of the OGC standards development process. The IP has a number of instruments to enhance geospatial standards and technologies, such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Expert Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The latest global activity, testbed-11, aims at exploring new technologies and architectural approaches to enrich and extend traditional spatial data infrastructures with data from social media, improved data synchronization, and the capability to take data to the field in new synchronized data containers called GeoPackages. Social media sources are a valuable supplement to providing up-to-date information in distributed environments. Following an uncoordinated crowdsourcing approach, social media data can be both …

  4. The development of a prototype level-three interoperable catalog system

    Science.gov (United States)

    Hood, Carroll A.; Howie, Randy; Verhanovitz, Rich

    1993-08-01

    The development of a level-three interoperable catalog system is defined by a new paradigm for metadata access. The old paradigm is characterized by a hierarchy of metadata layers, the transfer of control to target systems, and the requirement for the user to be familiar with the syntax and data dictionaries of several catalog system elements. Attributes of the new paradigm are exactly orthogonal: the directory and inventories are peer entities, there is a single user interface, and the system manages the complexity of interacting transparently with remote elements. We have designed and implemented a prototype level-three interoperable catalog system based on the new paradigm. Through a single intelligent interface, users can interoperably access a master directory, inventories for selected satellite datasets, and an in situ meteorological dataset inventory. This paper describes the development of the prototype system and three of the formidable challenges that were addressed in the process. The first involved the interoperable integration of satellite and in situ inventories, which, to our knowledge, has never been operationally demonstrated. The second was the development of a search strategy for orbital and suborbital granules which preserves the capability to identify temporally or spatially coincident subsets between them. The third involved establishing a method of incorporating inventory-specific search criteria into user queries. We are working closely with selected science data users to obtain feedback on the system's design and performance. The lessons learned from this prototype will help direct future development efforts. Distributed data systems of the 1990s such as EOSDIS and the Global Change Data and Information System (GCDIS) will be able to build on this prototype.
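    The single-interface, peer-entity design this record describes amounts to fanning one user query out to heterogeneous inventories behind one call, with the system, not the user, handling each remote element. The inventory adapters, granule identifiers, and query parameters below are hypothetical, purely to illustrate the pattern.

    ```python
    # Each adapter hides a remote inventory's own syntax behind one call signature.
    def satellite_inventory(bbox, start, end):
        return [{"granule": "SAT-001", "source": "satellite"}]

    def in_situ_inventory(bbox, start, end):
        return [{"granule": "MET-042", "source": "in-situ"}]

    ADAPTERS = [satellite_inventory, in_situ_inventory]

    def search(bbox, start, end):
        """Level-three style access: one query, transparent fan-out, merged results."""
        results = []
        for adapter in ADAPTERS:
            results.extend(adapter(bbox, start, end))
        return results

    for hit in search(bbox=(-10, 40, 10, 60), start="1993-01-01", end="1993-01-31"):
        print(hit["source"], hit["granule"])
    ```

    Merging satellite and in situ hits into one result set is also the precondition for the coincident-subset search the record mentions: coincidence can only be computed once both granule types share one query path and one result schema.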

  5. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  6. Meeting People’s Needs in a Fully Interoperable Domotic Environment

    Directory of Open Access Journals (Sweden)

    Vittorio Miori

    2012-05-01

    Full Text Available. The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users’ quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today’s incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.

  7. Radio Interoperability: There Is More to It Than Hardware

    National Research Council Canada - National Science Library

    Hutchins, Susan G; Timmons, Ronald P

    2007-01-01

    Radio Interoperability: The Problem. Superfluous radio transmissions contribute to auditory overload of first responders, obscure development of an accurate operational picture for all involved, and radio spectrum is a limited commodity once …

  8. Collaborative Development of e-Infrastructures and Data Management Practices for Global Change Research

    Science.gov (United States)

    Samors, R. J.; Allison, M. L.

    2016-12-01

    An e-infrastructure that supports data-intensive, multidisciplinary research is being organized under the auspices of the Belmont Forum consortium of national science funding agencies to accelerate the pace of science to address 21st century global change research challenges. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. The five action themes adopted by the Belmont Forum are: 1. Adopt and make enforceable Data Principles that establish a global, interoperable e-infrastructure. 2. Foster communication, collaboration and coordination between the wider research community and the Belmont Forum and its projects through an e-Infrastructure Coordination, Communication, & Collaboration Office. 3. Promote effective data planning and stewardship in all Belmont Forum agency-funded research with a goal to make it enforceable. 4. Determine international and community best practice to inform Belmont Forum research e-infrastructure policy through identification and analysis of cross-disciplinary research case studies. 5. Support the development of a cross-disciplinary training curriculum to expand human capacity in technology and data-intensive analysis methods. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. In 2016, members of the 23-nation Belmont Forum began a collaborative implementation phase. Four multi-national teams are undertaking Action Themes based on the recommendations above. Tasks include mapping the landscape, identifying and documenting existing data management plans, and scheduling a series of workshops that analyse trans-disciplinary applications of existing Belmont Forum …

  9. Collaborative ocean resource interoperability - multi-use of ocean data on the semantic web

    OpenAIRE

    Tao, Feng; Campbell, Jon; Pagnani, Maureen; Griffiths, Gwyn

    2009-01-01

    Earth Observations (EO) collect various characteristics of the objective environment using sensors which often differ in measurement type and in spatial and temporal coverage. Making individual observational data interoperable becomes equally important when viewed in the context of expensive and time-consuming EO operations. Interoperability will improve the reusability of existing observations, both in a broader context and with other observations. As a demonstration of the potential offered by se...

  10. Infrastructure for China’s Ecologically Balanced Civilization†

    Directory of Open Access Journals (Sweden)

    Chris Kennedy

    2016-12-01

    Full Text Available China’s green investment needs up to 2020 are ¥1.7 trillion–2.9 trillion CNY ($274 billion–468 billion USD) per year. Estimates of financing requirements are provided for multiple sectors, including sustainable energy, infrastructure (including for environmental protection), environmental remediation, industrial pollution control, energy and water efficiency, and green products. The context to China’s green financing is discussed, covering urbanization, climate change, interactions between infrastructure sectors, and the transformation of industry. Much of the infrastructure financing will occur in cities, with a focus on equity, environmental protection, and quality of life under the National New-Type Urbanization Plan (2014–2020). China has implemented many successful policies in the building sector, but there is still considerable scope for improvement in the energy efficiency of Chinese buildings. China is currently pursuing low-carbon growth strategies that are consistent with its overall environmental and quality-of-life objectives. Beyond 2020, China’s future as an ecologically balanced civilization will rest on the implementation of a central infrastructure policy: China 2050 High Renewable Energy Penetration Scenario and Roadmap Study. As exemplified by the Circular Economy Development Strategy and Near-Term Action Plan, an essential part of China’s green industrial transformation involves engineering systems that conserve materials, thereby reducing or even eliminating wastes. To better understand changes to China’s economy under its green transformation and to unlock large potential sources of finance, it is necessary to undertake a fuller examination of all of China’s infrastructure sectors, particularly freight rail infrastructure and ports. Large investments are required to clean up a legacy of environmental contamination of soil and groundwater and to reduce industrial pollution. Transformation of the power sector

  11. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    Science.gov (United States)

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  12. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Many enterprises are still faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium-sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  13. Importance of physical infrastructure in the economic growth of municipalities in the northern border

    Directory of Open Access Journals (Sweden)

    Héctor Alonso Barajas Bustillos

    2012-01-01

    Full Text Available This paper evaluates the importance of infrastructure for economic growth in the Mexican northern border municipalities. From the growth literature, we know that infrastructure, besides other factors, has been pointed out as a key factor in the long-run perspectives of regional growth. Nevertheless, and within this thematic context, works that use a disaggregated analysis down to the level of the municipality are still scarce, which in the case of the northern border constitutes a scenario of much relevance given the development model adopted by the Mexican economy in recent decades. Empirical model results indicate that the municipalities of Piedras Negras, Nogales and Torreon maintain a positive relationship between physical infrastructure and growth. In the case of other municipalities like Tijuana, elevated population growth inhibits a proper infrastructure allocation, although the positive effect induced by infrastructure on growth remains.

  14. Environmental and natural resource implications of sustainable urban infrastructure systems

    Science.gov (United States)

    Bergesen, Joseph D.; Suh, Sangwon; Baynes, Timothy M.; Kaviti Musango, Josephine

    2017-12-01

    As cities grow, their environmental and natural resource footprints also tend to grow to keep up with the increasing demand on essential urban services such as passenger transportation, commercial space, and thermal comfort. The urban infrastructure systems, or socio-technical systems providing these services, are the major conduits through which natural resources are consumed and environmental impacts are generated. This paper aims to gauge the potential reductions in environmental and resource footprints through urban transformation, including the deployment of resource-efficient socio-technical systems and strategic densification. Using a hybrid life cycle assessment approach combined with scenarios, we analyzed the greenhouse gas (GHG) emissions, water use, metal consumption and land use of selected socio-technical systems in 84 cities from the present to 2050. The socio-technical systems analyzed are: (1) bus rapid transit with electric buses, (2) green commercial buildings, and (3) district energy. We developed a baseline model for each city considering gross domestic product, population density, and climate conditions. Then, we overlaid three scenarios on top of the baseline model: (1) decarbonization of electricity, (2) aggressive deployment of resource-efficient socio-technical systems, and (3) strategic urban densification, and quantified their potential in reducing the environmental and resource impacts of cities by 2050. The results show that, under the baseline scenario, the environmental and natural resource footprints of all 84 cities combined would increase 58%–116% by 2050. The resource-efficient scenario along with strategic densification, however, has the potential to bring GHG emissions down to 17% below the 2010 level in 2050. Such a transformation can also limit the increase in all resource footprints to less than 23% relative to 2010. This analysis suggests that resource-efficient urban infrastructure and decarbonization of

  15. Urban Planning Dealing with Change and Infrastructure

    Directory of Open Access Journals (Sweden)

    Sonja Deppisch

    2015-07-01

    Full Text Available This paper deals with urban planning and change processes potentially impacting local infrastructure. The overarching theoretical frame is social-ecological resilience thinking and its potential application to, as well as implications for, urban land-use development. The paper focuses on whether this concept can be of use for urban planners dealing with change and urban infrastructure, and whether a readiness towards its application can be identified. This endeavor is informed by two explorative studies in Germany. One study gains its material from a scenario process with planning practitioners and further urban stakeholders of a medium-sized city. The main topic was how to deal with the challenges of climate change impacts in urban planning and development. The second explorative study reflects research results on the readiness to apply the resilience concept to urban planning dealing with change and local infrastructure in a small community. The scenario process showed that applying social-ecological resilience thinking to urban planning helps to critically reflect on paths taken so far in local built infrastructure, to take on an integrated perspective, and to develop new and innovative strategies for further land-use development. Nevertheless, such a process requires additional financial as well as human resources and translation exercises. Also, the given path dependency as well as financial constraints hinder any perception of leeway in infrastructure development at the political level, so that any real implementation at the moment seems to be out of sight, which is also caused by multi-level dependencies.

  16. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will “raise the interoperability bar” as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT’s impact to these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase “code is king” underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  17. Regulatory Barriers Blocking Standardization of Interoperability

    OpenAIRE

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-01-01

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and Continua Health Alliance, are striving for this purpose. However, factors like the medial device regulation, health policy, and market reality have placed non-technical barriers over the ad...

  18. Interoperability in the e-Government Context

    Science.gov (United States)

    2012-01-01

    …ing e-government systems focus primarily on these technical challenges [UNDP 2007a, p. 10; CS Transform 2009, p. 3]. More recently… Thailand’s government hits its own wall. Responding agencies and non-governmental groups are unable to share information vital to the rescue effort… “Interoperability and Open Standards for e-Governance.” egov (Sep. 1, 2007): 17–19. [Secretary General, United Nations 2010] Secretary General, United

  19. Infrastructure for genomic interactions: Bioconductor classes for Hi-C, ChIA-PET and related experiments [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Aaron T. L. Lun

    2016-05-01

    Full Text Available The study of genomic interactions has been greatly facilitated by techniques such as chromatin conformation capture with high-throughput sequencing (Hi-C. These genome-wide experiments generate large amounts of data that require careful analysis to obtain useful biological conclusions. However, development of the appropriate software tools is hindered by the lack of basic infrastructure to represent and manipulate genomic interaction data. Here, we present the InteractionSet package that provides classes to represent genomic interactions and store their associated experimental data, along with the methods required for low-level manipulation and processing of those classes. The InteractionSet package exploits existing infrastructure in the open-source Bioconductor project, while in turn being used by Bioconductor packages designed for higher-level analyses. For new packages, use of the functionality in InteractionSet will simplify development, allow access to more features and improve interoperability between packages.

  20. Infrastructure for genomic interactions: Bioconductor classes for Hi-C, ChIA-PET and related experiments [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Aaron T. L. Lun

    2016-06-01

    Full Text Available The study of genomic interactions has been greatly facilitated by techniques such as chromatin conformation capture with high-throughput sequencing (Hi-C. These genome-wide experiments generate large amounts of data that require careful analysis to obtain useful biological conclusions. However, development of the appropriate software tools is hindered by the lack of basic infrastructure to represent and manipulate genomic interaction data. Here, we present the InteractionSet package that provides classes to represent genomic interactions and store their associated experimental data, along with the methods required for low-level manipulation and processing of those classes. The InteractionSet package exploits existing infrastructure in the open-source Bioconductor project, while in turn being used by Bioconductor packages designed for higher-level analyses. For new packages, use of the functionality in InteractionSet will simplify development, allow access to more features and improve interoperability between packages.

  1. Infrastructure system restoration planning using evolutionary algorithms

    Science.gov (United States)

    Corns, Steven; Long, Suzanna K.; Shoberg, Thomas G.

    2016-01-01

    This paper presents an evolutionary algorithm to address restoration issues for supply-chain-interdependent critical infrastructure. Rapid restoration of infrastructure after a large-scale disaster is necessary to sustain a nation's economy and security, but such long-term restoration has not been investigated as thoroughly as initial rescue and recovery efforts. A model of the Greater Saint Louis, Missouri, area was created and a disaster scenario simulated. An evolutionary algorithm is used to determine the order in which the bridges should be repaired based on indirect costs. Solutions were evaluated based on the reduction of indirect costs and the restoration of transportation capacity. When compared to a greedy algorithm, the evolutionary algorithm solution reduced indirect costs by approximately 12.4% by restoring automotive travel routes for workers and re-establishing the flow of commodities across the three rivers in the Saint Louis area.

  2. Assessing equitable access to urban green space: the role of engineered water infrastructure.

    Science.gov (United States)

    Wendel, Heather E Wright; Downs, Joni A; Mihelcic, James R

    2011-08-15

    Urban green space and water features provide numerous social, environmental, and economic benefits, yet disparities often exist in their distribution and accessibility. This study examines the link between issues of environmental justice and urban water management to evaluate potential improvements in green space and surface water access through the revitalization of existing engineered water infrastructures, namely stormwater ponds. First, relative access to green space and water features were compared for residents of Tampa, Florida, and an inner-city community of Tampa (East Tampa). Although disparities were not found in overall accessibility between Tampa and East Tampa, inequalities were apparent when quality, diversity, and size of green spaces were considered. East Tampa residents had significantly less access to larger, more desirable spaces and water features. Second, this research explored approaches for improving accessibility to green space and natural water using three integrated stormwater management development scenarios. These scenarios highlighted the ability of enhanced water infrastructures to increase access equality at a variety of spatial scales. Ultimately, the "greening" of gray urban water infrastructures is advocated as a way to address environmental justice issues while also reconnecting residents with issues of urban water management.

  3. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    National Research Council Canada - National Science Library

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  4. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone’s or any organization’s everyday work, such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  5. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Science.gov (United States)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  6. Legal and Ethical Issues around Incorporating Traditional Knowledge in Polar Data Infrastructures

    Directory of Open Access Journals (Sweden)

    Teresa Scassa

    2017-02-01

    Full Text Available Human knowledge of the polar region is a unique blend of Western scientific knowledge and local and indigenous knowledge. It is increasingly recognized that to exclude Traditional Knowledge from repositories of polar data would both limit the value of such repositories and perpetuate colonial legacies of exclusion and exploitation. However, the inclusion of Traditional Knowledge within repositories that are conceived and designed for Western scientific knowledge raises its own unique challenges. There is increasing acceptance of the need to make these two knowledge systems interoperable but in addition to the technical challenge there are legal and ethical issues involved. These relate to ‘ownership’ or custodianship of the knowledge; obtaining appropriate consent to gather, use and incorporate this knowledge; being sensitive to potentially different norms regarding access to and sharing of some types of knowledge; and appropriate acknowledgement for data contributors. In some cases, respectful incorporation of Traditional Knowledge may challenge standard conceptions regarding the sharing of data, including through open data licensing. These issues have not been fully addressed in the existing literature on legal interoperability which does not adequately deal with Traditional Knowledge. In this paper we identify legal and ethical norms regarding the use of Traditional Knowledge and explore their application in the particular context of polar data. Drawing upon our earlier work on cybercartography and Traditional Knowledge we identify the elements required in the development of a framework for the inclusion of Traditional Knowledge within data infrastructures.

  7. Cyber Threats to Nuclear Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Anderson; Paul Moskowitz; Mark Schanfein; Trond Bjornard; Curtis St. Michel

    2010-07-01

    Nuclear facility personnel expend considerable efforts to ensure that their facilities can maintain continuity of operations against both natural and man-made threats. Historically, most attention has been placed on physical security. Recently however, the threat of cyber-related attacks has become a recognized and growing world-wide concern. Much attention has focused on the vulnerability of the electric grid and chemical industries to cyber attacks, in part, because of their use of Supervisory Control and Data Acquisition (SCADA) systems. Lessons learned from work in these sectors indicate that the cyber threat may extend to other critical infrastructures including sites where nuclear and radiological materials are now stored. In this context, this white paper presents a hypothetical scenario by which a determined adversary launches a cyber attack that compromises the physical protection system and results in a reduced security posture at such a site. The compromised security posture might then be malevolently exploited in a variety of ways. The authors conclude that the cyber threat should be carefully considered for all nuclear infrastructures.

  8. Cyber Threats to Nuclear Infrastructures

    International Nuclear Information System (INIS)

    Anderson, Robert S.; Moskowitz, Paul; Schanfein, Mark; Bjornard, Trond; St. Michel, Curtis

    2010-01-01

    Nuclear facility personnel expend considerable efforts to ensure that their facilities can maintain continuity of operations against both natural and man-made threats. Historically, most attention has been placed on physical security. Recently however, the threat of cyber-related attacks has become a recognized and growing world-wide concern. Much attention has focused on the vulnerability of the electric grid and chemical industries to cyber attacks, in part, because of their use of Supervisory Control and Data Acquisition (SCADA) systems. Lessons learned from work in these sectors indicate that the cyber threat may extend to other critical infrastructures including sites where nuclear and radiological materials are now stored. In this context, this white paper presents a hypothetical scenario by which a determined adversary launches a cyber attack that compromises the physical protection system and results in a reduced security posture at such a site. The compromised security posture might then be malevolently exploited in a variety of ways. The authors conclude that the cyber threat should be carefully considered for all nuclear infrastructures.

  9. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study regarding a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends the current knowledge on business interoperability and an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the network of companies that the two companies belong to—network effect. In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and understand better how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.

  10. Evaluation of Green Infrastructure on Peak Flow Mitigation Focusing on the Connectivity of Impervious Areas

    Science.gov (United States)

    Seo, Y.; Hwang, J.; Kwon, Y.

    2017-12-01

    The existence of impervious areas is one of the most distinguishing characteristics of urban catchments: it decreases infiltration and increases direct runoff. The recent introduction of green infrastructure in urban catchments for the purpose of sustainable development contributes to the decrease of directly connected impervious areas (DCIA) by isolating existing impervious areas and, consequently, to flood risk mitigation. This study coupled the width-function-based instantaneous unit hydrograph (WFIUH), which is able to handle the spatial distribution of impervious areas, with the concept of the DCIA to assess the impact of decreasing DCIA on the shape of direct runoff hydrographs. Using several scenarios for typical green infrastructure and corresponding changes of DCIA in a test catchment, this study evaluated the effect of green infrastructure on the shape of the resulting direct runoff hydrographs and on peak flows. The results showed that changes in the DCIA immediately affect the shape of the direct runoff hydrograph and decrease peak flows depending on spatial implementation scenarios. The quantitative assessment of the spatial distribution of impervious areas and of the changes to the DCIA suggests that effective and well-planned green infrastructure can be introduced in urban environments for flood risk management.
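    The mechanism this record describes can be illustrated with a much simpler model than the WFIUH: the rational method, Q = C · i · A, where disconnecting impervious area (a lower DCIA fraction) lowers the composite runoff coefficient C and hence the peak flow. All coefficients and catchment values below are assumed, not taken from the study.

```python
def peak_flow(area_ha, intensity_mm_hr, dcia_fraction,
              c_impervious=0.9, c_pervious=0.2):
    """Rational-method peak flow in m^3/s for an assumed two-class catchment."""
    # Composite runoff coefficient: area-weighted by the DCIA fraction.
    c = dcia_fraction * c_impervious + (1.0 - dcia_fraction) * c_pervious
    # Unit conversions: mm/hr -> m/s, ha -> m^2.
    return c * (intensity_mm_hr / 1000.0 / 3600.0) * (area_ha * 10000.0)

baseline = peak_flow(50, 60, 0.45)  # 45% of the catchment directly connected
greened = peak_flow(50, 60, 0.30)   # green infrastructure disconnects 15%
```

    Even this crude sketch reproduces the qualitative result: reducing the DCIA fraction from 45% to 30% lowers the peak flow, without changing total impervious area.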

  11. The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.

    Science.gov (United States)

    Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N

    2014-03-01

    The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help the assessment of the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from being within a global, systematic and interoperable approach. To address these weaknesses, the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury as well as retrieving historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.

  12. PyMOOSE: interoperable scripting in Python for MOOSE

    Directory of Open Access Journals (Sweden)

    Subhasis Ray

    2008-12-01

    Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE. MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.

  13. Customer Satisfaction versus Infrastructural Facilities in the Realm of Higher Education--A Case Study of Sri Venkateswara University Tirupati

    Science.gov (United States)

    Janardhana, G.; Rajasekhar, Mamilla

    2012-01-01

    This article analyses the levels of students' satisfaction and how institution provides infrastructure facilities in the field of higher education. Infrastructure is the fastest growing segment of the higher education scenario. Universities play a very vital role in a country in terms of their potential. It contributes to employment and growth.…

  14. The Effect of Urban Green Infrastructure on Disaster Mitigation in Korea

    Directory of Open Access Journals (Sweden)

    So Yoon Kim

    2017-06-01

    Full Text Available Increasing precipitation due to climate change and the growing number of impervious areas present a greater risk of disaster damage in urban areas. Urban green infrastructure can be an effective mitigation alternative in highly developed and concentrated areas. This study investigates the effect of various types of urban green infrastructure on mitigating disaster damage in Korea. A Tobit model is used to analyze the factors that determine disaster damage. Damage variation is predicted with scenarios of RCP 8.5 and urban green spaces. Seventy-four districts and counties in seven metropolitan areas are defined as the units of analysis, and the period from 2005 to 2013 is considered. The results indicate that a higher urban green ratio, sewer length, financial independence rate, and local government’s budget are related to lower disaster damage. Based on the precipitation level of the RCP 8.5 scenario in 2050, an increase in economic damage is expected to range from 262 to 1086%. However, with an increase in the urban green ratio by 10%, the increase in economic damage is only expected to range from 217 to 1013%. The results suggest that green spaces play an important role in mitigating precipitation-related disasters. Highly concentrated urban areas need to consider various types of urban green infrastructure to prepare for an increase in precipitation due to climate change.

  15. A Systems-Based Risk Assessment Framework for Intentional Electromagnetic Interference (IEMI) on Critical Infrastructures.

    Science.gov (United States)

    Oakes, Benjamin Donald; Mattsson, Lars-Göran; Näsman, Per; Glazunov, Andrés Alayón

    2018-01-03

    Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man-made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems-based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is "replaced" by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst-case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network. © 2017 Society for Risk Analysis.
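The paper's quadruplet (scenario, resource requirements, plausibility, consequence) is essentially a data structure for ranking attack scenarios. A minimal sketch of that structure; the field names, ordinal 1-5 scales, the plausibility-times-consequence ranking, and the example scenarios are all illustrative assumptions, not taken from the article:

```python
from dataclasses import dataclass

@dataclass
class IEMIRiskQuadruplet:
    scenario: str                # description of the attack scenario
    resource_requirements: dict  # minimum equipment needed (assumed shape)
    plausibility: int            # subjective ordinal scale, 1 (low) .. 5 (high)
    consequence: int             # ordinal consequence scale, 1 .. 5

    def priority(self) -> int:
        # One simple way to rank scenarios: plausibility x consequence,
        # analogous to a classic risk matrix (our choice, not the paper's).
        return self.plausibility * self.consequence

scenarios = [
    IEMIRiskQuadruplet("briefcase jammer at pump station perimeter",
                       {"source": "mesoband radiator", "cost_eur": 20_000}, 4, 3),
    IEMIRiskQuadruplet("vehicle-mounted hyperband source near control room",
                       {"source": "hyperband radiator", "cost_eur": 500_000}, 2, 5),
]
worst_first = sorted(scenarios, key=lambda s: s.priority(), reverse=True)
print(worst_first[0].scenario)  # briefcase jammer at pump station perimeter
```

Note how replacing a single probability with resource requirements plus plausibility keeps the ranking usable even when no occurrence data exists.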

  16. Regional Charging Infrastructure for Plug-In Electric Vehicles: A Case Study of Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States); Raghavan, Sesha [National Renewable Energy Lab. (NREL), Golden, CO (United States); Rames, Clement [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eichman, Joshua [National Renewable Energy Lab. (NREL), Golden, CO (United States); Melaina, Marc [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-01-01

    Given the complex issues associated with plug-in electric vehicle (PEV) charging and options in deploying charging infrastructure, there is interest in exploring scenarios of future charging infrastructure deployment to provide insight and guidance to national and regional stakeholders. The complexity and cost of PEV charging infrastructure pose challenges to decision makers, including individuals, communities, and companies considering infrastructure installations. The value of PEVs to consumers and fleet operators can be increased with well-planned and cost-effective deployment of charging infrastructure. This will increase the number of miles driven electrically and accelerate PEV market penetration, increasing the shared value of charging networks to an expanding consumer base. Given these complexities and challenges, the objective of the present study is to provide additional insight into the role of charging infrastructure in accelerating PEV market growth. To that end, existing studies on PEV infrastructure are summarized in a literature review. Next, an analysis of current markets is conducted with a focus on correlations between PEV adoption and public charging availability. A forward-looking case study is then conducted focused on supporting 300,000 PEVs by 2025 in Massachusetts. The report concludes with a discussion of potential methodology for estimating economic impacts of PEV infrastructure growth.

  17. Interoperable mesh and geometry tools for advanced petascale simulations

    International Nuclear Information System (INIS)

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications
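The "data-structure neutral interfaces" idea above can be sketched as an abstract mesh interface with interchangeable backends. The method names here are illustrative assumptions, not the actual ITAPS (iMesh) API:

```python
from abc import ABC, abstractmethod

class MeshInterface(ABC):
    """Data-structure-neutral mesh interface (names are assumptions)."""
    @abstractmethod
    def vertices(self): ...
    @abstractmethod
    def elements(self): ...
    @abstractmethod
    def adjacencies(self, element_id): ...

class TriangleMesh(MeshInterface):
    """One concrete backend; services only ever see MeshInterface."""
    def __init__(self, verts, tris):
        self._verts, self._tris = verts, tris
    def vertices(self):
        return list(range(len(self._verts)))
    def elements(self):
        return list(range(len(self._tris)))
    def adjacencies(self, element_id):
        return self._tris[element_id]  # vertex ids of this triangle

def element_count(mesh: MeshInterface) -> int:
    # A service written against the neutral interface works with any backend,
    # which is how interchangeable mesh services avoid intruding on app code.
    return len(mesh.elements())

mesh = TriangleMesh([(0, 0), (1, 0), (0, 1)], [(0, 1, 2)])
print(element_count(mesh))  # 1
```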

  18. Working towards a European Geological Data Infrastructure

    Science.gov (United States)

    van der Krogt, Rob; Hughes, Richard; Pedersen, Mikael; Serrano, Jean-Jacques; Lee, Kathryn A.; Tulstrup, Jørgen; Robida, François

    2013-04-01

    What can we conclude and what is the way forward? • The project has evaluated relevant existing interoperable infrastructures, revealing a typology of infrastructures that may be useful models for the EGDI; • Planning for the EGDI also needs to be integrated with other relevant international initiatives and programmes such as GMES, GEO, and EPOS, and with legally binding regulations like INSPIRE. The outcomes of these evaluations and activities will contribute to the implementation plan for the EGDI, including the prioritization of relevant datasets and the most important functional, technical (design, use of standards), legal, and organizational requirements.

  19. Parking infrastructure: energy, emissions, and automobile life-cycle environmental accounting

    Energy Technology Data Exchange (ETDEWEB)

    Chester, Mikhail; Horvath, Arpad; Madanat, Samer, E-mail: mchester@cal.berkeley.edu, E-mail: horvath@ce.berkeley.edu, E-mail: madanat@ce.berkeley.edu [Department of Civil and Environmental Engineering, University of California, Berkeley, Berkeley CA 94720 (United States)

    2010-07-15

    The US parking infrastructure is vast and little is known about its scale and environmental impacts. The few parking space inventories that exist are typically regionalized and no known environmental assessment has been performed to determine the energy and emissions from providing this infrastructure. A better understanding of the scale of US parking is necessary to properly value the total costs of automobile travel. Energy and emissions from constructing and maintaining the parking infrastructure should be considered when assessing the total human health and environmental impacts of vehicle travel. We develop five parking space inventory scenarios and from these estimate the range of infrastructure provided in the US to be between 105 million and 2 billion spaces. Using these estimates, a life-cycle environmental inventory is performed to capture the energy consumption and emissions of greenhouse gases, CO, SO₂, NOₓ, VOC (volatile organic compounds), and PM₁₀ (PM: particulate matter) from raw material extraction, transport, asphalt and concrete production, and placement (including direct, indirect, and supply chain processes) of space construction and maintenance. The environmental assessment is then evaluated within the life-cycle performance of sedans, SUVs (sports utility vehicles), and pickups. Depending on the scenario and vehicle type, the inclusion of parking within the overall life-cycle inventory increases energy consumption from 3.1 to 4.8 MJ by 0.1-0.3 MJ and greenhouse gas emissions from 230 to 380 g CO₂e by 6-23 g CO₂e per passenger kilometer traveled. Life-cycle automobile SO₂ and PM₁₀ emissions show some of the largest increases, by as much as 24% and 89% from the baseline inventory. The environmental consequences of providing the parking spaces are discussed as well as the uncertainty in allocating paved area between parking and roadways.
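The per-passenger-kilometer figures quoted in this abstract imply relative increases that are easy to verify. A short back-of-envelope check using only the numbers above:

```python
# Illustrative arithmetic on the abstract's per-passenger-kilometer figures.
def relative_increase(base, delta):
    """Percentage increase of `delta` over a baseline `base`."""
    return 100 * delta / base

# Energy: baselines of 3.1-4.8 MJ/PKT rise by 0.1-0.3 MJ/PKT
print(round(relative_increase(3.1, 0.3), 1))  # 9.7  -> worst case ~10% more energy
print(round(relative_increase(4.8, 0.1), 1))  # 2.1  -> best case ~2% more energy

# Greenhouse gases: baselines of 230-380 g CO2e/PKT rise by 6-23 g CO2e/PKT
print(round(relative_increase(230, 23), 1))   # 10.0
print(round(relative_increase(380, 6), 1))    # 1.6
```

So parking adds roughly 2-10% to the energy and greenhouse gas totals, which makes the quoted 24% (SO₂) and 89% (PM₁₀) increases stand out as disproportionately large.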

  20. Flexible Language Interoperability

    DEFF Research Database (Denmark)

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems based...
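The wrapper-method idea above can be sketched in a few lines: the same class gains a separate, language-flavored set of accessors per guest language. This is a conceptual toy, not the paper's virtual-machine implementation; the "smalltalk" view and its names are invented:

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

def add_language_view(cls, language, wrappers):
    """Attach language-specific wrapper methods, giving each guest
    language its own tailored view of the same class."""
    for name, fn in wrappers.items():
        setattr(cls, f"{language}_{name}", fn)

# A hypothetical Smalltalk-style view exposes keyword-message-style accessors;
# the name prefix stands in for per-language method tables in a real VM.
add_language_view(Point, "smalltalk", {
    "x": lambda self: self.x,
    "setX_": lambda self, v: setattr(self, "x", v),
})

p = Point(1, 2)
p.smalltalk_setX_(10)
print(p.smalltalk_x())  # 10
```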

  1. Interoperable Multimedia Annotation and Retrieval for the Tourism Sector

    NARCIS (Netherlands)

    Chatzitoulousis, Antonios; Efraimidis, Pavlos S.; Athanasiadis, I.N.

    2015-01-01

    The Atlas Metadata System (AMS) employs semantic web annotation techniques in order to create an interoperable information annotation and retrieval platform for the tourism sector. AMS adopts state-of-the-art metadata vocabularies, annotation techniques and semantic web technologies.

  2. Providing interoperability of eHealth communities through peer-to-peer networks.

    Science.gov (United States)

    Kilic, Ozgur; Dogac, Asuman; Eichelberg, Marco

    2010-05-01

    Providing an interoperability infrastructure for Electronic Healthcare Records (EHRs) is on the agenda of many national and regional eHealth initiatives. Two important integration profiles have been specified for this purpose, namely, the "Integrating the Healthcare Enterprise (IHE) Cross-enterprise Document Sharing (XDS)" and the "IHE Cross Community Access (XCA)." IHE XDS describes how to share EHRs in a community of healthcare enterprises and IHE XCA describes how EHRs are shared across communities. However, the current version of the IHE XCA integration profile does not address some of the important challenges of cross-community exchange environments. The first challenge is scalability. If every community that joins the network needs to connect to every other community, i.e., a pure peer-to-peer network, this solution will not scale. Furthermore, each community may use a different coding vocabulary for the same metadata attribute, in which case, the target community cannot interpret the query involving such an attribute. Yet another important challenge is that each community may (and typically will) have a different patient identifier domain. Querying for the patient identifiers in the target community using patient demographic data may create patient privacy concerns. In this paper, we address each of these challenges and show how they can be handled effectively in a superpeer-based peer-to-peer architecture.
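The superpeer architecture addresses two of the challenges above at once: each community talks only to its superpeer (solving scalability), and the superpeer normalizes local metadata vocabularies before forwarding a query. A minimal sketch; the community names, document-type codes, and mapping tables are invented for illustration and are not IHE-defined values:

```python
class Superpeer:
    """One superpeer per community; superpeers federate with each other."""
    def __init__(self, name, vocab_map):
        self.name = name
        self.vocab_map = vocab_map  # (attribute, local code) -> shared code
        self.peers = []             # other superpeers, not individual communities

    def translate(self, attribute, local_code):
        return self.vocab_map.get((attribute, local_code), local_code)

    def query(self, attribute, local_code):
        # Normalize to the shared vocabulary, then forward to federated
        # superpeers; no community-to-community links are needed.
        shared = self.translate(attribute, local_code)
        return [f"{p.name}:{attribute}={shared}" for p in self.peers]

a = Superpeer("communityA", {("docType", "RAD-A"): "RADIOLOGY"})
b = Superpeer("communityB", {("docType", "RX-7"): "RADIOLOGY"})
a.peers, b.peers = [b], [a]
print(a.query("docType", "RAD-A"))  # ['communityB:docType=RADIOLOGY']
```

Because both communities map their local codes to the same shared code, a query phrased in community A's vocabulary is interpretable in community B without a full mesh of pairwise mappings.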

  3. Understanding the infrastructure of European Research Infrastructures

    DEFF Research Database (Denmark)

    Lindstrøm, Maria Duclos; Kropp, Kristoffer

    2017-01-01

    European Research Infrastructure Consortia (ERIC) are a new form of legal and financial framework for the establishment and operation of research infrastructures in Europe. Despite their scope, ambition, and novelty, the topic has received limited scholarly attention. This article analyses how one ERIC became an ERIC, using Bowker and Star's sociology of infrastructures. We conclude that focusing on ERICs as a European standard for organising and funding research collaboration gives new insights into the problems of membership, durability, and standardisation faced by research infrastructures. It is also a promising theoretical framework for addressing the relationship between the ERIC construct and the large diversity of European Research Infrastructures.

  4. A Review of Interoperability Standards in E-health and Imperatives for their Adoption in Africa

    Directory of Open Access Journals (Sweden)

    Funmi Adebesin

    2013-07-01

    Full Text Available The ability of healthcare information systems to share and exchange information (interoperate is essential to facilitate the quality and effectiveness of healthcare services. Although standardization is considered key to addressing the fragmentation currently challenging the healthcare environment, e-health standardization can be difficult for many reasons, one of which is making sense of the e-health interoperability standards landscape. Specifically aimed at the African health informatics community, this paper aims to provide an overview of e-health interoperability and the significance of standardization in its achievement. We conducted a literature study of e-health standards, their development, and the degree of participation by African countries in the process. We also provide a review of a selection of prominent e-health interoperability standards that have been widely adopted especially by developed countries, look at some of the factors that affect their adoption in Africa, and provide an overview of ongoing global initiatives to address the identified barriers. Although the paper is specifically aimed at the African community, its findings would be equally applicable to many other developing countries.

  5. Waveform Diversity and Design for Interoperating Radar Systems

    Science.gov (United States)

    2013-01-01

    Università di Pisa, Dipartimento di Ingegneria dell'Informazione: Elettronica, Informatica, Telecomunicazioni, Via Girolamo Caruso 16, 56122 Pisa, Italy

  6. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
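The translation flow described in this abstract (extract data into interoperability objects, look up a mapping from a Model View Definition, then transform to the target format's units and geometry) can be sketched as a small pipeline. The dictionaries and the MVD mapping table here are hypothetical stand-ins for the actual IFC structures:

```python
def extract(building_file):
    """Pull the data a target simulation tool needs from an IFC-like file
    into interoperability data objects (here, a plain dict)."""
    return {"wall_thickness_mm": building_file["IfcWall"]["thickness"]}

# Hypothetical Model View Definition mapping:
# MVD concept -> (target simulation field, transformation function)
MVD_MAPPING = {
    "wall_thickness_mm": ("wall_thickness_m", lambda mm: mm / 1000.0),
}

def translate(interop_objects):
    """Apply the MVD-driven translation/transformation step."""
    out = {}
    for concept, value in interop_objects.items():
        target_field, transform = MVD_MAPPING[concept]
        out[target_field] = transform(value)
    return out

ifc_like = {"IfcWall": {"thickness": 240}}  # millimeters, as in IFC practice
print(translate(extract(ifc_like)))  # {'wall_thickness_m': 0.24}
```

The key design point mirrored here is that the mapping table, not the extraction code, decides how each concept is converted, so supporting a new target simulation format means supplying a new MVD template rather than rewriting the extractor.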

  7. A Cultural Framework for the Interoperability of C2 Systems

    National Research Council Canada - National Science Library

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  8. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  9. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
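The aggregation technique mentioned in this abstract, presenting many per-timestep output files as one dataset, can be illustrated conceptually. Real tools do this against NetCDF files with CF metadata; here plain dicts stand in for the files so the sketch stays self-contained, and the variable name is an invented example:

```python
# One dict per (hypothetical) model output file; a real aggregation layer
# would read these from NetCDF with CF time metadata.
files = [
    {"time": 0, "sea_surface_height": [0.1, 0.2]},
    {"time": 6, "sea_surface_height": [0.3, 0.4]},
    {"time": 12, "sea_surface_height": [0.2, 0.1]},
]

def aggregate(files, variable):
    """Return (times, values) as a single virtual time series, the way an
    aggregation layer presents a collection of files as one CF dataset."""
    ordered = sorted(files, key=lambda f: f["time"])
    return [f["time"] for f in ordered], [f[variable] for f in ordered]

times, ssh = aggregate(files, "sea_surface_height")
print(times)   # [0, 6, 12]
print(ssh[1])  # [0.3, 0.4]
```

The point for interoperability is that downstream clients see one logical dataset with a coherent time coordinate, regardless of how the provider happened to split its output files.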

  10. Interoperable web applications for sharing data and products of the International DORIS Service

    Science.gov (United States)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with the users. Metadata and interoperable web applications are proposed to explore, visualize and download the key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their website. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables or to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the needs for the IAG Services to adopt common metadata thesaurus to describe data and products, and interoperability standards to share them.

  11. Reliable Freestanding Position-Based Routing in Highway Scenarios

    Science.gov (United States)

    Galaviz-Mosqueda, Gabriel A.; Aquino-Santos, Raúl; Villarreal-Reyes, Salvador; Rivera-Rodríguez, Raúl; Villaseñor-González, Luis; Edwards, Arthur

    2012-01-01

    Vehicular Ad Hoc Networks (VANETs) are considered by car manufacturers and the research community as the enabling technology to radically improve the safety, efficiency and comfort of everyday driving. However, before VANET technology can fulfill all its expected potential, several difficulties must be addressed. One key issue arising when working with VANETs is the complexity of the networking protocols compared to those used by traditional infrastructure networks. Therefore, proper design of the routing strategy becomes a main issue for the effective deployment of VANETs. In this paper, a reliable freestanding position-based routing algorithm (FPBR) for highway scenarios is proposed. For this scenario, several important issues such as the high mobility of vehicles and the propagation conditions may affect the performance of the routing strategy. These constraints have only been partially addressed in previous proposals. In contrast, the design approach used for developing FPBR considered the constraints imposed by a highway scenario and implements mechanisms to overcome them. FPBR performance is compared to one of the leading protocols for highway scenarios. Performance metrics show that FPBR yields similar results when considering free-space propagation conditions, and outperforms the leading protocol when considering a realistic highway path loss model. PMID:23202159
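FPBR belongs to the family of position-based (geographic) routing protocols, whose core step is greedy forwarding: each vehicle hands the packet to the neighbor geographically closest to the destination. A minimal sketch of that core step only; FPBR itself adds reliability mechanisms for highway mobility and propagation that are not modeled here:

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) positions in meters."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, neighbors, dest):
    """Greedy geographic forwarding: pick the neighbor closest to the
    destination; return None when no neighbor makes forward progress
    (a local maximum, where recovery strategies would take over)."""
    best = min(neighbors, key=lambda n: dist(n, dest), default=None)
    if best is None or dist(best, dest) >= dist(current, dest):
        return None
    return best

# Vehicles on a straight highway segment, positions in meters
current, dest = (0, 0), (1000, 0)
neighbors = [(150, 0), (220, 5), (90, -3)]
print(next_hop(current, neighbors, dest))  # (220, 5)
```

The highway setting favors this scheme because traffic is largely one-dimensional, but realistic path loss shrinks the usable neighbor set, which is exactly where the paper reports FPBR's advantage.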

  12. Public Opinions and Use of Various Types of Recreational Infrastructure in Boreal Forest Settings

    Directory of Open Access Journals (Sweden)

    Vegard Gundersen

    2016-05-01

    Full Text Available We have investigated public preferences for the use intensity and visual quality of forest recreational infrastructure. Forest infrastructure covers five classes along a continuum from unmarked paths to paved walkways. Altogether, 39 sites were categorized into the five classes and measured with automatic counters. A sample of 545 respondents living in southeastern and middle Norway was asked to rate 15 forest scenes and 35 preconceptions of recreational settings. The path scenarios were depicted as digitally calibrated photos that systematically displayed physical path features in boreal, semi-natural settings. Survey participants showed a clearly greater preference for photos and preconceptions of forest settings containing minor elements of forest infrastructure; unmarked paths received the highest score and forest roads/walkways/bikeways the lowest. We identified a clear mismatch between public preferences for forest infrastructure and the intensity of use: the less appreciated infrastructure was the most used. Planning and management have to consider these different needs for recreational infrastructure, and we propose an area zoning system that meets the different segments of forest visitors.

  13. The e-MapScholar project—an example of interoperability in GIScience education

    Science.gov (United States)

    Purves, R. S.; Medyckyj-Scott, D. J.; Mackaness, W. A.

    2005-03-01

    The proliferation of the use of digital spatial data in learning and teaching provides a set of opportunities and challenges for the development of e-learning materials suitable for use by a broad spectrum of disciplines in Higher Education. Effective e-learning materials must both provide engaging materials with which the learner can interact and be relevant to the learners' disciplinary and background knowledge. Interoperability aims to allow sharing of data and materials through the use of common agreements and specifications. Shared learning materials can take advantage of interoperable components to provide customisable components, and must consider issues in sharing data across institutional borders. The e-MapScholar project delivers teaching materials related to spatial data, which are customisable with respect to both context and location. Issues in the provision of such interoperable materials are discussed, including suitable levels of granularity of materials, the provision of tools to facilitate customisation and mechanisms to deliver multiple data sets and the metadata issues related to such materials. The examples shown make extensive use of the OpenGIS consortium specifications in the delivery of spatial data.

  14. Microtheories for Spatial Data Infrastructures - Accounting for Diversity of Local Conceptualizations at a Global Level

    Science.gov (United States)

    Duce, Stephanie; Janowicz, Krzysztof

    The categorization of our environment into feature types is an essential prerequisite for cartography, geographic information retrieval, routing applications, spatial decision support systems, and data sharing in general. However, there is no a priori conceptualization of the world and the creation of features and types is an act of cognition. Humans conceptualize their environment based on multiple criteria such as their cultural background, knowledge, motivation, and particularly by space and time. Sharing and making these conceptualizations explicit in a formal, unambiguous way is at the core of semantic interoperability. One way to cope with semantic heterogeneities is by standardization, i.e., by agreeing on a shared conceptualization. This bears the danger of losing local diversity. In contrast, this work proposes the use of microtheories for Spatial Data Infrastructures, such as INSPIRE, to account for the diversity of local conceptualizations while maintaining their semantic interoperability at a global level. We introduce a novel methodology to structure ontologies by spatial and temporal aspects, in our case administrative boundaries, which reflect variations in feature conceptualization. A local, bottom-up approach, based on non-standard inference, is used to compute global feature definitions which are neither too broad nor too specific. Using different conceptualizations of rivers and other geographic feature types, we demonstrate how the present approach can improve the INSPIRE data model and ease its adoption by European member states.
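The goal of a global definition that is "neither too broad nor too specific" can be illustrated with a toy computation: take the properties shared by every local conceptualization. The real approach uses non-standard description-logic inference over ontologies; the member states and river attributes below are invented for illustration:

```python
# Hypothetical local conceptualizations of "river" in three member states.
local_river_definitions = {
    "member_state_A": {"flowing", "natural", "navigable", "freshwater"},
    "member_state_B": {"flowing", "natural", "freshwater"},
    "member_state_C": {"flowing", "freshwater", "seasonal"},
}

def global_definition(local_defs):
    """Intersect all local property sets: the result covers every local
    definition (not too specific) without adding properties some
    community rejects (not too broad)."""
    defs = iter(local_defs.values())
    shared = set(next(defs))
    for d in defs:
        shared &= d
    return shared

print(sorted(global_definition(local_river_definitions)))  # ['flowing', 'freshwater']
```

Each member state keeps its richer local microtheory, while the computed global definition is what cross-border queries in the Spatial Data Infrastructure can safely rely on.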

  15. Improving Patient Safety with X-Ray and Anesthesia Machine Ventilator Synchronization: A Medical Device Interoperability Case Study

    Science.gov (United States)

    Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup

    When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing interoperability framework standards efforts in developing functional and non-functional requirements, and it illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
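The clinical use case above amounts to a safety interlock: the x-ray controller may pause the ventilator only briefly, and ventilation must restart no matter what happens during the exposure. A minimal sketch of that logic; the class names, the 30-second limit, and the return convention are illustrative assumptions, not taken from the paper or any standard:

```python
class Ventilator:
    """Toy stand-in for an interoperable anesthesia machine ventilator."""
    def __init__(self):
        self.running = True
    def pause(self):
        self.running = False
    def resume(self):
        self.running = True

def take_exposure(ventilator, exposure_seconds, max_pause_seconds=30):
    """Pause ventilation for an x-ray exposure, refusing unsafe pauses
    and guaranteeing the ventilator restarts even if the exposure fails."""
    if exposure_seconds > max_pause_seconds:
        return False            # refuse: the pause would be unsafe
    ventilator.pause()
    try:
        pass                    # trigger the x-ray exposure here
    finally:
        ventilator.resume()     # guaranteed restart, even on error
    return True

v = Ventilator()
print(take_exposure(v, 2), v.running)    # True True
print(take_exposure(v, 120), v.running)  # False True
```

The point of device interoperability here is that this coordination runs in the system rather than in a clinician's memory, removing the failure mode where the ventilator is simply forgotten.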

  16. WDS/DSA Certification - International collaboration for a trustworthy research data infrastructure

    Science.gov (United States)

    Mokrane, Mustapha; Hugo, Wim; Harrison, Sandy

    2016-04-01

    Today's research is international, transdisciplinary, and data-enabled, which requires scrupulous data stewardship, full and open access to data, and efficient collaboration and coordination. New expectations on researchers based on policies from governments and funders to share data fully, openly, and in a timely manner present significant challenges but are also opportunities to improve the quality and efficiency of research and its accountability to society. Researchers should be able to archive and disseminate data as required by many institutions or funders, and civil society to scrutinize datasets underlying public policies. Thus, the trustworthiness of data services must be verifiable. In addition, the need to integrate large and complex datasets across disciplines and domains with variable levels of maturity calls for greater coordination to achieve sufficient interoperability and sustainability. The World Data System (WDS) of the International Council for Science (ICSU) promotes long-term stewardship of, and universal and equitable access to, quality-assured scientific data and services across a range of disciplines in the natural and social sciences. WDS aims at coordinating and supporting trusted scientific data services for the provision, use, and preservation of relevant datasets to facilitate scientific research, in particular under the ICSU umbrella, while strengthening their links with the research community. WDS certifies its Members, holders and providers of data or data products, using internationally recognized standards. Certification of scientific data services is essential to ensure trustworthiness of the global research data infrastructure. It contributes to building a searchable, distributed, interoperable and sustainable research data infrastructure. 
Several certification standards have been developed over the last decade, such as the Network of Expertise in long-term Storage and Accessibility of Digital Resources in Germany (NESTOR) seal

  17. Review of CERN Computer Centre Infrastructure

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The CERN Computer Centre is reviewing strategies for optimizing the use of the existing infrastructure in the future, and in the likely scenario that any extension will be remote from CERN, and in the light of the way other large facilities are today being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote computer centres. This presentation will give the details on the project’s motivations, current status and areas for future investigation.

  18. Needs of National Infrastructure for Nuclear Energy Program in Macedonia

    International Nuclear Information System (INIS)

    Chaushevski, A.; Poceva, S.N.; Spasevska, H.; Popov, N.

    2016-01-01

    The introduction of a nuclear energy program is a major undertaking with significant implications for many aspects of national infrastructure, ranging from the capacity of the power grid, access roads and production facilities to the involvement of stakeholders and the development of human resources. For newcomer countries without nuclear power, and even for those that wish to substantially expand existing nuclear capacity, it can take 10-15 years to develop the necessary infrastructure. One of the crucial problems in implementing a nuclear energy program is meeting human resource needs and developing the educational infrastructure in this field. Whatever the future energy scenario of the Republic of Macedonia, a nuclear educational program is the first step towards having human resources in the field of nuclear energy. This paper presents the proposed direction for developing the human resources needed to establish the national infrastructure for a nuclear energy program in Macedonia. This includes establishing and developing MONEP (the Macedonian NEPIO) and enhancing the capabilities of the national regulatory body in the Republic of Macedonia. Keywords: NEP (Nuclear Energy Program), HR (Human Resources), NEPIO (Nuclear Energy Program Implementation Organization), MONEP Macedonian Organization for Nuclear Energy Program (Macedonian NEPIO), NRB (Nuclear Regulatory Body)

  19. vNet Zero Energy for Radio Base Stations- Balearic Scenario

    DEFF Research Database (Denmark)

    Sabater, Pere; Mihovska, Albena Dimitrova; Pol, Andreu Moia

    2016-01-01

    The Balearic Islands have one of the best telecommunications infrastructures in Spain, with more than 1500 Radio Base Stations (RBS) covering a total surface of 4,991.66 km². This archipelago has high energy consumption, with high CO2 emissions, due to an electrical energy production system mainly based on coal and fossil fuels, which is not an environmentally sustainable scenario. The aim of this study is to identify the processes that would reduce the energy consumption and greenhouse gas emissions, designing a target scenario featuring "zero CO2 emissions" and "100% renewable energies" in RBS. The energy costs, CO2 emissions and data traffic data used for the study are generated by a sample of RBS from the Balearic Islands. The results are shown in terms of energy performance for normal and net zero emissions scenarios.

  20. Energy infrastructure in India: Profile and risks under climate change

    International Nuclear Information System (INIS)

    Garg, Amit; Naswa, Prakriti; Shukla, P.R.

    2015-01-01

    India has committed large investments to energy infrastructure assets: power plants, refineries, energy ports, pipelines, roads, railways, etc. The coastal infrastructure being developed to meet rising energy imports is vulnerable to climate extremes. This paper provides an overview of climate risks to energy infrastructures in India and details two case studies: a crude oil importing port and a western coast railway transporting coal. The climate vulnerability of the port has been mapped using an index, while that of the railway has been assessed through a damage function for the RCP 4.5 and 8.5 scenarios. Our analysis shows that risk management through adaptation is likely to be very expensive. The system risks can be even greater and might adversely affect energy security and access objectives. Aligning sustainable development and climate adaptation measures can deliver substantial co-benefits. The key policy recommendations include: i) mandatory vulnerability assessment to future climate risks for energy infrastructures; ii) inclusion of project and systemic risks in the vulnerability index; iii) adaptation funds for unmitigated climate risks; iv) continuous monitoring of climatic parameters and implementation of adaptation measures; and v) sustainability actions along energy infrastructures that enhance climate resilience and simultaneously deliver co-benefits to local agents. -- Highlights: •Climate risks to energy infrastructures adversely impact energy security. •Case studies of a port and a railway show their future climate change vulnerability. •Managing climate-induced risks through preventive adaptation policies

  1. Architecture for Cognitive Networking within NASA's Future Space Communications Infrastructure

    Science.gov (United States)

    Clark, Gilbert J., III; Eddy, Wesley M.; Johnson, Sandra K.; Barnes, James; Brooks, David

    2016-01-01

    Future space mission concepts and designs pose many networking challenges for command, telemetry, and science data applications with diverse end-to-end data delivery needs. For future end-to-end architecture designs, a key challenge is meeting expected application quality-of-service requirements for multiple simultaneous mission data flows, with options to use diverse onboard local data buses, commercial ground networks, and multiple satellite relay constellations in LEO, MEO, GEO, or even deep space relay links. Effectively utilizing a complex network topology requires orchestration and direction that spans the many discrete, individually addressable computer systems, causing them to act in concert to achieve the overall network goals. The system must be intelligent enough not only to function under nominal conditions, but also to adapt to unexpected situations and to reorganize or adapt to perform roles not originally intended for the system or explicitly programmed. This paper describes architecture features of cognitive networking within the future NASA space communications infrastructure, as well as its interaction with legacy systems and infrastructure in the interim. The paper begins by discussing the need for increased automation, including inter-system collaboration. This discussion motivates the features of an architecture including cognitive networking for future missions and relays, interoperating with both existing endpoint-based networking models and emerging information-centric models. From this basis, we discuss progress on a proof-of-concept implementation of this architecture as a cognitive networking on-orbit application on the SCaN Testbed attached to the International Space Station.

  2. Future CO2 Emissions and Climate Change from Existing Energy Infrastructure

    Science.gov (United States)

    Davis, S. J.; Caldeira, K.; Matthews, D.

    2010-12-01

    If current greenhouse gas (GHG) concentrations remain constant, the world would be committed to several centuries of increasing global mean temperatures and sea level rise. By contrast, near elimination of anthropogenic CO2 emissions would be required to produce diminishing GHG concentrations consistent with stabilization of mean temperatures. Yet long-lived energy and transportation infrastructure now operating can be expected to contribute substantial CO2 emissions over the next 50 years. Barring widespread retrofitting of existing power plants with carbon capture and storage (CCS) technologies or the early decommissioning of serviceable infrastructure, these “committed emissions” represent infrastructural inertia which may be the primary contributor to total future warming commitment. With respect to GHG emissions, infrastructural inertia may be thought of as having two important and overlapping components: (i) infrastructure that directly releases GHGs to the atmosphere, and (ii) infrastructure that contributes to the continued production of devices that emit GHGs to the atmosphere. For example, the interstate highway and refueling infrastructure in the United States facilitates continued production of gasoline-powered automobiles. Here, we focus only on the warming commitment from infrastructure that directly releases CO2 to the atmosphere. Essentially, we answer the question: What if no additional CO2-emitting devices (e.g., power plants, motor vehicles) were built, but all the existing CO2-emitting devices were allowed to live out their normal lifetimes? What CO2 levels and global mean temperatures would we attain? Of course, the actual lifetime of devices may be strongly influenced by economic and policy constraints. For instance, a ban on new CO2-emitting devices would create tremendous incentive to prolong the lifetime of existing devices. Thus, our scenarios are not realistic, but offer a means of gauging the threat of climate change from existing
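The committed-emissions accounting the authors describe reduces to a simple lifetime calculation: each existing device keeps emitting at its current annual rate until the end of its remaining lifetime, with no new devices added. A minimal sketch follows; all figures and device categories are invented for illustration (the study derives its numbers from detailed infrastructure datasets).

```python
# Back-of-envelope sketch of "committed emissions" from existing infrastructure:
# every device lives out its normal lifetime at its current annual emission rate.
# All numbers below are hypothetical, not taken from the paper.

def committed_emissions(devices):
    """Total future CO2 (GtCO2) from devices living out their normal lifetimes."""
    return sum(annual_gt * remaining_years
               for annual_gt, remaining_years in devices)

# Hypothetical fleet: (annual emissions in GtCO2/yr, remaining lifetime in years)
fleet = [
    (5.0, 30),   # coal-fired power plants
    (3.0, 15),   # gas-fired power plants
    (4.0, 12),   # motor vehicle fleet
]

print(committed_emissions(fleet))  # total committed GtCO2 for this toy fleet
```

In the paper's framing, this total is then fed into a carbon-cycle and climate model to obtain the resulting CO2 concentration and warming commitment; the sketch covers only the emissions bookkeeping.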

  3. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
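The core idea — recursively expanding probabilistic branches so that every scenario comes out with an exact likelihood — can be illustrated with a minimal sketch. This is not the OBEST software; the staged branching model and probabilities below are invented.

```python
# Illustrative sketch of analytic scenario enumeration with probabilistic
# branching (not the actual OBEST implementation). Each stage branches into
# one of several outcomes with a known probability; recursive expansion
# yields every scenario with its exact likelihood, including very rare
# scenarios that Monte Carlo sampling would likely miss.

def enumerate_scenarios(stages, path=(), prob=1.0):
    """Yield (scenario, probability) for every combination of branch outcomes."""
    if not stages:
        yield path, prob
        return
    first, rest = stages[0], stages[1:]
    for outcome, p in first.items():
        yield from enumerate_scenarios(rest, path + (outcome,), prob * p)

# Hypothetical two-object infrastructure model: a pump and a backup generator.
stages = [
    {"pump_ok": 0.999, "pump_fails": 0.001},
    {"backup_ok": 0.99, "backup_fails": 0.01},
]

scenarios = dict(enumerate_scenarios(stages))
# The rare double-failure scenario is found analytically with exact probability.
print(scenarios[("pump_fails", "backup_fails")])
```

A Monte Carlo simulation would need on the order of millions of samples to even observe this 1-in-100,000 scenario once; the recursive solution produces it, and its probability, directly.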

  4. Methodology for assessing electric vehicle charging infrastructure business models

    International Nuclear Information System (INIS)

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of the economic implications of innovative business models in networked environments, such as electro-mobility, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for society as a whole, there are a number of hurdles to their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, which allows them to recover their costs while, at the same time, offering EV users a charging price that makes electro-mobility comparable to internal combustion engine vehicles. For that purpose, three scenarios are defined, which present different EV charging alternatives in terms of charging power and charging station ownership and accessibility. A case study is presented for each scenario, and the charging station usage required for a profitable business model is calculated. We demonstrate that private home charging is likely to be the preferred option for EV users who can charge at home, as it offers a lower total cost of ownership under certain conditions, even today. On the contrary, finding a profitable business case for fast charging requires more intensive infrastructure usage. - Highlights: • An ecosystem is a network of actors who collaborate to create a positive business case. • Electro-mobility (electricity-powered road vehicles and ICT) is a complex ecosystem. • Methodological analysis to ensure that all actors benefit from electro-mobility. • Economic analysis of charging infrastructure deployment linked to its usage. • Comparison of EV ownership cost vs. ICE for vehicle users.
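The kind of break-even calculation behind "required charging station usage" can be sketched in a few lines. All figures below are hypothetical placeholders, not the paper's case-study numbers: the point is only the shape of the calculation (annualised investment plus operating cost, divided by margin per session).

```python
# Hypothetical break-even sketch for a fast-charging station operator:
# how many charging sessions per day are needed so that revenue covers
# annualised capital expenditure plus yearly operating costs.

def breakeven_sessions_per_day(capex, lifetime_years, opex_per_year,
                               energy_per_session_kwh, margin_per_kwh):
    """Sessions/day at which yearly revenue equals yearly cost."""
    annualised_cost = capex / lifetime_years + opex_per_year
    revenue_per_session = energy_per_session_kwh * margin_per_kwh
    return annualised_cost / (revenue_per_session * 365)

# Invented example: 40 k-euro charger, 10-year life, 3 k-euro/yr operating cost,
# 15 kWh delivered per session at a 0.25 euro/kWh margin.
sessions = breakeven_sessions_per_day(40_000, 10, 3_000, 15, 0.25)
print(round(sessions, 1))
```

Under these made-up assumptions the operator needs roughly five sessions per day to break even, which illustrates why the paper finds fast charging profitable only with intensive infrastructure usage.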

  5. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations for the est...

  6. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    Science.gov (United States)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies, which involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, or the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled with regional and local modes of variability. For example, we know that the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, correlates with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research—and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS

  7. Assessment of the biodiesel distribution infrastructure in Canada

    International Nuclear Information System (INIS)

    Lagace, C.

    2007-08-01

    Canada's biodiesel industry is in its infancy and must work to build the demand needed to ensure its development. This assessment of Canada's biodiesel distribution infrastructure was conducted to recommend the most efficient infrastructure pathway for effective biodiesel distribution. The study focused on establishing a link between biodiesel supplies and end users. The current Canadian biodiesel industry was discussed, and future market potential was outlined. The Canadian distillate product distribution infrastructure was discussed. Technical considerations and compliance issues were reviewed. The following 2 scenarios were used to estimate adaptations and costs for the Canadian market: (1) the use of primary terminals to ensure quality control of biodiesel, and (2) storage in secondary terminals where biodiesel blends are prepared before being transported to retail outlets. The study showed that relevant laboratory training programs are needed, as well as proficiency testing programs, in order to ensure adequate quality control of biodiesel. Standards for biodiesel distribution are needed, as well as specifications for the heating oil market. It was concluded that this document may prove useful in developing government policy objectives and identifying further research needs. 21 refs., 12 tabs., 13 figs

  8. All quiet on the eastern front? Disruption scenarios of Russian natural gas supply to Europe

    International Nuclear Information System (INIS)

    Richter, Philipp M.; Holz, Franziska

    2015-01-01

    The 2014 Russian–Ukrainian crisis reignited European concerns about natural gas supply security, recalling the experiences of 2006 and 2009. However, the European supply situation, regulation and infrastructure have changed: import sources are better diversified, EU member states are better interconnected, and a common regulation on security of supply has been introduced. Nevertheless, European dependency on natural gas remains high. This paper investigates different Russian natural gas export disruption scenarios and analyses short- and long-term reactions in Europe. We use the Global Gas Model (GGM), a large-scale mixed complementarity representation of the natural gas sector with a high level of technical granularity with respect to storage and transportation infrastructure. While we find that most of the EU member states are not severely affected by Russian disruptions, some East European countries are very vulnerable. Prioritizing the removal of infrastructure bottlenecks is critical for securing a sufficient natural gas supply to all EU member states. - Highlights: • We analyze disruption scenarios of Russian natural gas exports to Europe. • Most EU countries are only weakly affected by a complete Russian supply disruption. • We find that Eastern Europe is vulnerable to Russian supply disruptions. • We identify infrastructure bottlenecks in the European natural gas network. • We find that the large EU LNG import capacity is not sufficiently connected

  9. UGV Control Interoperability Profile (IOP), Version 0

    Science.gov (United States)

    2011-12-21

    UGV Control Interoperability Profile (IOP), Version 0. Robotic Systems, Joint Project Office (RS JPO), SFAE-GCS-UGV MS 266, 6501 East 11 Mile Road. Abstract fragment: flippers enable a tracked vehicle to climb stairs, traverse ditches/ruts, etc.; the operator should be able to control the position of the flippers via the OCU.

  10. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    Science.gov (United States)

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

    The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims to present the challenges of building quality-labelled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.

  11. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  12. Vulnerability to terrorist attacks in European electricity decarbonisation scenarios: Comparing renewable electricity imports to gas imports

    International Nuclear Information System (INIS)

    Lilliestam, Johan

    2014-01-01

    The decarbonised future European electricity system must remain secure: reliable electricity supply is a prerequisite for the functioning of modern society. Scenarios like Desertec, which partially rely on solar power imports from the Middle East and North Africa, may be attractive for decarbonisation, but raise concerns about terrorists interrupting supply by attacking the long, unprotected transmission lines in the Sahara. In this paper, I develop new methods and assess the European vulnerability to terrorist attacks in the Desertec scenario. I compare this to the vulnerability of today's system and of a decarbonisation scenario in which Europe relies on gas imports for electricity generation. I show that the vulnerability of both gas and electricity imports is low, but electricity imports are more vulnerable than gas imports, due to their technical characteristics. Gas outages (and, potentially, resulting blackouts) are the very unlikely consequence even of very large-scale attacks against the gas import system, whereas short blackouts are the potential consequence of a few attacks against the import electricity lines. As the impacts of all but extreme attacks are limited, terrorists cannot, by attacking energy infrastructure, cause spectacular, fear-creating outages. Both gas and electricity import infrastructure are thus unattractive and unlikely terrorist targets. - Highlights: • A comparison of terrorism risks of importing solar power and gas for power generation. • Both scenarios show low vulnerability to terrorist attacks. • Within low vulnerabilities, gas imports are less vulnerable than electricity imports. • Causing spectacular, large and long outages is very difficult for an attacker. • The attractiveness of gas and power import infrastructure as a terrorist target is low

  13. Scientific data and climate scenarios. Study report nr 2

    International Nuclear Information System (INIS)

    Alex, Bastien; Baillat, Alice; Gemenne, Francois; Jouzel, Jean

    2017-05-01

    The objective of this report is to present climate evolutions and their impacts according to two warming scenarios: a 2 degree increase of the average surface temperature by 2100 (i.e. the most optimistic IPCC scenario), and a 5 degree increase by 2100 (the most pessimistic scenario). As far as possible, physical, social, economic, and health impacts are assessed for 2030 and 2050. The authors note that the differences between the two scenarios are hardly discernible by 2030, but clearly discernible by 2050. After a brief review of IPCC scenarios, a first part addresses the evolution of the world climate, considering the atmosphere (temperature increase, modification of precipitation regimes), seas and oceans (temperature, currents and thermal circulation, ocean acidification, sea level rise), extreme climate events (observations and trends, main impacts on populations and infrastructures), and the cryosphere (observations and impacts). The second part discusses regional predictions in terms of trends and impacts for metropolitan France and its overseas territories, for Africa, and for the Asia-Pacific region. The last part briefly discusses the possible need to evolve the typology chosen to determine sources of vulnerability and the level of exposure to different risks. Many appendices propose more detailed presentations on specific issues and examples. A summarised version of the report is also provided

  14. Current and future flood risk to railway infrastructure in Europe

    Science.gov (United States)

    Bubeck, Philip; Kellermann, Patric; Alfieri, Lorenzo; Feyen, Luc; Dillenardt, Lisa; Thieken, Annegret H.

    2017-04-01

    CORINE, due to their line shapes. To assess current and future damage and risk to railway infrastructure in Europe, we apply the damage model RAIL ('RAilway Infrastructure Loss'), which was specifically developed for railway infrastructure using empirical damage data. To adequately and comprehensively capture the line-shaped features of railway infrastructure, the assessment makes use of the open-access data set of openrailway.org. Current and future flood hazard in Europe is obtained with the LISFLOOD-based pan-European flood hazard mapping procedure, combined with ensemble projections of extreme streamflow for the current century based on EURO-CORDEX RCP 8.5 climate scenarios. The presentation shows first results of combining the hazard data with the RAIL model for Europe.
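A damage assessment of this general kind (intersecting line-shaped railway segments with a flood hazard map and applying a depth-damage function) can be sketched as follows. The damage curve, segment data and unit costs below are hypothetical; the actual RAIL model is calibrated on empirical damage data and is not reproduced here.

```python
# Minimal sketch of depth-damage assessment for line-shaped railway assets.
# All numbers (curve shape, lengths, depths, costs) are invented for illustration.

def damage_fraction(depth_m):
    """Hypothetical depth-damage curve: linear up to full loss at 2 m depth."""
    return min(max(depth_m, 0.0) / 2.0, 1.0)

def segment_damage(length_km, depth_m, replacement_cost_per_km):
    """Expected damage (euros) for one inundated railway segment."""
    return length_km * damage_fraction(depth_m) * replacement_cost_per_km

# Hypothetical inundated segments from a flood hazard map:
# (length in km, inundation depth in m, replacement cost per km in euros)
segments = [
    (1.2, 0.5, 1_000_000),
    (0.4, 2.5, 1_000_000),
]

total = sum(segment_damage(*s) for s in segments)
print(total)
```

In a real assessment the segment list would come from intersecting the railway network geometry with the hazard raster, and the totals would be aggregated per region and per climate scenario (e.g. the RCP 8.5 ensemble mentioned above).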

  15. Enabling interoperability in Geoscience with GI-suite

    Science.gov (United States)

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each one focusing on a specific functionality: discovery, access and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) that must flow seamlessly from the providers to the consumers. Different international and community standards are now supported by GI-suite, making possible the successful deployment of GI-suite in many international projects and initiatives (such as GEOSS, NSF BCube and several EU funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, Geonetwork, THREDDS Data Server, Hyrax Server, etc.). The support for each individual standard is provided by means of specific GI-suite components, called accessors. On the consumer application side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, Openlayers, OGC W*S, OAI-PMH clients, etc.). The support for each individual standard is provided by means of specific profiler components. The GI-suite can be used in different scenarios by different actors: - A data provider having a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access) - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that are already publishing data according to well-known standards. - A community can deploy and configure GI-suite to build a community (or project-specific) broker: GI-suite can broker a set of community related repositories and
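The accessor/profiler mediation pattern described above can be sketched schematically: accessors adapt each heterogeneous publishing service to a common internal record model, the broker mediates discovery across all of them, and profilers render results for each consumer protocol. The class names, record fields and output format below are invented for illustration; the real GI-suite components are far richer.

```python
# Schematic sketch of the accessor/profiler brokering pattern (not GI-suite code).

class Accessor:
    """Adapts one heterogeneous publishing service to a common record model."""
    def harvest(self):
        raise NotImplementedError

class DublinCoreAccessor(Accessor):
    """Hypothetical accessor for a Dublin Core-style metadata source."""
    def __init__(self, records):
        self.records = records
    def harvest(self):
        # Map source-specific fields onto the broker's internal model.
        return [{"title": r["dc:title"], "id": r["dc:identifier"]}
                for r in self.records]

class Broker:
    """Mediates between accessors (providers) and profilers (consumers)."""
    def __init__(self, accessors):
        self.accessors = accessors
    def discover(self, keyword):
        hits = []
        for acc in self.accessors:
            hits += [r for r in acc.harvest()
                     if keyword.lower() in r["title"].lower()]
        return hits

def csw_profiler(records):
    """Render internal records for a CSW-style consumer (schematic output)."""
    return [f"<Record id='{r['id']}'>{r['title']}</Record>" for r in records]

broker = Broker([DublinCoreAccessor([
    {"dc:title": "Sea Surface Temperature", "dc:identifier": "sst-01"},
])])
print(csw_profiler(broker.discover("temperature")))
```

Adding support for a new provider standard then means writing one new accessor, and supporting a new consumer protocol means writing one new profiler, without touching the broker core — which is the key design property the abstract describes.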

  16. Scenario-based tsunami risk assessment using a static flooding approach and high-resolution digital elevation data: An example from Muscat in Oman

    Science.gov (United States)

    Schneider, Bastian; Hoffmann, Gösta; Reicherter, Klaus

    2016-04-01

    Knowledge of tsunami risk and vulnerability is essential to establish a well-adapted Multi Hazard Early Warning System, land-use planning and emergency management. As the tsunami risk for the coastline of Oman is still under discussion and remains enigmatic, various scenarios based on historical tsunamis were created. The suggested inundation and run-up heights were projected onto the modern infrastructural setting of the Muscat Capital Area. Furthermore, possible impacts of the worst-case tsunami event for Muscat are discussed. The established Papathoma Tsunami Vulnerability Assessment Model was used to model the structural vulnerability of the infrastructure for a 2 m tsunami scenario, representing the 1945 tsunami, and for a 5 m tsunami scenario in Muscat. Considering structural vulnerability, the results suggest a minor tsunami risk for the 2 m tsunami scenario, as the flooding is mainly confined to beaches and wadis. In particular, traditional brick buildings, still predominant in numerous rural suburbs, and a prevalently coast-parallel road network lead to an increased tsunami risk. In contrast, the 5 m tsunami scenario reveals extensively inundated areas, with up to 48% of the buildings flooded, and consequently a significantly higher tsunami risk. We expect up to 60,000 damaged buildings and up to 380,000 residents directly affected in the Muscat Capital Area, accompanied by a significant loss of life and damage to vital infrastructure. The rapid urbanization processes in the Muscat Capital Area, predominantly in areas along the coast, in combination with infrastructural, demographic and economic growth, will further increase the tsunami risk, which emphasizes the importance of tsunami risk assessment in Oman.
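The static flooding approach named in the title is often called the "bathtub" method: every elevation cell at or below the assumed run-up height counts as inundated. A toy sketch on an invented elevation grid follows; real assessments use high-resolution DEMs and usually also check hydrological connectivity to the sea, which this sketch omits.

```python
# Sketch of static ("bathtub") flood mapping on a digital elevation model:
# a cell is inundated if its elevation is at or below the run-up height.
# The 3x3 grid below is invented for illustration.

def flooded_cells(dem, runup_m):
    """Return (row, col) indices of all DEM cells at or below the run-up height."""
    return [(i, j)
            for i, row in enumerate(dem)
            for j, elev in enumerate(row)
            if elev <= runup_m]

# Hypothetical elevation grid (metres above sea level).
dem = [
    [0.5, 1.8, 6.0],
    [1.2, 4.5, 7.0],
    [3.0, 5.5, 9.0],
]

print(len(flooded_cells(dem, 2.0)))  # cells inundated in the 2 m scenario
print(len(flooded_cells(dem, 5.0)))  # cells inundated in the 5 m scenario
```

The jump in inundated cells between the 2 m and 5 m run-up heights mirrors, in miniature, the contrast the abstract reports between the confined 1945-type flooding and the extensive 5 m worst case.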

  17. Building a multidisciplinary e-infrastructure for the NextData Community

    Science.gov (United States)

    Nativi, Stefano; Rorro, Marco; Mazzetti, Paolo; Fiameni, Giuseppe; Papeschi, Fabrizio; Carpenè, Michele

    2014-05-01

    In 2012, Italy decided to launch a national initiative called NextData (http://www.nextdataproject.it/): a national system for the retrieval, storage, access and diffusion of environmental and climate data from mountain and marine areas. NextData is funded by the Research and University Ministry, as a "Project of Interest". In 2013, NextData funded a "special project", the NextData System of Systems Infrastructure project (ND-SoS-Ina). The main objective is to design, build and operate in production the NextData multidisciplinary and multi-organizational e-infrastructure for the publication and sharing of its resources (e.g. data, services, vocabularies, models). SoS-Ina realizes the NextData general portal, implementing interoperability among the data archives produced by NextData. The Florentine Division of the Institute of Atmospheric Pollution Research of CNR (CNR-IIA) and CINECA run the project. SoS-Ina (http://essi-lab.eu/nextdata/sosina/) decided to adopt a "System of Systems" (SoS) approach based on a brokering architecture. This has been pursued by applying the brokering technology first developed by the EC-FP7 EuroGEOSS project (http://www.eurogeoss.eu/broker/Pages/AbouttheEuroGEOSSBroker.aspx) and more recently consolidated by the international programme GEOSS (Global Earth Observation System of Systems) of GEO (Group on Earth Observation) -see http://www.earthobservations.org/documents/geo_ix/20111122_geoss_implementation_highlights.pdf. The NextData general Portal architecture definition will proceed in accordance with the requirements elicited by user communities. The portal will rely on services and interfaces offered by the brokering middleware and will be based on Liferay (http://www.liferay.com/). Liferay is free and open source and provides many built-in applications for social collaboration, content and document management. Liferay is also configurable for high availability. The project considers three distinct phases and related

  18. The energy consumption of traffic 1990 - 2035 - Results of scenarios I - IV

    International Nuclear Information System (INIS)

    Keller, M.

    2007-01-01

    This comprehensive report for the Swiss Federal Office of Energy (SFOE) presents four scenarios concerning the development of energy consumption in the traffic sector for the period 1990 - 2035. The four scenarios - status quo, increased co-operation between the state and the economy with various energy levies, global reduction of energy consumption and, finally, scenario IV 'on the way to a 2000-Watt Society' - are briefly described. The areas examined include road, rail and air traffic as well as 'off-road' traffic. Infrastructure developments are commented on. The four scenarios are examined for various sensitivities including high gross domestic product GDP, high prices and a warmer climate. Alternative fuels are looked at, as are further factors such as fuel tourism, pollutant emissions and costs. The results of the sensitivity analyses are compared and discussed and the necessary instruments are examined. The report is complemented by a comprehensive appendix.

  19. Effective Utilization of Resources and Infrastructure for a Spaceport Network Architecture

    Science.gov (United States)

    Gill, Tracy; Larson, Wiley; Mueller, Robert; Roberson, Luke

    2012-01-01

    Providing routine, affordable access to a variety of orbital and deep space destinations requires an intricate network of ground, planetary surface, and space-based spaceports like those on Earth (land and sea), in various Earth orbits, and on other extraterrestrial surfaces. Advancements in technology and international collaboration are critical to establish a spaceport network that satisfies the requirements for private and government research, exploration, and commercial objectives. Technologies, interfaces, assembly techniques, and protocols must be adapted to enable mission critical capabilities and interoperability throughout the spaceport network. The conceptual space mission architecture must address the full range of required spaceport services, from managing propellants for a variety of spacecraft to governance structure. In order to accomplish affordability and sustainability goals, the network architecture must consider deriving propellants from in situ planetary resources to the maximum extent possible. Water on the Moon and Mars, Mars' atmospheric CO2, and O2 extracted from lunar regolith are examples of in situ resources that could be used to generate propellants for various spacecraft, orbital stages and trajectories, and the commodities to support habitation and human operations at these destinations. The ability to use in-space fuel depots containing in situ derived propellants would drastically reduce the mass required to launch long-duration or deep space missions from Earth's gravity well. Advances in transformative technologies and common capabilities, interfaces, umbilicals, commodities, protocols, and agreements will facilitate a cost-effective, safe, reliable infrastructure for a versatile network of Earth- and extraterrestrial spaceports. 
Defining a common infrastructure on Earth, planetary surfaces, and in space, as well as deriving propellants from in situ planetary resources to construct in-space propellant depots to serve the spaceport

  20. Enabling Research without Geographical Boundaries via Collaborative Research Infrastructures

    Science.gov (United States)

    Gesing, S.

    2016-12-01

    Collaborative research infrastructures on a global scale for earth and space sciences face a plethora of challenges, from technical implementations to organizational aspects. Science gateways - also known as virtual research environments (VREs) or virtual laboratories - address part of these challenges by providing end-to-end solutions that let researchers focus on their specific research questions without the need to become acquainted with the technical details of the complex underlying infrastructures. In general, they provide a single point of entry to tools and data irrespective of organizational boundaries and thus make scientific discoveries easier and faster. The importance of science gateways has been recognized at both the national and international levels by funding bodies and organizations. For example, the US NSF has just funded a Science Gateways Community Institute, which offers support, consultancy and openly accessible software repositories for users and developers; Horizon 2020 provides funding for virtual research environments in Europe, which has led to projects such as VRE4EIC (A Europe-wide Interoperable Virtual Research Environment to Empower Multidisciplinary Research Communities and Accelerate Innovation and Collaboration); national or continental research infrastructures such as XSEDE in the USA, Nectar in Australia or EGI in Europe support the development and uptake of science gateways; the global initiatives International Coalition on Science Gateways, the RDA Virtual Research Environment Interest Group as well as the IEEE Technical Area on Science Gateways have been founded to provide global leadership on future directions for science gateways in general and raise awareness of science gateways. This presentation will give an overview of these projects and initiatives that aim to support domain researchers and developers with measures for the efficient creation of science gateways, for increasing their usability and sustainability

  1. Regional climate scenarios for use in Nordic water resources studies

    DEFF Research Database (Denmark)

    Rummukainen, Markku; Räisänen, J.; Bjørge, D.

    2003-01-01

    in the Nordic region than in the global mean, regional increases and decreases in net precipitation, longer growing season, shorter snow season etc. These in turn affect runoff, snowpack, groundwater, soil frost and moisture, and thus hydropower production potential, flooding risks etc. Regional climate models … users of water resources scenarios are the hydropower industry, dam safety authorities and planners of other lasting infrastructure exposed to precipitation, river flows and flooding.

  2. Functional requirements document for NASA/MSFC Earth Science and Applications Division: Data and information system (ESAD-DIS). Interoperability, 1992

    Science.gov (United States)

    Stephens, J. Briscoe; Grider, Gary W.

    1992-01-01

    These Earth Science and Applications Division-Data and Information System (ESAD-DIS) interoperability requirements are designed to quantify the Earth Science and Applications Division's hardware and software requirements in terms of communications between personal and visualization workstations and mainframe computers. The electronic mail requirements and local area network (LAN) requirements are addressed. These interoperability requirements are top-level requirements framed around defining the existing ESAD-DIS interoperability and projecting known near-term requirements for both operational support and management planning. Detailed requirements will be submitted on a case-by-case basis. This document is also intended as an overview of ESAD-DIS interoperability for newcomers and management not familiar with these activities. It is intended as background documentation to support requests for resources and support requirements.

  3. Emerging Requirements for Technology Management: A Sector-based Scenario Planning Approach

    Directory of Open Access Journals (Sweden)

    Simon Patrick Philbin

    2013-09-01

    Identifying the emerging requirements for technology management will help organisations to prepare for the future and remain competitive. Indeed, technology management as a discipline needs to develop and respond to societal and industrial needs as well as the corresponding technology challenges. Therefore, following a review of technology forecasting methodologies, a sector-based scenario planning approach has been used to derive the emerging requirements for technology management. This structured framework provided an analytical lens to focus on the requirements for managing technology in the healthcare, energy and higher education sectors over the next 5-10 years. These requirements include the need for new business models to support the adoption of technologies; integration of new technologies with existing delivery channels; management of technology options including R&D project management; technology standards, validation and interoperability; and decision-making tools to support technology investment.

  4. Assessment of Collaboration and Interoperability in an Information Management System to Support Bioscience Research

    Science.gov (United States)

    Myneni, Sahiti; Patel, Vimla L.

    2009-01-01

    Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900

  5. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization history of infants was always identical in the different systems. CDS-generated reports were consistent with immunization guidelines, and the calculated next visit times were accurate. Interoperability is rare or nonexistent between IISs. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information.
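    The consistency check described in the abstract - exchanging records between two registries and verifying that the retrieved histories are identical - can be sketched as follows. This is a minimal, hypothetical illustration (class and field names are invented), not the authors' HL7 v3 implementation:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Immunization:
        vaccine: str
        date: str  # ISO 8601, e.g. "2013-04-02"

    @dataclass
    class IIS:
        """Toy immunization information system exposing service-style calls."""
        records: dict = field(default_factory=dict)  # infant_id -> [Immunization]

        def get_history(self, infant_id):
            # Synchronous operation call: return the full, ordered history.
            return sorted(self.records.get(infant_id, []), key=lambda r: r.date)

        def put_record(self, infant_id, record):
            self.records.setdefault(infant_id, []).append(record)

    def synchronize(source: IIS, target: IIS, infant_id: str):
        """Push records the target is missing (a stand-in for the HL7 exchange)."""
        known = {(r.vaccine, r.date) for r in target.get_history(infant_id)}
        for rec in source.get_history(infant_id):
            if (rec.vaccine, rec.date) not in known:
                target.put_record(infant_id, rec)

    clinic, registry = IIS(), IIS()
    clinic.put_record("infant-001", Immunization("BCG", "2013-01-10"))
    clinic.put_record("infant-001", Immunization("HepB-1", "2013-02-09"))
    synchronize(clinic, registry, "infant-001")

    # Histories retrieved from both systems must be identical, as in the evaluation.
    assert clinic.get_history("infant-001") == registry.get_history("infant-001")
    ```

    A real deployment would carry these records as HL7 v3 messages over SOAP services; the sketch only captures the synchronization invariant the study evaluated.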

  6. Security infrastructure for dynamically provisioned cloud infrastructure services

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; de Laat, C.; Lopez, D.R.; Morales, A.; García-Espín, J.A.; Pearson, S.; Yee, G.

    2013-01-01

    This chapter discusses conceptual issues, basic requirements and practical suggestions for designing dynamically configured security infrastructure provisioned on demand as part of the cloud-based infrastructure. This chapter describes general use cases for provisioning cloud infrastructure services

  7. Language interoperability for high-performance parallel scientific components

    International Nuclear Information System (INIS)

    Elliot, N; Kohn, S; Smolinski, B

    1999-01-01

    With the increasing complexity and interdisciplinary nature of scientific applications, code reuse is becoming increasingly important in scientific computing. One method for facilitating code reuse is the use of component technologies, which have been used widely in industry. However, components have only recently worked their way into scientific computing. Language interoperability is an important underlying technology for these component architectures. In this paper, we present an approach to language interoperability for a high-performance, parallel component architecture being developed by the Common Component Architecture (CCA) group. Our approach is based on Interface Definition Language (IDL) techniques. We have developed a Scientific Interface Definition Language (SIDL), as well as bindings to C and Fortran. We have also developed a SIDL compiler and run-time library support for reference counting, reflection, object management, and exception handling (Babel). Results from using Babel to call a standard numerical solver library (written in C) from C and Fortran show that the cost of using Babel is minimal, whereas the savings in development time and the benefits of object-oriented development support for C and Fortran far outweigh the costs.
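    The run-time services the abstract names - reference counting and language-neutral exception handling around a native library - can be illustrated conceptually. This is a sketch in Python with invented class names, not Babel's actual generated bindings or API:

    ```python
    class SIDLObject:
        """Toy stand-in for IDL run-time object support: reference counting."""
        def __init__(self):
            self._refs = 1

        def add_ref(self):
            self._refs += 1

        def delete_ref(self):
            self._refs -= 1
            if self._refs == 0:
                self.destroy()

        def destroy(self):
            pass  # a real binding would release native resources here

    class SIDLException(Exception):
        """Language-neutral exception carried across the binding boundary."""

    class SolverHandle(SIDLObject):
        """Wrapper a generated binding might place around a native solver."""
        def solve(self, rhs):
            if not rhs:
                # Translate a native error code into a catchable exception.
                raise SIDLException("empty right-hand side")
            return [x / 2.0 for x in rhs]  # pretend the C library solved 2x = b

    h = SolverHandle()
    h.add_ref()                      # a second component shares the handle
    print(h.solve([2.0, 4.0]))      # -> [1.0, 2.0]
    h.delete_ref(); h.delete_ref()  # both holders release; object is destroyed
    ```

    The point of the design is that each language binding sees the same interface contract, while the run-time handles object lifetime and error translation uniformly.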

  8. Enabling IoT ecosystems through platform interoperability

    OpenAIRE

    Bröring, Arne; Schmid, Stefan; Schindhelm, Corina-Kim; Khelil, Abdelmajid; Kabisch, Sebastian; Kramer, Denis; Le Phuoc, Danh; Mitic, Jelena; Anicic, Darko; Teniente López, Ernest

    2017-01-01

    Today, the Internet of Things (IoT) comprises vertically oriented platforms for things. Developers who want to use them need to negotiate access individually and adapt to the platform-specific API and information models. Having to perform these actions for each platform often outweighs the possible gains from adapting applications to multiple platforms. This fragmentation of the IoT and the missing interoperability result in high entry barriers for developers and prevent the emergence of broa...

  9. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    OpenAIRE

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2007-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastr...

  10. Understanding and enhancing future infrastructure resiliency: a socio-ecological approach.

    Science.gov (United States)

    Sage, Daniel; Sircar, Indraneel; Dainty, Andrew; Fussey, Pete; Goodier, Chris

    2015-07-01

    The resilience of any system, human or natural, centres on its capacity to adapt its structure, but not necessarily its function, to a new configuration in response to long-term socio-ecological change. In the long term, therefore, enhancing resilience involves more than simply improving a system's ability to resist an immediate threat or to recover to a stable past state. However, despite the prevalence of adaptive notions of resilience in academic discourse, it is apparent that infrastructure planners and policies largely continue to struggle to comprehend longer-term system adaptation in their understanding of resilience. Instead, a short-term, stable system (STSS) perspective on resilience is prevalent. This paper seeks to identify and problematise this perspective, presenting research based on the development of a heuristic 'scenario-episode' tool to address, and challenge, it in the context of United Kingdom infrastructure resilience. The aim is to help resilience practitioners to understand better the capacities of future infrastructure systems to respond to natural and malicious threats. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.

  11. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    Science.gov (United States)

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  12. A Vision for Open Cyber-Scholarly Infrastructures

    Directory of Open Access Journals (Sweden)

    Costantino Thanos

    2016-05-01

    The characteristics of modern science, i.e., data-intensive, multidisciplinary, open, and heavily dependent on Internet technologies, entail the creation of a linked scholarly record that is online and open. Instrumental in making this vision happen is the development of the next generation of Open Cyber-Scholarly Infrastructures (OCIs), i.e., enablers of an open, evolvable, and extensible scholarly ecosystem. The paper delineates the evolving scenario of the modern scholarly record and describes the functionality of future OCIs as well as the radical changes in scholarly practices, including new reading, learning, and information-seeking practices enabled by OCIs.

  13. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    Science.gov (United States)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is beginning, including automated workflows to process related data from different sources.

  14. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  15. Fundamental Data Standards for Science Data System Interoperability and Data Correlation

    Science.gov (United States)

    Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel

    The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promises of correlated data and interoperable systems. However, there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly-derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. However, at a more fundamental level, data correlation and system interoperability are dependent on a relatively few shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS Data Standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. 
This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that
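    The interplay of the three kinds of standards named above - a shared data dictionary, a product registry with common identifiers, and information content attached to products - can be sketched in miniature. All terms, identifiers, and repository names below are invented for illustration; this is not the PDS data model:

    ```python
    # A shared data dictionary: controlled vocabulary with definitions and units.
    data_dictionary = {
        "emission_angle": {"definition": "Angle between surface normal and viewer",
                           "unit": "deg"},
        "exposure_duration": {"definition": "Shutter-open time", "unit": "s"},
    }

    # Two independent repositories registering products under common identifiers.
    registries = {
        "repo_a": {"urn:img:mars:001": {"target": "Mars", "emission_angle": 12.5}},
        "repo_b": {"urn:img:mars:002": {"target": "Mars", "exposure_duration": 0.08}},
    }

    def locate(product_id):
        """Resolve a product identifier across the federated registries."""
        for repo, products in registries.items():
            if product_id in products:
                return repo, products[product_id]
        raise KeyError(product_id)

    def describe(metadata):
        """Attach dictionary definitions so remote users can interpret the labels."""
        return {k: {"value": v, **data_dictionary.get(k, {})}
                for k, v in metadata.items() if k != "target"}

    repo, meta = locate("urn:img:mars:002")
    print(repo, describe(meta))
    ```

    Because both repositories share the dictionary and the identifier scheme, a client that has never seen either repository can still find a product and interpret its metadata - the essence of the correlation and interoperability argument above.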

  16. Smart hospitality—Interconnectivity and interoperability towards an ecosystem

    OpenAIRE

    Buhalis, Dimitrios; Leung, Rosanna

    2018-01-01

    The Internet and cloud computing changed the way businesses operate. Standardised web-based applications simplify data interchange, which allows internal applications and business partners' systems to become interconnected and interoperable. This study conceptualises the smart and agile hospitality enterprises of the future, and proposes a smart hospitality ecosystem that adds value to all stakeholders. Internal data from applications among all stakeholders, consolidated with external environment ...

  17. Interoperability between Fingerprint Biometric Systems: An Empirical Study

    OpenAIRE

    Gashi, I.; Mason, S.; Lugini, L.; Marasco, E.; Cukic, B.

    2014-01-01

    Fingerprints are likely the most widely used biometric in commercial as well as law enforcement applications. With the expected rapid growth of fingerprint authentication in mobile devices, their importance justifies increased demands for dependability. An increasing number of new sensors, applications and a diverse user population also intensify concerns about interoperability in fingerprint authentication. In most applications, fingerprints captured for user enrollment with one device may...

  18. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    Science.gov (United States)

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  19. Cyber Security Threats to Safety-Critical, Space-Based Infrastructures

    Science.gov (United States)

    Johnson, C. W.; Atencia Yepez, A.

    2012-01-01

    Space-based systems play an important role within national critical infrastructures. They are being integrated into advanced air-traffic management applications, rail signalling systems, energy distribution software etc. Unfortunately, the end users of communications, location sensing and timing applications often fail to understand that these infrastructures are vulnerable to a wide range of security threats. The following pages focus on concerns associated with potential cyber-attacks. These are important because future attacks may invalidate many of the safety assumptions that support the provision of critical space-based services. These safety assumptions are based on standard forms of hazard analysis that ignore cyber-security considerations. This is a significant limitation when, for instance, security attacks can simultaneously exploit multiple vulnerabilities in a manner that would never occur without a deliberate enemy seeking to damage space-based systems and ground infrastructures. We address this concern through the development of a combined safety and security risk assessment methodology. The aim is to identify attack scenarios that justify the allocation of additional design resources so that safety barriers can be strengthened to increase our resilience against security threats.

  20. Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI

    NARCIS (Netherlands)

    Miao, Yongwu; Boon, Jo; Van der Klink, Marcel; Sloep, Peter; Koper, Rob

    2009-01-01

    Miao, Y., Boon, J., Van der Klink, M., Sloep, P. B., & Koper, R. (2011). Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI. In F. Lazarinis, S. Green, & E. Pearson (Eds.), E-Learning Standards and Interoperability: Frameworks

  1. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. Following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is an open-source Java facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. The TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses; conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. 
This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and
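    The style of testing described above - running declarative assertions against an XML instance returned by a service - can be sketched without TEAM Engine itself. The element names below are invented for the sketch and are not taken from an OGC schema or CTL:

    ```python
    import xml.etree.ElementTree as ET

    # A capabilities-style response, as a service under test might return it.
    response = """
    <Capabilities version="1.0.0">
      <Service><Title>Demo WFS</Title></Service>
      <FeatureTypes><FeatureType name="rivers"/></FeatureTypes>
    </Capabilities>
    """

    root = ET.fromstring(response)

    # Schematron-like assertions: each is a (description, boolean) pair.
    assertions = [
        ("document root is Capabilities", root.tag == "Capabilities"),
        ("version attribute is present", root.get("version") is not None),
        ("service title is non-empty",
         (root.findtext("Service/Title") or "").strip() != ""),
        ("at least one feature type advertised",
         len(root.findall(".//FeatureType")) >= 1),
    ]

    failures = [desc for desc, ok in assertions if not ok]
    print("PASS" if not failures else f"FAIL: {failures}")
    ```

    A real compliance suite additionally issues the HTTP requests, validates against the official XML Schemas, and checks error responses; the sketch only shows the assertion-evaluation pattern.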

  2. MediCoordination: a practical approach to interoperability in the Swiss health system.

    Science.gov (United States)

    Müller, Henning; Schumacher, Michael; Godel, David; Omar, Abu Khaled; Mooser, Francois; Ding, Sandrine

    2009-01-01

    Interoperability and data exchange between partners in the health sector is seen as one of the important domains that can improve care processes and, in the long run, also decrease costs of the health care system. Data exchange can ensure that the data on the patient are as complete as possible, avoiding potential mistreatment, and it can avoid double examinations if the required data are already available. On the other hand, health data is a sensitive matter for many people, and strong protections need to be implemented to guard patient data against misuse, as well as tools to let patients manage their own data. Many countries have eHealth initiatives in preparation or already implemented. However, health data exchange on a large scale still has a fairly long way to go, as the political processes for global solutions are often complicated. In the MediCoordination project a pragmatic approach is taken, attempting to integrate several partners in health care on a regional scale. In parallel with the Swiss eHealth strategy that is currently being elaborated by the Swiss confederation, medium-sized hospitals and external partners in particular are targeted in MediCoordination to implement concrete added-value scenarios of information exchange between hospitals and external medical actors.

  3. Rich services in interoperable Learning Designs: can the circle be squared?

    OpenAIRE

    Griffiths, David

    2009-01-01

    Griffiths, D. (2009). Rich services in interoperable Learning Designs: Can the circle be squared?. Presented at Opening Up Learning Design, European LAMS and Learning Design Conference 2009. July, 6-9, 2009, Milton Keynes, United Kingdom.

  4. Enterprise interoperability with SOA: a survey of service composition approaches

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; Goncalves da Silva, Eduardo; van Sinderen, Marten J.; Quartel, Dick; Ferreira Pires, Luis

    Service-oriented architecture (SOA) claims to facilitate the construction of flexible and loosely coupled business applications, and therefore is seen as an enabling factor for enterprise interoperability. The concept of service, which is central to SOA, is very convenient to address the matching of

  5. Report by the study committee related to data held by energy network and infrastructure managers

    International Nuclear Information System (INIS)

    2017-01-01

    This study aimed at providing a view of the status of data related to energy and held by network and infrastructure managers and operators. It is notably based on about fifty hearings of regulated energy operators, providers, representatives of electricity producers, local authorities, representatives of public bodies awarding concessions and exploitation contracts, and consumer associations. The authors also met heat and water network operators, IT service companies, start-ups in the energy sector, telecommunications operators, and representatives of French and European institutional bodies. Fifteen propositions have been formulated which address the imperative of data consistency, quality and interoperability; the clarification of the roles of the actors for an efficient governance of networks; the necessity of consolidating consumers' confidence in the management of their data; and the activities of regulation (readability of analysis criteria, predictability of the resulting action).

  6. Nato Multinational Brigade Interoperability: Issues, Mitigating Solutions and is it Time for a Nato Multinational Brigade Doctrine?

    Directory of Open Access Journals (Sweden)

    Schiller Mark

    2016-06-01

    Multinational Brigade Operations involving NATO and its European Partners are the norm in the post-Cold War era. Commonplace today are multinational brigades, composed of staffs and subordinate units representing almost every NATO country and partner, participating in training exercises or actual operations in both the European and Southwest Asian theatres. Leadership challenges are prevalent for the multinational brigade commander and his staff, especially the challenges they face in achieving an effective level of brigade interoperability in order to conduct successful operations in NATO's present and future operating environments. The purpose of this paper is twofold: to examine the major interoperability obstacles a multinational brigade commander and his staff are likely to encounter during the planning and execution of brigade operations; and to recommend actions and measures a multinational brigade commander and his staff can implement to facilitate interoperability in a multinational brigade operating environment. Several key interoperability topics considered integral to effective multinational brigade operations are examined and analysed, including understanding partner unit capabilities and limitations facilitated by an integration plan, appropriate command and support relationships, compatible communications, synchronized intelligence and information collection, effective liaison, and fratricide prevention. The conclusion urges the development of a NATO land brigade doctrine, considering doctrine's critical importance to effective brigade command and control interoperability and the missions a land brigade can expect to encounter in future NATO operating environments as part of the NATO Very High Readiness Joint Task Force (VJTF).

  7. Assessing the Climate Resilience of Transport Infrastructure Investments in Tanzania

    Science.gov (United States)

    Hall, J. W.; Pant, R.; Koks, E.; Thacker, S.; Russell, T.

    2017-12-01

    Whilst there is an urgent need for infrastructure investment in developing countries, there is a risk that poorly planned and built infrastructure will introduce new vulnerabilities. As climate change increases the magnitude and frequency of natural hazard events, disruptive infrastructure failures are likely to become more frequent. It is therefore important that infrastructure planning and investment be underpinned by climate risk assessment that can inform adaptation planning. Tanzania's rapid economic growth is placing considerable strain on the country's transportation infrastructure (roads, railways, shipping and aviation), especially at the port of Dar es Salaam and its linking transport corridors. A growing number of natural hazard events, in particular flooding, are impacting the reliability of this already over-used network. Here we report on a new methodology to analyse vulnerabilities and risks due to failures of key locations in the intermodal transport network of Tanzania, including strategic connectivity to neighboring countries. To perform the national-scale risk analysis we utilize a system-of-systems methodology. The main components of this general risk assessment, when applied to transportation systems, are: (1) assembling data on spatially coherent extreme hazards and intermodal transportation networks; (2) intersecting hazards with transport network models to initiate failure conditions that trigger failure propagation across interdependent networks; (3) quantifying failure outcomes in terms of social impacts (customers/passengers disrupted) and/or macroeconomic consequences (across multiple sectors); and (4) simulating, testing and collecting multiple failure scenarios to perform an exhaustive risk assessment in terms of probabilities and consequences. The methodology is being used to pinpoint vulnerability and reduce climate risks to transport infrastructure investments.
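
Steps (2)-(4) above can be sketched on a toy network: fail the nodes a hazard exposes, measure the customers cut off from the port, and weight each scenario by its probability to obtain an expected disruption. The network, customer counts and scenario probabilities below are invented for illustration; they are not the Tanzania data.

```python
# Toy sketch of the hazard-intersection and scenario-aggregation steps.
# All node names, customer counts and probabilities are invented.
from collections import deque

edges = {"port": ["hub_a"], "hub_a": ["port", "hub_b", "city_1"],
         "hub_b": ["hub_a", "city_2"], "city_1": ["hub_a"], "city_2": ["hub_b"]}
customers = {"city_1": 40_000, "city_2": 25_000}

def reachable(source, failed):
    """BFS over the intact part of the network (step 2: propagate failure)."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in failed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def disruption(failed_nodes):
    """Step 3: customers no longer connected to the port."""
    ok = reachable("port", set(failed_nodes))
    return sum(n for city, n in customers.items() if city not in ok)

# Step 4: aggregate over flood scenarios (annual probability, failed nodes).
scenarios = [(0.10, ["hub_b"]), (0.02, ["hub_a"])]
expected = sum(p * disruption(f) for p, f in scenarios)
print(expected)  # -> 3800.0 expected customer-disruptions per year
```

A full analysis replaces the toy graph with the intermodal network, hazard footprints and macroeconomic loss functions, but the risk arithmetic has this shape.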

  8. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    Science.gov (United States)

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the ability to map complex datasets and enabling interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
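
The core idea of provenance that "moves with the data" can be sketched as a hash-chained, signed record. This is a hedged illustration only: the record layout, HMAC signing and key handling below are assumptions for demonstration, not the Data Provenance Toolkit's actual format or FHIR's provenance resource.

```python
# Sketch of a portable provenance record: each record is chained to its
# predecessor's signature and signed, so a directory or blockchain entry
# can be validated later. Layout and key handling are illustrative only.
import hashlib
import hmac
import json

SECRET = b"demo-key"  # stand-in for real key management

def sign_record(record: dict, prev_digest: str) -> dict:
    """Chain the record to its predecessor and attach a signature."""
    record = dict(record, prev=prev_digest)
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify_record(record: dict) -> bool:
    """Recompute the signature over everything except 'sig' and compare."""
    unsigned = {k: v for k, v in record.items() if k != "sig"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expect = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expect, record["sig"])

r1 = sign_record({"actor": "lab-a", "action": "created"}, prev_digest="")
r2 = sign_record({"actor": "hie-b", "action": "transformed"},
                 prev_digest=r1["sig"])
print(verify_record(r1), verify_record(r2))  # -> True True
```

Because each record embeds its predecessor's signature, tampering with any upstream step invalidates every later verification, which is the property a provenance chain needs regardless of whether it lives in a directory or a blockchain.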

  9. Next-generation navigational infrastructure and the ATLAS event store

    International Nuclear Information System (INIS)

    Gemmeren, P van; Malon, D; Nowak, M

    2014-01-01

    The ATLAS event store employs a persistence framework with extensive navigational capabilities. These include real-time back navigation to upstream processing stages, externalizable data object references, navigation from any data object to any other both within a single file and across files, and more. The 2013-2014 shutdown of the Large Hadron Collider provides an opportunity to enhance this infrastructure in several ways that both extend these capabilities and allow the collaboration to better exploit emerging computing platforms. Enhancements include redesign with efficient file merging in mind, content-based indices in optimized reference types, and support for forward references. The latter provide the potential to construct valid references to data before those data are written, a capability that is useful in a variety of multithreading, multiprocessing, distributed processing, and deferred processing scenarios. This paper describes the architecture and design of the next generation of ATLAS navigational infrastructure.
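
The forward-reference capability described above can be illustrated with a toy store: a token is issued (and may be embedded in other objects) before its target is written, and dereferencing is deferred until the target exists. The class and method names below are invented for illustration and are not the ATLAS persistence framework's API.

```python
# Toy sketch of a forward reference: a valid token for data not yet written.
class EventStore:
    def __init__(self):
        self._objects = {}  # token -> data object
        self._next = 0

    def forward_ref(self) -> int:
        """Issue a token for data that has not been written yet."""
        self._next += 1
        return self._next

    def write(self, token: int, obj) -> None:
        """Write the target; the token becomes dereferenceable."""
        self._objects[token] = obj

    def deref(self, token: int):
        if token not in self._objects:
            raise LookupError("target not written yet; resolve later")
        return self._objects[token]

store = EventStore()
ref = store.forward_ref()            # reference exists before the data
downstream = {"track_ref": ref}      # safe to embed in another object now
store.write(ref, {"pt": 42.0})       # target arrives later (deferred write)
print(store.deref(downstream["track_ref"]))  # -> {'pt': 42.0}
```

Deferring resolution this way is what makes the pattern useful in multithreaded, multiprocess and deferred-processing scenarios: producers can publish references without waiting for consumers' data to land.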

  10. OpenICE medical device interoperability platform overview and requirement analysis.

    Science.gov (United States)

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  11. European environmental research infrastructures are going for common 30 years strategy

    Science.gov (United States)

    Asmi, Ari; Konjin, Jacco; Pursula, Antti

    2014-05-01

    Environmental research infrastructures are facilities, resources, systems and related services that are used by research communities to conduct top-level research. Environmental research addresses processes at very different time scales, and supporting research infrastructures must be designed as long-term facilities in order to meet the requirements of continuous environmental observation, measurement and analysis. This longevity makes environmental research infrastructures ideal structures to support long-term development in the environmental sciences. The ENVRI project is a collaborative action of the major European (ESFRI) environmental research infrastructures working towards increased co-operation and interoperability between the infrastructures. One of the key products of the ENVRI project is to combine the long-term plans of the individual infrastructures into a common strategy describing the vision and planned actions. The envisaged vision for environmental research infrastructures toward 2030 is to support a holistic understanding of our planet and its behavior. The development of a 'Standard Model of the Planet' is a common ambition: the challenge of defining an environmental standard model, a framework of all interactions within the Earth system, from the solid earth to near space. Indeed, scientists feel challenged to contribute to a 'Standard Model of the Planet' with data, models, algorithms and discoveries. Understanding the Earth system as an interlinked system requires a systems approach, and the environmental sciences are rapidly becoming a system-level science, mainly because modern science, engineering and society increasingly face complex problems that can only be understood in the context of the full overall system. The strategy of the supporting collaborating research infrastructures is based on developing three key factors for the environmental sciences: the technological, the cultural and the human capital.
The technological

  12. 'System-of-systems' approach for interdependent critical infrastructures

    International Nuclear Information System (INIS)

    Eusgeld, Irene; Nan, Cen; Dietz, Sven

    2011-01-01

    The study of the interdependencies within critical infrastructures (CI) is a growing field of research, as potential failure propagation among infrastructures may lead to cascades affecting all supply networks. New powerful methods are required to model and describe such 'systems-of-systems' (SoS) as a whole, and an overall model is required to provide security and reliability assessment taking into account various kinds of threats and failures. A significant challenge for such a model is the creation of 'what-if' scenarios for the analysis of interdependencies. In this paper the interdependencies between industrial control systems (ICS), in particular SCADA (Supervisory Control and Data Acquisition), and the underlying critical infrastructures are analyzed in order to address the vulnerabilities related to the coupling of these systems. The modeling alternatives for systems-of-systems, integrated versus coupled models, are discussed. An integrated model contains detailed low-level models of (sub)systems as well as a high-level model, covering all hierarchical levels. A coupled model, on the other hand, aggregates the simulated outputs of the low-level models as inputs at a higher level. Strengths and weaknesses of both approaches are analyzed, and a model architecture for SCADA and the 'system under control' is proposed. Furthermore, the HLA simulation standard is introduced and discussed as a promising approach to represent interdependencies between infrastructures. To demonstrate the capabilities of the HLA standard for the interdependency study, an exemplary application and some first results are also briefly presented.
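
The "coupled model" alternative can be sketched as two independent low-level simulators exchanging aggregated outputs each time step through a coordinator, rather than living inside one integrated model. The SCADA rule and grid dynamics below are invented toy behavior, not the paper's models or the HLA API.

```python
# Minimal sketch of a coupled systems-of-systems simulation: a SCADA model
# and a grid model run separately and exchange outputs each step.
# All dynamics and thresholds are invented for illustration.
def scada_step(grid_load: float) -> float:
    """Low-level model 1: choose a setpoint from the observed load."""
    return 1.0 if grid_load < 0.9 else 0.7   # shed load when stressed

def grid_step(setpoint: float, demand: float) -> float:
    """Low-level model 2: realized load given the SCADA setpoint."""
    return min(demand, setpoint)

def coupled_run(demands):
    """Coordinator: pass each model's output to the other, step by step."""
    load, trace = 0.5, []
    for demand in demands:
        setpoint = scada_step(load)          # SCADA sees last grid state
        load = grid_step(setpoint, demand)   # grid sees SCADA's setpoint
        trace.append(round(load, 2))
    return trace

print(coupled_run([0.6, 0.95, 1.2, 0.8]))  # -> [0.6, 0.95, 0.7, 0.8]
```

An HLA federation generalizes this coordinator: each simulator becomes a federate, and the run-time infrastructure handles the time-stepped exchange of published attributes instead of a simple loop.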

  13. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    Science.gov (United States)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving scientific understanding of the complex mechanisms driving the changes affecting our planet, and at identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and rendered available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of

  14. ForM@Ter: a French Solid Earth Research Infrastructure Project

    Science.gov (United States)

    Mandea, M.; Diament, M.; Jamet, O.; Deschamps-Ostanciaux, E.

    2017-12-01

    Recently, some noteworthy initiatives to develop efficient research e-infrastructures for the study of the Earth system have been set up. However, gaps between data availability and its scientific use still exist, either for technical reasons (big data issues) or because of the lack of dedicated support in terms of expert knowledge of the data, software availability, or data cost. The need for thematic cooperative platforms has been underlined over the last years, as has the need to create thematic centres designed to federate the scientific community of Earth observation. Four thematic data centres have been developed in France, covering the domains of ocean, atmosphere, land, and solid Earth sciences. For the solid Earth science community, a research infrastructure project named ForM@Ter was launched by the French Space Agency (CNES) and the National Centre for Scientific Research (CNRS), with the active participation of the National Institute for Geographical and Forestry Information (IGN). Currently, it relies on the contributions of scientists from more than 20 French Earth science laboratories. Preliminary analyses have shown that a focus on the determination of the shape and movements of the Earth's surface (ForM@Ter: Formes et Mouvements de la Terre) can federate a wide variety of scientific areas (earthquake cycle, tectonics, morphogenesis, volcanism, erosion dynamics, mantle rheology, geodesy) and offers many interfaces with other geoscience domains, such as glaciology or snow evolution. This choice motivates the design of an ambitious data distribution scheme, including a wide variety of sources - optical imagery, SAR, GNSS, gravity, satellite altimetry data, in situ observations (inclinometers, seismometers, etc.) - as well as a wide variety of processing techniques. In the evolving context of the current and forthcoming national and international e-infrastructures, the challenge of the project is to design a non

  15. Common business objects: Demonstrating interoperability in the oil and gas industry

    International Nuclear Information System (INIS)

    McLellan, S.G.; Abusalbi, N.; Brown, J.; Quinlivan, W.F.

    1997-01-01

    The PetroTechnical Open Software Corp. (POSC) was organized in 1990 to define technical methods that make it easier to design interoperable data solutions for oil and gas companies. When POSC rolls out seed implementations, oilfield service members must validate them, correct any errors or ambiguities, and champion these corrections into the original specifications before full integration into POSC-compliant commercial products. Organizations like POSC are assuming a new role of promoting the formation of projects in which E and P companies and vendors jointly test their pieces of the migration puzzle on small subsets of the whole problem. The authors describe three such joint projects. While confirming the value of such open cross-company cooperation, these cases also help to redefine interoperability in terms of business objects that will be common across oilfield companies, their applications, access software, data, or data stores.

  16. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    Science.gov (United States)

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; MD, Dipka Kalra

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross border semantic integration of data. PMID:27570649

  17. Network of Research Infrastructures for European Seismology (NERIES)—Web Portal Developments for Interactive Access to Earthquake Data on a European Scale

    OpenAIRE

    A. Spinuso; L. Trani; S. Rives; P. Thomy; F. Euchner; Danijel Schorlemmer; Joachim Saul; Andres Heinloo; R. Bossu; T. van Eck

    2009-01-01

    The Network of Research Infrastructures for European Seismology (NERIES) is a European Commission (EC) project whose focus is networking seismological observatories and research institutes into one integrated European infrastructure that provides access to data and data products for research. Seismological institutes and organizations in European and Mediterranean countries maintain large, geographically distributed data archives; this scenario suggested a design approach bas...

  18. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  19. Emissions Scenarios and Fossil-fuel Peaking

    Science.gov (United States)

    Brecha, R.

    2008-12-01

    Intergovernmental Panel on Climate Change (IPCC) emissions scenarios are based on detailed energy system models in which demographics, technology and economics are used to generate projections of future world energy consumption and, therefore, of greenhouse gas emissions. Built into the assumptions for these scenarios are estimates of the ultimately recoverable resources of various fossil fuels. There is a growing chorus of critics who believe that the true extent of recoverable fossil resources is much smaller than the amounts taken as a baseline for the IPCC scenarios. In the climate-optimist camp are those who contend that "peak oil" will lead to a switch to renewable energy sources, while others point out that high oil prices caused by supply limitations could very well lead to a transition to liquid fuels that actually increases total carbon emissions. We examine a third scenario in which high energy prices, which are correlated with increasing infrastructure, exploration and development costs, conspire to limit the potential for a switch to coal or natural gas for liquid fuels. In addition, the same increasing costs limit the potential for expansion of tar sand and shale oil recovery. Our qualitative model of the energy system, backed by data from short- and medium-term trends, offers a useful way to gain a sense of potential carbon emission bounds. A bound for 21st century emissions is investigated based on two assumptions: first, that extractable fossil-fuel resources follow the trends assumed by "peak oil" adherents, and second, that little is done in the way of climate mitigation policies. If resources, and perhaps more importantly extraction rates, of fossil fuels are limited compared to the assumptions in the emissions scenarios, a situation can arise in which emissions are supply-driven. However, we show that even in this "peak fossil-fuel" limit, carbon emissions are high enough to surpass the 550 ppm or 2°C climate protection guardrails. Some
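
A supply-driven emissions bound of the kind discussed above can be sketched with a Hubbert-style logistic for cumulative extraction and a constant airborne fraction converting cumulative carbon into a CO2 concentration. Every parameter value below (recoverable carbon, peak year, curve width, airborne fraction) is an illustrative assumption, not the authors' calibration.

```python
# Back-of-envelope sketch of a supply-driven CO2 bound. All parameter
# values are illustrative assumptions, not the abstract's numbers.
import math

URR_GTC = 1200.0        # assumed ultimately recoverable fossil carbon (GtC)
AIRBORNE_FRACTION = 0.45  # fraction of emitted carbon staying airborne
GTC_PER_PPM = 2.13      # ~2.13 GtC of atmospheric carbon per ppm CO2
PREINDUSTRIAL_PPM = 280.0

def cumulative_emissions(year, peak_year=2025.0, width=30.0):
    """Logistic (Hubbert-style) cumulative extraction in GtC by 'year'."""
    return URR_GTC / (1.0 + math.exp(-(year - peak_year) / width))

def co2_ppm(year):
    """Concentration implied by cumulative emissions to date."""
    added = AIRBORNE_FRACTION * cumulative_emissions(year) / GTC_PER_PPM
    return PREINDUSTRIAL_PPM + added

for year in (2000, 2050, 2100):
    print(year, round(co2_ppm(year), 1))
```

Even this crude sketch shows the structure of the argument: once cumulative extraction saturates at the assumed recoverable resource, the concentration it implies is fixed by the resource size and airborne fraction alone, independent of demand-side scenario detail.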

  20. Energy infrastructure in India: Profile and risks under climate change

    DEFF Research Database (Denmark)

    Garg, Amit; Naswa, Prakriti; Shukla, P.R.

    2015-01-01

    risks to energy infrastructures in India and details two case studies - a crude oil importing port and a western coast railway transporting coal. The climate vulnerability of the port has been mapped using an index, while that of the railway has been assessed through a damage function for the RCP 4.5 and 8.5 scenarios. Our analysis shows that risk management through adaptation is likely to be very expensive. The system risks can be even greater and might adversely affect energy security and access objectives. Aligning sustainable development and climate adaptation measures can deliver substantial co-benefits. The key policy recommendations include: i) mandatory vulnerability assessment of energy infrastructures to future climate risks; ii) inclusion of project and systemic risks in the vulnerability index; iii) adaptation funds for unmitigated climate risks; iv) continuous monitoring of climatic parameters