WorldWideScience

Sample records for interoperability

  1. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    would considerably alter the current privacy setting. First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would....... Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions...

  2. Data Interoperability

    Directory of Open Access Journals (Sweden)

    Pasquale Pagano

    2013-07-01

    In the context of scientific investigations, data have acquired an ever-growing leading role, while their large-scale, cross-community and cross-domain sharing has given rise to new investigation paradigms (Hey, Tansley, & Tolle, 2009). Unfortunately, data interoperability – a mandatory prerequisite for achieving the above scenarios – is still a difficult open research challenge. Both the “data” and “interoperability” concepts are hard to grasp in full and are perceived differently in diverse communities. This problem is further amplified when considered in the context of (global) research data infrastructures that are expected to serve a plethora of communities of practice (Lave & Wenger, 1991) potentially involved in very diverse application scenarios, each characterised by a specific sharing problem.

  3. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  4. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  5. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the information and communications technology (ICT) aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  6. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models, using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable, XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
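
    To make the annotation idea concrete, the sketch below checks whether two XML data sources bind the same element paths to the same ontology concepts; a mismatch breaks semantic interoperability even when both sources validate against the same schema. All element paths and concept URIs are hypothetical illustrations, not the paper's actual data model.

    ```python
    # Minimal sketch: two XML data sources each publish a mapping from XML
    # element paths to ontology concept URIs (all names here are hypothetical,
    # e.g. drawn from semantic annotations attached to a shared XML Schema).

    source_a = {
        "Annotation/shape": "http://example.org/onto#ImageMarkupGeometry",
        "Annotation/observer": "http://example.org/onto#AnnotatorRole",
    }
    source_b = {
        "Annotation/shape": "http://example.org/onto#ImageMarkupGeometry",
        "Annotation/observer": "http://example.org/onto#ReferringPhysician",
    }

    def semantic_conflicts(a, b):
        """Return element paths shared by both sources but bound to different
        ontology concepts -- these are the semantic interoperability breaks."""
        return {path: (a[path], b[path])
                for path in a.keys() & b.keys() if a[path] != b[path]}

    for path, (ca, cb) in semantic_conflicts(source_a, source_b).items():
        print(f"conflict at {path}: {ca} vs {cb}")
    ```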

  7. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: (1) cyber security is more important than ever before, and (2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely used Internet Protocol suite for defining Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise from the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock...
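
    As a toy illustration of the parameter-agreement problem described above, the sketch below enumerates the proposals two endpoints could both accept. The vendor capability sets are hypothetical, and real IPsec endpoints negotiate this via IKE; the point is that an operator configuring two vendors by hand must find the overlap manually.

    ```python
    # Illustrative sketch with hypothetical parameter sets: finding an IPsec
    # proposal (encryption, integrity, DH group) that both endpoints support.
    from itertools import product

    vendor_a = {
        "encryption": {"aes128-cbc", "aes256-cbc", "3des-cbc"},
        "integrity": {"hmac-sha1", "hmac-sha256"},
        "dh_group": {2, 14},
    }
    vendor_b = {
        "encryption": {"aes256-cbc", "aes256-gcm"},
        "integrity": {"hmac-sha256"},
        "dh_group": {14, 19},
    }

    def common_proposals(a, b):
        """Enumerate parameter triples that are valid on both ends."""
        enc = a["encryption"] & b["encryption"]
        integ = a["integrity"] & b["integrity"]
        dh = a["dh_group"] & b["dh_group"]
        return list(product(sorted(enc), sorted(integ), sorted(dh)))

    print(common_proposals(vendor_a, vendor_b))
    # -> [('aes256-cbc', 'hmac-sha256', 14)]
    ```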

  8. Standard CGIF interoperability in Amine

    OpenAIRE

    Kabbaj, A.; Launders, I.; Polovina, S.

    2009-01-01

    The adoption of standard CGIF by CG tools will enable interoperability between them to be achieved, and in turn lead to interoperability between CG tools and other tools. The integration of ISO Common Logic’s standard CGIF notation in the Amine platform is presented. The paper also describes the first steps towards full interoperability between the Amine CG tool (through its Synergy component) and CharGer, a representative CG tool that supports similar interoperability and for process (or ‘activ...

  9. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    In companies, the historically developed IT systems are mostly application islands. They produce good results as long as the system's requirements and surroundings do not change and no system interface is needed. With the ever-increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the order of the day, encompassing the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that SOA in particular produces a surge of interoperability that could rightly be referred to as an IT evolution.

  10. Maturity model for enterprise interoperability

    Science.gov (United States)

    Guédria, Wided; Naudet, Yannick; Chen, David

    2015-01-01

    Historically, progress occurs when entities communicate, share information and together create something that no one individually could do alone. Moving beyond people to machines and systems, interoperability is becoming a key factor of success in all domains. In particular, interoperability has become a challenge for enterprises, to exploit market opportunities, to meet their own objectives of cooperation or simply to survive in a growing competitive world where the networked enterprise is becoming a standard. Within this context, many research works have been conducted over the past few years and enterprise interoperability has become an important area of research, ensuring the competitiveness and growth of European enterprises. Among others, enterprises have to control their interoperability strategy and enhance their ability to interoperate. This is the purpose of the interoperability assessment. Assessing interoperability maturity allows a company to know its strengths and weaknesses in terms of interoperability with its current and potential partners, and to prioritise actions for improvement. The objective of this paper is to define a maturity model for enterprise interoperability that takes into account existing maturity models while extending the coverage of the interoperability domain. The assessment methodology is also presented. Both are demonstrated with a real case study.
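
    As a schematic illustration of what an interoperability maturity assessment produces, the sketch below scores a handful of concern areas on a 1-5 scale and derives an overall rating plus improvement priorities. The concern areas, the scale, and the aggregation rule are illustrative assumptions, not the specific model defined in the paper.

    ```python
    # Hypothetical assessment: concern area -> assessed level (1 = ad hoc, 5 = adaptive).
    ASSESSMENT = {
        "business": 3,
        "process":  2,
        "service":  4,
        "data":     3,
    }

    def overall_maturity(assessment):
        """A conservative aggregate: overall maturity is capped by the weakest
        area, since one poor interface can block an entire partnership."""
        return min(assessment.values())

    def improvement_priorities(assessment):
        """Rank areas weakest-first to prioritise actions for improvement."""
        return sorted(assessment, key=assessment.get)

    print(overall_maturity(ASSESSMENT))        # 2
    print(improvement_priorities(ASSESSMENT))  # ['process', 'business', 'data', 'service']
    ```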

  11. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross-border trading between operators. To that effect, the European association EASEE-gas is developing standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations about this topic given at the gas conference.

  12. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  13. Flexible Language Interoperability

    DEFF Research Database (Denmark)

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features...... of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view...... of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems based...

  14. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  15. IPSec VPN Capabilities and Interoperability

    Science.gov (United States)

    2006-07-01

    IPSec VPN services include Juniper (formerly Netscreen) and Cisco. Of interest is the interoperability of setting up an IPSec VPN tunnel with a Juniper...vendor implementations of IPSec VPN tunneling in an environment where both vendors play a role. The second objective was to determine some...

  16. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, will have the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, to the policy initiatives, including organizational protocols, financing procedures, and legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002 to build a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 interviews of experts, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act with the aim of achieving an interoperable and open framework. The Health area should benefit from the initiatives started at the GGF in terms of global architecture and services definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that important existing results of the standardization efforts in this area are not taken up simply because they are not always known.

  17. Turning Interoperability Operational with GST

    Science.gov (United States)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially and perhaps also temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels that capture the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (MLs) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially...

  18. Information modeling for interoperable dimensional metrology

    CERN Document Server

    Zhao, Y; Brown, Robert; Xu, Xun

    2014-01-01

    This book analyzes interoperability issues in dimensional metrology systems and describes information modeling techniques. Coverage includes theory, techniques and key technologies, and explores new approaches for solving real-world interoperability problems.

  19. Assessing Schizophrenia with an Interoperable Architecture

    NARCIS (Netherlands)

    Emerencia, Ando; van der Krieke, Lian; Petkov, Nicolai; Aiello, Marco; Bouamrane, Matt-Mouley; Tao, Cui

    2011-01-01

    With the introduction of electronic personal health records and e-health applications spreading, interoperability concerns are of increasing importance to hospitals and care facilities. Interoperability between distributed and complex systems requires, among other things, compatible data formats.

  20. Intercloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Ngo, C.

    2011-01-01

    This paper presents on-going research to develop the Intercloud Architecture (ICA) Framework that should address problems in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications integration and interoperability, including integration and interoperability

  1. Linked Data for Transaction Based Enterprise Interoperability

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; ir. Krukkert, D.; Sinderen, Marten; Chapurlat, Vincent

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the ‘80s, currently interoperability relies heavily on XML-based standards. Although having great impact, still issues remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  2. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To go about this, ERP systems have been first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  3. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall landscape of actual challenges. We conduct...... a Delphi study to survey and reveal the insights of experts in the field. Results of our study are presented in this paper to enhance further research and development efforts for interoperability....

  4. Standards to open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

    Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability as the way to accomplish data exchange and service collaboration requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.

  5. Interoperability of ESA Science Archives

    Science.gov (United States)

    Arviset, C.; Dowson, J.; Hernández, J.; Osuna, P.; Venet, A.

    The ISO Data Archive (IDA) and the XMM-Newton Science Archive (XSA) have been developed by the Science Operations and Data Systems Division of ESA in Villafranca, Spain. They are both built using the same flexible and modular 3-tier architecture: Data Products and Database, Business Logic, User Interface. This open architecture, together with Java and XML technology, has helped make the IDA and XSA inter-operable with other archives and applications. The various accesses from the IDA and the XSA to remote archives are described, as well as the mechanism to directly access these ESA archives from remote archives.

  6. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

    Framework defines interoperability more holistically as ‘the ability of information and communication technology (ICT) systems and of the business processes they support to exchange data and to enable the sharing of information and knowledge’ [14: 5]... the services so exchanged to enable them to operate effectively together’ [18]. Interoperability is thus the ability of two or more different entities (be they pieces of software, processes, systems, business units, etc.) to ‘inter-operate’ [29]. 2.3 What...

  7. Interoperation Modeling for Intelligent Domotic Environments

    Science.gov (United States)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that explicitly represents device capabilities, states and commands, and supports abstract modeling of device inter-operation.

  8. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  9. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  10. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the challenge to interoperate, there is a lack of empirical work about the overall landscape of actual challenges. We conduct...

  11. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  12. Interoperability in the networked design infrastructure

    NARCIS (Netherlands)

    Coenders, J.L.

    2012-01-01

    Interoperability, the ability of different software applications to communicate with each other, is one of the biggest challenges for efficient and effective use of advanced software technology in structural design and engineering. In practice, the problem of interoperability exists very much for

  13. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    would considerably alter the current privacy setting. First, the current Directive would be replaced with a Regulation, achieving EU-wide harmonization. Second, the scope of the instrument would be widened and the provisions made more precise. Third, the use of consent for data processing would...... be limited. Fourth, data protection “by design” would be distinguished from data protection “by default”. Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers’ and processors’ duties, on supervisory authorities and on sanctions would be introduced...... of direct relevance for the project and Work Package 5 will be analysed here....

  14. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identifying interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two-thirds of the certified compliant products.

  15. MIDST: Interoperability for Semantic Annotations

    Science.gov (United States)

    Atzeni, Paolo; Del Nostro, Pierluigi; Paolozzi, Stefano

    In recent years, interoperability of ontologies and databases has received a lot of attention. However, most of the work has concentrated on specific problems (such as storing an ontology in a database or making database data available to ontologies) and referred to specific models for each of the two. Here, we propose an approach that aims at being more general and model independent. In fact, it works for different dialects for ontologies and for various data models for databases. Also, it supports translations in both directions (ontologies to databases and vice versa) and it allows for flexibility in the translations, so that customization is possible. The proposal extends recent work for schema and data translation (the MIDST project, which implements the ModelGen operator proposed in model management), which relies on a metamodel approach, where data models and variations thereof are described in a common framework and translations are built as compositions of elementary ones.

  16. Towards semantic interoperability for electronic health records.

    Science.gov (United States)

    Garde, Sebastian; Knaup, Petra; Hovenga, Evelyn; Heard, Sam

    2007-01-01

    In the field of open electronic health records (EHRs), openEHR as an archetype-based approach is being increasingly recognised. The objective of this paper is to briefly describe this approach, and to analyse how openEHR archetypes impact on health professionals and semantic interoperability. Methods: analysis of current approaches to EHR systems, terminology and standards developments. In addition to literature reviews, we organised face-to-face and additional telephone interviews and tele-conferences with members of relevant organisations and committees. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability -- both important prerequisites for semantic interoperability. Archetypes enable the formal definition of clinical content by clinicians. To enable comprehensive semantic interoperability, the development and maintenance of archetypes needs to be coordinated internationally and across health professions. Domain knowledge governance comprises a set of processes that enable the creation, development, organisation, sharing, dissemination, use and continuous maintenance of archetypes. It needs to be supported by information technology. To enable EHRs, semantic interoperability is essential. The openEHR archetypes approach enables syntactic interoperability and semantic interpretability. However, without coordinated archetype development and maintenance, 'rank growth' of archetypes would jeopardize semantic interoperability. We therefore believe that openEHR archetypes and domain knowledge governance together create the knowledge environment required to adopt EHRs.
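
    The following sketch is a loose, schematic illustration of the archetype idea: clinical data is validated against a formally defined, clinician-authored constraint model. Real archetypes are expressed in ADL against the openEHR reference model; the field names, ranges and units below are invented for illustration.

    ```python
    # Schematic illustration only -- archetypes are authored in ADL, not Python.
    # Hypothetical, simplified constraint model for a blood-pressure observation.
    blood_pressure_archetype = {
        "systolic":  {"type": float, "range": (0.0, 1000.0), "units": "mm[Hg]"},
        "diastolic": {"type": float, "range": (0.0, 1000.0), "units": "mm[Hg]"},
    }

    def validate(record, archetype):
        """Check a data record against the constraint model; return error list."""
        errors = []
        for field, rule in archetype.items():
            value = record.get(field)
            lo, hi = rule["range"]
            if not isinstance(value, rule["type"]):
                errors.append(f"{field}: expected {rule['type'].__name__}")
            elif not lo <= value <= hi:
                errors.append(f"{field}: {value} outside {rule['range']} {rule['units']}")
        return errors

    print(validate({"systolic": 120.0, "diastolic": 80.0}, blood_pressure_archetype))  # []
    ```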

  17. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  18. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Interoperability is a requirement for the successful deployment of Electronic Health Records (EHRs). EHRs improve the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. An EHR is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by a discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  19. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  20. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  1. Measuring Systems Interoperability: Challenges and Opportunities

    National Research Council Canada - National Science Library

    Kasunic, Mark; Anderson, William

    2004-01-01

    Interoperability is the ability of systems, units, or forces to provide services to and accept services from other systems, units, or forces and to use the services exchanged to enable them to operate...

  2. RFID in Libraries: Standards and Interoperability

    OpenAIRE

    Hopkinson, Alan

    2007-01-01

    RFID needs standards to ensure interoperability, so that installations can survive a change of library management system and RFID can be used in inter-library lending between libraries with different systems. Efforts are under way to develop ISO standards to achieve this.

  3. River Basin Standards Interoperability Pilot

    Science.gov (United States)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    Europe has a wealth of water information and tools applicable to river basin management, but fragmentation and a lack of coordination between countries persist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant to the OGC SOS 2.0 Hydrology Profile Best Practice). - Modelling of floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS), with visualization utilities (WMS). The architecture...
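
    A minimal sketch of the first step, assuming a hypothetical SOS 2.0 endpoint: a KVP GetObservation request asking for river-gauge observations encoded as WaterML 2.0. The endpoint URL, offering and observed property are placeholders; real values would come from the service's GetCapabilities response.

    ```python
    # Sketch: fetch river-gauge observations as WaterML 2.0 from an SOS 2.0 service.
    import requests

    SOS_ENDPOINT = "https://example.org/sos"  # placeholder service URL

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "river_gauge_offering",    # placeholder offering id
        "observedProperty": "water_level",     # placeholder property id
        "responseFormat": "http://www.opengis.net/waterml/2.0",
    }

    response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:500])  # start of the WaterML 2.0 observation document
    ```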

  4. Interoperability and Standardization of Intercloud Cloud Computing

    OpenAIRE

    Wang, Jingxin K.; Ding, Jianrui; Niu, Tian

    2012-01-01

    Cloud computing is maturing, but the interoperability and standardization of clouds remain unsolved. This paper discusses interoperability among clouds with respect to message transmission, data transmission and virtual machine transfer. Starting from the IEEE Pioneering Cloud Computing Initiative, the paper discusses standardization of cloud computing, especially intercloud cloud computing, and also considers standardization from a market-oriented view.

  5. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.
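
    To illustrate what joining two information systems involves at the lowest level, the sketch below maps service records from two hypothetical grid information schemas into one common form. The attribute names are invented (loosely GLUE-like); real interoperation also has to reconcile transport protocols and semantics, not just field names.

    ```python
    # Hypothetical records as published by two different grid information systems.
    record_a = {"SiteName": "CERN-PROD", "ServiceType": "SRM",
                "Endpoint": "httpg://a.example:8443/srm"}
    record_b = {"site": "NDGF-T1", "svc_kind": "storage.srm",
                "url": "httpg://b.example:8443/srm"}

    # Per-system field mappings into a common schema -- the core of a bridge.
    MAPPINGS = {
        "system_a": {"SiteName": "site", "ServiceType": "type", "Endpoint": "endpoint"},
        "system_b": {"site": "site", "svc_kind": "type", "url": "endpoint"},
    }
    # Value-level translation where the controlled vocabularies differ.
    TYPE_ALIASES = {"storage.srm": "SRM"}

    def to_common(record, system):
        """Rename fields and normalise service-type values into the common schema."""
        common = {MAPPINGS[system][k]: v for k, v in record.items()}
        common["type"] = TYPE_ALIASES.get(common["type"], common["type"])
        return common

    print(to_common(record_a, "system_a"))
    print(to_common(record_b, "system_b"))  # both now share one schema
    ```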

  6. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. The clouds operated by different service providers working together in collaboration can open up many more spaces for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud, and the concepts and benefits of MDA and SOA, are discussed in the paper. At the same time, this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  7. Towards Interoperable Preservation Repositories: TIPR

    Directory of Open Access Journals (Sweden)

    Priscilla Caplan

    2010-07-01

    Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  8. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ...-1 specifies the spectrum emission limits for available channel bandwidths. Receiver blocking... Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User Equipment Across Paired Commercial Spectrum Blocks in the 700 MHz Band. AGENCY: Federal Communications Commission. ACTION: Notice of...

  9. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  10. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, like business, process, human resources, technology, knowledge and semantics.

  11. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor, yet a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by an analysis of existing literature and personal experience, as well as by discussions with several domain experts. The criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The profiles of the Integrating the Healthcare Enterprise (IHE) initiative make a major contribution to these aspects, but they also raise new problems: flexibility for adaptation to different organizational, regional or other specific conditions is missing, and regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  12. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software...
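
    To make the shape of such an evaluation concrete, the sketch below tallies the yearly cost that interoperable inspection software would avoid. The cost components mirror the inputs named in the abstract (translation effort, training, licensing), but all figures and the simple additive model are hypothetical, not the paper's methodology.

    ```python
    # Hypothetical numbers for illustration; the paper derives its inputs from
    # actual testing/inspection activities, personnel training and licensing.

    def annual_interoperability_benefit(programmes_translated_per_year,
                                        hours_per_translation,
                                        hourly_rate,
                                        extra_training_cost,
                                        extra_licence_cost):
        """Cost avoided if inspection software were interoperable: translation
        labour plus the training and licences needed only because it is not."""
        translation_cost = (programmes_translated_per_year
                            * hours_per_translation * hourly_rate)
        return translation_cost + extra_training_cost + extra_licence_cost

    print(annual_interoperability_benefit(40, 6.0, 75.0, 5000.0, 8000.0))  # 31000.0
    ```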

  13. Meeting the New Challenges of International Interoperability

    Science.gov (United States)

    2009-06-01

    achieved by issuing common equipment to partners • Interoperability is gained by continuously working to tie cultural, procedural, technical and policy... that attribute to improve information and knowledge by collective processes and cross-fertilization • Interoperability... it's Not Just for Geeks

  14. K-Means Based Fingerprint Segmentation with Sensor Interoperability

    Directory of Open Access Journals (Sweden)

    Yang Xiukun

    2010-01-01

    A critical step in an automatic fingerprint recognition system is the segmentation of fingerprint images. Existing methods are usually designed to segment fingerprint images originating from a certain sensor, so their performance is significantly affected when dealing with fingerprints collected by different sensors. This work studies the sensor interoperability of fingerprint segmentation algorithms, which refers to an algorithm's ability to adapt to raw fingerprints obtained from different sensors. We empirically analyze the sensor interoperability problem, and effectively address the issue by proposing a k-means based segmentation method called SKI. SKI clusters foreground and background blocks of a fingerprint image based on the k-means algorithm, where a fingerprint block is represented by a 3-dimensional feature vector consisting of block-wise coherence, mean, and variance (abbreviated as CMV). SKI also employs morphological postprocessing to achieve favorable segmentation results. We perform SKI on each fingerprint to ensure sensor interoperability. The interoperability and robustness of our method are validated by experiments performed on a number of fingerprint databases obtained from various sensors.
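
    A minimal sketch of the CMV-plus-k-means idea described above, with k = 2 (foreground vs. background blocks). The block size, the use of scikit-learn's KMeans, and the coherence-based rule for picking the foreground cluster are illustrative choices; the paper's morphological postprocessing is omitted.

    ```python
    # Block-wise CMV features (coherence, mean, variance) clustered with k-means.
    import numpy as np
    from sklearn.cluster import KMeans

    def cmv_features(image, block=16):
        """Per-block [coherence, mean, variance] of a grayscale fingerprint image."""
        gy, gx = np.gradient(image.astype(float))
        feats = []
        h, w = image.shape
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                bx = gx[i:i + block, j:j + block]
                by = gy[i:i + block, j:j + block]
                gxx, gyy, gxy = (bx**2).sum(), (by**2).sum(), (bx * by).sum()
                coherence = np.sqrt((gxx - gyy)**2 + 4 * gxy**2) / (gxx + gyy + 1e-9)
                blk = image[i:i + block, j:j + block]
                feats.append([coherence, blk.mean(), blk.var()])
        return np.array(feats)

    def segment(image, block=16):
        feats = cmv_features(image, block)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
        # Heuristic: the cluster with higher mean coherence is the ridge foreground.
        fg = int(feats[labels == 1][:, 0].mean() > feats[labels == 0][:, 0].mean())
        return labels == fg  # True for foreground blocks, in row-major block order

    rng = np.random.default_rng(0)
    print(segment(rng.random((128, 128))).sum(), "blocks labelled foreground")
    ```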

  15. Parallel mesh management using interoperable tools.

    Energy Technology Data Exchange (ETDEWEB)

    Tautges, Timothy James (Argonne National Laboratory); Devine, Karen Dragon

    2010-10-01

    This presentation included a discussion of challenges arising in parallel mesh management, as well as demonstrated solutions. It also described the broad range of software for mesh management and modification developed by the Interoperable Technologies for Advanced Petascale Simulations (ITAPS) team, and highlighted applications successfully using the ITAPS tool suite.

  16. Equipping the enterprise interoperability problem solver

    NARCIS (Netherlands)

    Oude Luttighuis, P.; Folmer, E.J.A.

    2011-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  17. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  18. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  19. Intercloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  20. Semantic Service Modeling: Enabling System Interoperability.

    NARCIS (Netherlands)

    Pokraev, S.; Quartel, Dick; Steen, Maarten W.A.; Reichert, M.U.

    2006-01-01

    Interoperability is the capability of different systems to use each other’s services effectively. It is about sharing functionality and information between systems at different levels, e.g., between physical devices, software applications, business units within one organization, or between different

  1. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. At the same time, however, it introduces new risks to the security and privacy of personal health...

  2. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  3. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges, including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data, and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. As part of this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author REST-based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types, including image, map, feature, and geoprocessing services, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
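
    The platform described above hinges on REST-style service endpoints for subsetting and describing data. As a purely illustrative sketch of that pattern (not NOAA's actual deployment), the following Python queries the standard exportImage operation of a hypothetical ArcGIS ImageServer endpoint; the service URL is a placeholder.

        # Hypothetical sketch: requesting a rendered subset of a climate
        # image service through its ArcGIS REST interface. The endpoint
        # below is invented, not a real Climate.gov address.
        import requests

        SERVICE_URL = ("https://example.noaa.gov/arcgis/rest/services"
                       "/climate/sst/ImageServer")

        def export_subset(bbox, size=(512, 512), fmt="tiff"):
            """Ask the service to render a bounding-box subset."""
            params = {
                "bbox": ",".join(str(c) for c in bbox),  # xmin,ymin,xmax,ymax
                "bboxSR": 4326,                          # WGS84 lon/lat
                "size": f"{size[0]},{size[1]}",
                "format": fmt,
                "f": "json",  # JSON descriptor with a link to the image
            }
            resp = requests.get(f"{SERVICE_URL}/exportImage",
                                params=params, timeout=30)
            resp.raise_for_status()
            return resp.json()

        print(export_subset((-100.0, 20.0, -80.0, 40.0)))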

  4. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has set the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which so far has not been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology over the content materials. The ontology is further utilized as a search tool to match user queries against available courses. It is concluded that LMS interoperability in the semantic web is feasible.
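
    The ontology-as-search-tool step above can be made concrete with a small sketch using rdflib. The vocabulary (ex:Course, ex:topic) and the triples are invented for illustration; Moodle's actual schema is not reproduced here.

        # Illustrative sketch: matching a user's query against course
        # topics held in a toy ontology, using rdflib and SPARQL.
        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/lms#")  # hypothetical vocabulary

        g = Graph()
        g.add((EX.prog101, RDF.type, EX.Course))
        g.add((EX.prog101, EX.topic, Literal("python programming")))
        g.add((EX.web201, RDF.type, EX.Course))
        g.add((EX.web201, EX.topic, Literal("semantic web")))

        def match_courses(term):
            """Return (course, topic) rows whose topic contains the term."""
            q = """
                SELECT ?course ?topic WHERE {
                    ?course a ex:Course ; ex:topic ?topic .
                    FILTER CONTAINS(LCASE(STR(?topic)), LCASE(?term))
                }
            """
            return list(g.query(q, initNs={"ex": EX},
                                initBindings={"term": Literal(term)}))

        for course, topic in match_courses("semantic"):
            print(course, topic)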

  5. Tool interoperability in SSE OI 2.0

    Science.gov (United States)

    Carmody, C. L.; Shotton, C. T.

    1988-01-01

    This paper presents a review of the concept and implementation of tool interoperability in the Space Station Software Support Environment (SSE) OI 2.0. After first providing a description of the SSE, the paper describes the problem at hand: the nature of the SSE gives rise to a requirement for interoperability between SSE workstations and, hence, between the tools which reside on those workstations. Specifically, word processor and graphics tool interoperability are discussed. The concept for interoperability that is implemented in OI 2.0 is described, as is an overview of the implementation strategy. Some of the significant challenges that the development team had to overcome to bring about interoperability are described, perhaps as a checklist, or warning, to others who would bring about tool interoperability. Lastly, plans to extend tool interoperability to a third class of tools in OI 3.0 are described.

  6. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.
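
    The abstract does not spell out the "rather basic interoperability requirements"; one common baseline in repository federations of this kind is OAI-PMH metadata harvesting, sketched below in Python against a placeholder repository URL (aDORe's actual interfaces are not reproduced here).

        # Hedged sketch: pulling Dublin Core titles from one page of an
        # OAI-PMH ListRecords response. The base URL is a placeholder.
        import xml.etree.ElementTree as ET
        import requests

        BASE_URL = "https://repository.example.org/oai"
        OAI = "{http://www.openarchives.org/OAI/2.0/}"
        DC = "{http://purl.org/dc/elements/1.1/}"

        def list_titles():
            """Yield record titles from a single harvesting request."""
            resp = requests.get(BASE_URL,
                                params={"verb": "ListRecords",
                                        "metadataPrefix": "oai_dc"},
                                timeout=30)
            resp.raise_for_status()
            root = ET.fromstring(resp.content)
            for record in root.iter(f"{OAI}record"):
                title = record.find(f".//{DC}title")
                if title is not None:
                    yield title.text

        for t in list_titles():
            print(t)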

  7. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project was established as a project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the Camp Protection Systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the HMI, the Human Machine Interface) without costly and time-consuming validation and test on the system level (validation and test can be done on the equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  8. The Joint Lessons Learned System and Interoperability

    Science.gov (United States)

    1989-06-02

    should not be artificially separated. That lesson would not be learned here. Another lesson which was learned, however, was that interservice... artificially high level of support masked the continuing rivalry between the Army and Air Force over mission priorities. In spite of Air Force... knowledge concerning joint interoperability issues and lessons learned activities.

  9. The Challenges of Interoperable Data Discovery

    Science.gov (United States)

    Meaux, Melanie F.

    2005-01-01

    The Global Change Master Directory (GCMD) assists the oceanographic community in data discovery and access through its online metadata directory. The directory also offers data holders a means to post and search their oceanographic data through the GCMD portals, i.e. online customized subset metadata directories. The Gulf of Maine Ocean Data Partnership (GoMODP) has expressed interest in using the GCMD portals to increase the visibility of their data holdings throughout the Gulf of Maine region and beyond. The purpose of GoMODP is to "promote and coordinate the sharing, linking, electronic dissemination, and use of data on the Gulf of Maine region". The participants have decided that a "coordinated effort is needed to enable users throughout the Gulf of Maine region and beyond to discover and put to use the vast and growing quantities of data in their respective databases". GoMODP members have invited the GCMD to discuss further collaborations in view of this effort. This presentation will focus on the GCMD GoMODP Portal, demonstrating its content and use for data discovery, and will discuss the challenges of interoperable data discovery. Interoperability among metadata standards and vocabularies will be discussed. A short overview of the lessons learned at the Marine Metadata Interoperability (MMI) metadata workshop held in Boulder, Colorado on August 9-11, 2005 will be given.

  10. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  11. Interoperability technology assessment for joint C4ISR systems

    OpenAIRE

    Berzins, Valdis Andris; Luqi; Shultes, Bruce C.; Guo, Jiang; Allen, Jim; Cheng, Ngom; Gee, Karen; Nyugen, Tom; Stierna, Eria

    1999-01-01

    This study characterizes and assesses alternative approaches to software component interoperability in distributed environments typical of C4ISR systems. Interoperability is the ability of systems to provide services to and accept services from other systems, and to use the services so exchanged to enable them to operate effectively together. Candidate approaches in...

  12. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Hereby, the interoperability chain also includes individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach mastering ontology coordination challenges.

  13. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  14. Inter-Operability of ESA Science Archives

    Science.gov (United States)

    Arviset, Christophe; Guainazzi, Matteo; Salama, Alberto; Dowson, John; Hernández, José; Osuna, Pedro; Venet, Aurèle

    ESA Science Archives for ISO and XMM-Newton have been developed by the Science Operations and Data Systems Division in Villafranca, Spain. By using an open 3-tier architecture (Data Products and Database, Business Logic, User Interface) together with Java and XML technology, inter-operability has been achieved from these archives to external archives (NED/SIMBAD, ADS, IRAS). Furthermore, this has allowed external archives (CDS, ADS, IRSA, HEASARC) to directly access ISO and XMM-Newton data without going through their standard user interfaces.

  15. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    Full Text Available The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  16. ARGOS policy brief on semantic interoperability.

    Science.gov (United States)

    Kalra, Dipak; Musen, Mark; Smith, Barry; Ceusters, Werner; De Moor, Georges

    2011-01-01

    Semantic interoperability is one of the priority themes of the ARGOS Trans-Atlantic Observatory. This topic represents a globally recognised challenge that must be addressed if electronic health records are to be shared among heterogeneous systems, and the information in them exploited to the maximum benefit of patients, professionals, health services, research, and industry. Progress in this multi-faceted challenge has been piecemeal, and valuable lessons have been learned, and approaches discovered, in Europe and in the US that can be shared and combined. Experts from both continents have met at three ARGOS workshops during 2010 and 2011 to share understanding of these issues and how they might be tackled collectively from both sides of the Atlantic. This policy brief summarises the problems and the reasons why they are important to tackle, and also why they are so difficult. It outlines the major areas of semantic innovation that exist and that are available to help address this challenge. It proposes a series of next steps that need to be championed on both sides of the Atlantic if further progress is to be made in sharing and analysing electronic health records meaningfully. Semantic interoperability requires the use of standards, not only for EHR data to be transferred and structurally mapped into a receiving repository, but also for the clinical content of the EHR to be interpreted in conformity with the original meanings intended by its authors. Wide-scale engagement with professional bodies, globally, is needed to develop these clinical information standards. Accurate and complete clinical documentation, faithful to the patient's situation, and interoperability between systems, require widespread and dependable access to published and maintained collections of coherent and quality-assured semantic resources, including models such as archetypes and templates that would (1) provide clinical context, (2) be mapped to interoperability standards for EHR data

  17. RFID in libraries a step toward interoperability

    CERN Document Server

    Ayre, Lori Bowen

    2012-01-01

    The approval by The National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard

  18. AliEn - EDG Interoperability in ALICE

    OpenAIRE

    Bagnasco, S.; Barbera, R.; Buncic, P.; Carminati, F.; Cerello, P.; Saiz, P.

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Stora...

  19. Interoperable PKI Data Distribution in Computational Grids

    Energy Technology Data Exchange (ETDEWEB)

    Pala, Massimiliano; Cholia, Shreyas; Rea, Scott A.; Smith, Sean W.

    2008-07-25

    One of the most successful working examples of virtual organizations, computational grids need authentication mechanisms that inter-operate across domain boundaries. Public Key Infrastructures (PKIs) provide sufficient flexibility to allow resource managers to securely grant access to their systems in such distributed environments. However, as PKIs grow and services are added to enhance both security and usability, users and applications must struggle to discover available resources, particularly when the Certification Authority (CA) is alien to the relying party. This article presents how to overcome these limitations of the current grid authentication model by integrating the PKI Resource Query Protocol (PRQP) into the Grid Security Infrastructure (GSI).

  20. Semantic and Syntactic Object Correlation in the Object-Oriented Method for Interoperability

    National Research Council Canada - National Science Library

    Shedd, Stephen

    2002-01-01

    In today's military, interoperability is not a luxury; it is a necessity. Unfortunately, differences in data representation between various systems greatly complicate the task of achieving interoperability...

  1. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open data....

  2. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  3. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents an on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure

  4. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  5. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  6. A maturity model for interoperability in eHealth

    NARCIS (Netherlands)

    van Velsen, Lex Stefan; Oude Nijeweme-d'Hollosy, Wendeline; Hermens, Hermanus J.

    2016-01-01

    Interoperability, the ability of different technological applications to exchange data, is viewed by many as an important goal for eHealth, as it can save money and improve the quality of care and patient safety. However, creating an interoperable infrastructure for eHealth is a difficult task. In

  7. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    First in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  8. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  9. Evolving Interoperable Data Systems Through Regional Collaborations

    Science.gov (United States)

    Howard, M. K.

    2008-12-01

    The Gulf of Mexico Coastal Ocean Observing System (GCOOS) is a federation of independent sub-regional observing systems. Most of these systems were in operation long before the Integrated Ocean Observing System (IOOS) Data Management and Communications (DMAC) guidelines were established. Hence, each local data management system evolved independently and interoperability was never a consideration. Achieving the goal of building an automated and largely unattended machine-to-machine interoperable data system for the region has proven to be more than a resource and technological challenge. Challenges also fall within the organizational and cultural realms. In 2008 NOAA funds were used to build the first instance of a GCOOS regional data portal and to harmonize the local data management systems of ten principal sub-regional data providers. Early efforts were focused on regional data catalogs, adoption of a common vocabulary for parameters, and deploying data service access points using common interfaces. This was done in full partnership between the data providers and the portal builders with the intent that local data providers remain independent nodes capable of participating in the vision of IOOS on their own. The data portal serves the region primarily as a central point for fusion of data and products.
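
    One concrete piece of the harmonization work described above is mapping each provider's local parameter names onto the adopted common vocabulary. The sketch below is purely illustrative; the alias table and provider record are invented, with CF standard names used as the target vocabulary.

        # Illustrative sketch: rewriting provider-local parameter names
        # to a shared vocabulary so records from different sub-regional
        # systems can be fused. All mappings here are hypothetical.
        CF_ALIASES = {
            "water_temp": "sea_water_temperature",
            "sst": "sea_water_temperature",
            "sal": "sea_water_salinity",
            "wspd": "wind_speed",
        }

        def harmonize(record):
            """Replace a record's local parameter key with the common name."""
            local = record["parameter"]
            if local not in CF_ALIASES:
                raise ValueError(f"no common-vocabulary mapping for {local!r}")
            record["parameter"] = CF_ALIASES[local]
            return record

        print(harmonize({"provider": "buoy-net", "parameter": "water_temp",
                         "value": 28.1}))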

  10. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  11. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise
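
    The core tracing operation the study exercises, following a suspect lot forward to recipients and backward to sources, can be illustrated with a toy supply-chain graph. This sketch is not IFT's methodology or any vendor's system; all parties and shipments are invented.

        # Illustrative sketch: breadth-first trace-forward and trace-back
        # over shipment records modeled as directed links between parties.
        from collections import defaultdict, deque

        SHIPMENTS = [  # (from_party, to_party, ingredient) - all invented
            ("FarmA", "ProcessorX", "tomato"),
            ("ProcessorX", "DistributorY", "salsa"),
            ("DistributorY", "Restaurant1", "salsa"),
            ("DistributorY", "Restaurant2", "salsa"),
        ]

        def trace(start, direction="forward"):
            """Return every party reachable from start along shipments."""
            links = defaultdict(set)
            for src, dst, _ in SHIPMENTS:
                if direction == "forward":
                    links[src].add(dst)
                else:
                    links[dst].add(src)
            seen, queue = set(), deque([start])
            while queue:
                for nxt in links[queue.popleft()] - seen:
                    seen.add(nxt)
                    queue.append(nxt)
            return seen

        print(trace("ProcessorX"))               # downstream recipients
        print(trace("Restaurant1", "backward"))  # candidate upstream sources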

  12. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at the NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance's approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, along with the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance's full spectrum of missions.

  13. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  14. Interoperability science cases with the CDPP tools

    Science.gov (United States)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, to cross-compare observations and modeled data, and finally to perform in-depth analysis. Over the years these protocols, including SAMP from IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, and others. This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
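
    Of the protocols named above, SAMP is the one with a widely used Python binding. The minimal sketch below broadcasts a VOTable to whatever SAMP-aware tools are listening; it assumes astropy and a SAMP hub already running, and the table URL is a placeholder. The CDPP tools' own message flows are richer than this.

        # Hedged sketch: one SAMP broadcast using astropy's client.
        from astropy.samp import SAMPIntegratedClient

        client = SAMPIntegratedClient(name="demo-broadcaster")
        client.connect()  # fails unless a SAMP hub is running
        try:
            client.notify_all({
                "samp.mtype": "table.load.votable",
                "samp.params": {
                    "url": "http://example.org/data/events.vot",  # placeholder
                    "name": "demo table",
                },
            })
        finally:
            client.disconnect()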

  15. SHARP/PRONGHORN Interoperability: Mesh Generation

    Energy Technology Data Exchange (ETDEWEB)

    Avery Bingham; Javier Ortensi

    2012-09-01

    Progress toward collaboration between the SHARP and MOOSE computational frameworks has been demonstrated through sharing of mesh generation and ensuring mesh compatibility of both tools with MeshKit. MeshKit was used to build a three-dimensional, full-core very high temperature reactor (VHTR) geometry with 120-degree symmetry, which was used to solve a neutron diffusion critical eigenvalue problem in PRONGHORN. PRONGHORN is an application of MOOSE that is capable of solving coupled neutron diffusion, heat conduction, and homogenized flow problems. The results were compared to a solution found on a 120-degree, reflected, three-dimensional VHTR mesh geometry generated by PRONGHORN. The ability to exchange compatible mesh geometries between the two codes is instrumental for future collaboration and interoperability. The results were found to be in good agreement between the two meshes, thus demonstrating the compatibility of the SHARP and MOOSE frameworks. This outcome makes future collaboration possible.

  16. Interoperability of Standards for Robotics in CIME

    DEFF Research Database (Denmark)

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. The first main goal of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for the standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP. The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model

  17. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Full Text Available Building information modelling (BIM) is defined as a process involving the generation and management of digital representation of physical and functional characteristics of a facility. The purpose of interoperability in integrated or “open” BIM is to facilitate the information exchange between different digital systems, models and tools. There has been effort towards data interoperability with development of open source standards and object-oriented models, such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for the information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits utilisation of integrated BIM for horizontal infrastructure rail projects.
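
    For the vertical-infrastructure side, the open exchange the paper refers to runs through IFC. As a hedged sketch (assuming the IfcOpenShell library and a placeholder model file), the snippet below inspects what an exporting tool actually wrote, including an alignment class that older schemas, long the norm for horizontal work, simply lack.

        # Hedged sketch: probing an IFC model for entity classes with
        # IfcOpenShell. The file name is a placeholder.
        import ifcopenshell

        model = ifcopenshell.open("station_model.ifc")  # placeholder file

        for ifc_class in ("IfcWall", "IfcBeam", "IfcAlignment"):
            try:
                print(ifc_class, len(model.by_type(ifc_class)))
            except Exception:
                # e.g. IFC2x3 defines no IfcAlignment: one face of the
                # horizontal-infrastructure gap discussed above.
                print(ifc_class, "not defined in this model's schema")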

  18. Biodiversity information platforms: From standards to interoperability

    Directory of Open Access Journals (Sweden)

    Walter Berendsohn

    2011-11-01

    Full Text Available One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientist have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentations, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  19. Biodiversity information platforms: From standards to interoperability.

    Science.gov (United States)

    Berendsohn, W G; Güntsch, A; Hoffmann, N; Kohlbecker, A; Luther, K; Müller, A

    2011-01-01

    One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientist have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentations, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  20. IoT interoperability:a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...
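
    A flavor of the HyperCat approach can be given with a minimal catalogue document. This is a sketch from one reading of the published specification; the relation URNs follow the spec's urn:X-hypercat scheme, but key names should be checked against the current version, and the item URL is a placeholder.

        # Sketch of a minimal HyperCat-style catalogue as a Python dict.
        import json

        catalogue = {
            "catalogue-metadata": [
                {"rel": "urn:X-hypercat:rels:isContentType",
                 "val": "application/vnd.hypercat.catalogue+json"},
                {"rel": "urn:X-hypercat:rels:hasDescription:en",
                 "val": "Demo sensor catalogue"},
            ],
            "items": [
                {"href": "http://example.org/sensors/temp1",  # placeholder
                 "item-metadata": [
                     {"rel": "urn:X-hypercat:rels:hasDescription:en",
                      "val": "Office temperature sensor"},
                 ]},
            ],
        }

        print(json.dumps(catalogue, indent=2))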

  1. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  2. Model-driven approach to enterprise interoperability at the technical service level

    NARCIS (Netherlands)

    Kadka, Ravi; Sapkota, Brahmananda; Ferreira Pires, Luis; van Sinderen, Marten J.; Jansen, Slinger

    2013-01-01

    Enterprise Interoperability is the ability of enterprises to interoperate in order to achieve their business goals. Although the purpose of enterprise interoperability is determined at the business level, the use of technical (IT) services to support business services implies that interoperability

  3. A Cultural Framework for the Interoperability of C2 Systems

    National Research Council Canada - National Science Library

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  4. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among space agencies. This particular prototype works with the German Space Agency (DLR) to test the ideas for interagency coordination.

  5. Model-driven development of service compositions for enterprise interoperability

    NARCIS (Netherlands)

    Khadka, Ravi; Sapkota, Brahmananda; Ferreira Pires, Luis; Jansen, Slinger; van Sinderen, Marten J.; Johnson, Pontus

    2011-01-01

    Service-Oriented Architecture (SOA) has emerged as an architectural style to foster enterprise interoperability, as it claims to facilitate the flexible composition of loosely coupled enterprise applications and thus alleviates the heterogeneity problem among enterprises. Meanwhile, Model-Driven

  6. An interoperability architecture for the health information exchange in Rwanda

    CSIR Research Space (South Africa)

    Crichton, R

    2012-08-01

    Full Text Available of an architecture to support: interoperability between existing health information systems already in use in the country; incremental extension into a fully integrated national health information system without substantial reengineering; and scaling, from a single...

  7. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  8. Improving NATO's Interoperability Through U.S. Precision Weapons

    National Research Council Canada - National Science Library

    Westhauser, Todd

    1998-01-01

    .... This paper compares and contrasts two U.S. advanced precision weapons capabilities, the Paveway LGBs using buddy-lasing tactics and the JDAM, against the criteria of training, cost, interoperability...

  9. Positive train control interoperability and networking research : final report.

    Science.gov (United States)

    2015-12-01

    This document describes the initial development of an ITC PTC Shared Network (IPSN), a hosted environment to support the distribution, configuration management, and IT governance of Interoperable Train Control (ITC) Positive Train Control (PTC) s...

  10. Semantic Interoperability, E-Health and Australian Health Statistics.

    Science.gov (United States)

    Goodenough, Sally

    2009-06-01

    E-health implementation in Australia will depend upon interoperable computer systems to share information and data across the health sector. Semantic interoperability, which preserves the meaning of information and data when it is shared or re-purposed, is critical for safe clinical care, and also for any re-use of the information or data for other purposes. One such re-use is for national health statistics. Usable statistics rely on comparable and consistent data, and current practice is to use agreed national data standards to achieve this. The standardisation and interoperability needed to support e-health should also provide strong support for national health statistics. This report discusses some of the semantic interoperability issues involved in moving from the current data supply process for national health statistics to an e-health-enabled future.

  11. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

    Scholl , Hans ,; Kubicek , Herbert; Cimander , Ralf

    2011-01-01

    Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  12. Ensuring Sustainable Data Interoperability Across the Natural and Social Sciences

    Science.gov (United States)

    Downs, R. R.; Chen, R. S.

    2015-12-01

    Both the natural and social science data communities are attempting to address the long-term sustainability of their data infrastructures in rapidly changing research, technological, and policy environments. Many parts of these communities are also considering how to improve the interoperability and integration of their data and systems across natural, social, health, and other domains. However, these efforts have generally been undertaken in parallel, with little thought about how different sustainability approaches may impact long-term interoperability from scientific, legal, or economic perspectives, or vice versa, i.e., how improved interoperability could enhance—or threaten—infrastructure sustainability. Scientific progress depends substantially on the ability to learn from the legacy of previous work available for current and future scientists to study, often by integrating disparate data not previously assembled. Digital data are less likely than scientific publications to be usable in the future unless they are managed by science-oriented repositories that can support long-term data access with the documentation and services needed for future interoperability. We summarize recent discussions in the social and natural science communities on emerging approaches to sustainability and relevant interoperability activities, including efforts by the Belmont Forum E-Infrastructures project to address global change data infrastructure needs; the Group on Earth Observations to further implement data sharing and improve data management across diverse societal benefit areas; and the Research Data Alliance to develop legal interoperability principles and guidelines and to address challenges faced by domain repositories. We also examine emerging needs for data interoperability in the context of the post-2015 development agenda and the expected set of Sustainable Development Goals (SDGs), which set ambitious targets for sustainable development, poverty reduction, and

  13. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing funcionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting.TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2 week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  14. ISAIA: Interoperable Systems for Archival Information Access

    Science.gov (United States)

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more-focused communities within space science could use the same technologies and standards to provide more specialized services. A major challenge to searching

  15. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO-level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.
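
    One of the configuration outputs mentioned above is validation of ingested data. Since PDS4 products are labeled in XML, a representative (though simplified) step is schema validation; the sketch below assumes lxml and uses placeholder file names rather than actual PDS4 artifacts.

        # Illustrative sketch: validating an ingested XML label against
        # an XML Schema derived from the knowledge base. File names are
        # placeholders.
        from lxml import etree

        schema = etree.XMLSchema(etree.parse("discipline_model.xsd"))
        label = etree.parse("ingested_product_label.xml")

        if schema.validate(label):
            print("label conforms to the discipline model")
        else:
            for error in schema.error_log:
                print(error.line, error.message)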

  16. Scalability and interoperability within glideinWMS

    International Nuclear Information System (INIS)

    Bradley, D.; Sfiligoi, I.; Padhi, S.; Frey, J.; Tannenbaum, T.

    2010-01-01

    Physicists have access to thousands of CPUs in grid federations such as OSG and EGEE. With the start-up of the LHC, it is essential for individuals or groups of users to wrap together available resources from multiple sites across multiple grids under a higher user-controlled layer in order to provide a homogeneous pool of available resources. One such system is glideinWMS, which is based on the Condor batch system. A general discussion of glideinWMS can be found elsewhere. Here, we focus on recent advances in extending its reach: scalability and integration of heterogeneous compute elements. We demonstrate that the new developments exceed the design goal of over 10,000 simultaneous running jobs under a single Condor schedd, using strong security protocols across global networks, and sustaining a steady-state job completion rate of a few Hz. We also show interoperability across heterogeneous computing elements achieved using client-side methods. We discuss this technique and the challenges in direct access to NorduGrid and CREAM compute elements, in addition to Globus based systems.

  17. The advanced microgrid. Integration and interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United States); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  18. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community is undertaking a massive effort to implement a modular Web Service (WS) approach in the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services were extended with standard-compliant interfaces based on WS technology, following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  19. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community is undertaking a massive effort to implement a modular Web Service (WS) approach in the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services were extended with standard-compliant interfaces based on WS technology, following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  20. The MED-SUV Multidisciplinary Interoperability Infrastructure

    Science.gov (United States)

    Mazzetti, Paolo; D'Auria, Luca; Reitano, Danilo; Papeschi, Fabrizio; Roncella, Roberto; Puglisi, Giuseppe; Nativi, Stefano

    2016-04-01

    the layer above. In order to address data and service heterogeneity, the MED-SUV infrastructure is based on the brokered architecture approach, implemented using the GI-Suite Brokering Framework for discovery and access. The GI-Suite Brokering Framework has been extended and configured to broker all the identified relevant data sources. It is also able to publish data according to several de jure and de facto standards, including OGC CSW and OpenSearch, facilitating the interconnection with external systems. At the global level, MED-SUV identified the interconnection with GEOSS as the main requirement. Since the MED-SUV Supersite level is implemented based on the same technology adopted in the current GEOSS Common Infrastructure (GCI) by the GEO Discovery and Access Broker (GEO DAB), no major interoperability problem is foreseen. The MED-SUV Multidisciplinary Interoperability Infrastructure is complemented by a user portal providing human-to-machine interaction and enabling data discovery and access. The GI-Suite Brokering Framework APIs and JavaScript library support machine-to-machine interaction, enabling the creation of mobile and Web applications using information available through the MED-SUV Supersite.

  1. Vocabulary services to support scientific data interoperability

    Science.gov (United States)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content-negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their
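
    As a rough sketch of the kind of SPARQL such an API hides behind its URI templates, the Python fragment below (using rdflib; the vocabulary file name and search term are hypothetical placeholders) looks up concepts by label and follows skos:broader links one step up the hierarchy.

        # A minimal sketch, assuming a SKOS vocabulary serialized as Turtle.
        # The file name and search term are hypothetical placeholders.
        from rdflib import Graph, Namespace

        SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")

        g = Graph()
        g.parse("vocabulary.ttl", format="turtle")  # hypothetical vocabulary document

        # Find concepts whose preferred label matches a search term --
        # roughly what a SISSvoc-style URI template resolves to.
        query = """
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label WHERE {
            ?concept skos:prefLabel ?label .
            FILTER(CONTAINS(LCASE(STR(?label)), "basalt"))
        }
        """
        for concept, label in g.query(query):
            print(concept, label)
            # Walk one step up the hierarchy via skos:broader.
            for broader in g.objects(concept, SKOS.broader):
                print("  broader:", broader)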

  2. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. The TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and
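
    The flavour of such test assertions can be sketched in a few lines of Python; the endpoint URL below is a hypothetical placeholder, and a real TEAM Engine suite runs hundreds of such checks rather than the two shown.

        # A minimal sketch of the kind of assertion a compliance test makes:
        # request a WFS capabilities document and check basic conformance.
        # The endpoint URL is a hypothetical placeholder.
        import urllib.request
        import xml.etree.ElementTree as ET

        url = ("https://example.org/wfs"
               "?service=WFS&version=1.0.0&request=GetCapabilities")

        with urllib.request.urlopen(url) as resp:
            assert resp.status == 200, "HTTP response must be 200 OK"
            tree = ET.parse(resp)  # must be well-formed XML

        root = tree.getroot()
        # WFS 1.0.0 capabilities documents are rooted at WFS_Capabilities.
        assert root.tag.endswith("WFS_Capabilities"), f"unexpected root: {root.tag}"
        print("Capabilities document looks well-formed:", root.tag)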

  3. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and large-scale EU-funded interoperability projects' deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  4. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and large-scale EU-funded interoperability projects' deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  5. Personal health records: is rapid adoption hindering interoperability?

    Science.gov (United States)

    Studeny, Jana; Coustasse, Alberto

    2014-01-01

    The establishment of the Meaningful Use criteria has created a critical need for robust interoperability of health records. A universal definition of a personal health record (PHR) has not been agreed upon. Standardized code sets have been built for specific entities, but integration between them has not been supported. The purpose of this research study was to explore the hindrance and promotion of interoperability standards in relation to PHRs and to describe interoperability progress in this area. The study was conducted following the basic principles of a systematic review, with 61 articles used in the study. Lagging interoperability has stemmed from slow adoption by patients, creation of disparate systems due to rapid development to meet requirements for the Meaningful Use stages, and rapid early development of PHRs prior to the mandate for integration among multiple systems. Findings of this study suggest that deadlines for implementation to capture Meaningful Use incentive payments are supporting the creation of PHR data silos, thereby hindering the goal of high-level interoperability.

  6. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

    Full Text Available Interoperability remains a significant burden to the developers of Internet of Things systems. This is because IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. Secondly, due to the lack of worldwide accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status. Information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for semantic annotation of data using heterogeneous devices in IoT is proposed to provide annotations for data. The Resource Description Framework (RDF) is a semantic web framework that is used to relate things using triples, making them semantically meaningful. RDF-annotated patient data is thereby made semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used the Tableau, Gruff 6.2.0, and MySQL tools.
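
    A minimal sketch of the annotation idea, with hypothetical URIs and vocabulary terms (not the paper's actual IoT-SIM model), might look like this in Python with rdflib:

        # Annotate an IoT health reading as RDF triples, then query it.
        # The namespace, device and patient URIs are hypothetical placeholders.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, XSD

        EX = Namespace("http://example.org/health#")  # hypothetical vocabulary

        g = Graph()
        reading = EX["reading42"]
        g.add((reading, RDF.type, EX.HeartRateObservation))
        g.add((reading, EX.patient, EX["patient7"]))
        g.add((reading, EX.device, EX["wristSensor3"]))
        g.add((reading, EX.valueBpm, Literal(72, datatype=XSD.integer)))

        # Extract all heart-rate readings for one patient with SPARQL.
        results = g.query("""
            PREFIX ex: <http://example.org/health#>
            SELECT ?reading ?bpm WHERE {
                ?reading a ex:HeartRateObservation ;
                         ex:patient ex:patient7 ;
                         ex:valueBpm ?bpm .
            }
        """)
        for row in results:
            print(row.reading, row.bpm)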

  7. Achieving Interoperability in GEOSS - How Close Are We?

    Science.gov (United States)

    Arctur, D. K.; Khalsa, S. S.; Browdy, S. F.

    2010-12-01

    A primary goal of the Global Earth Observation System of Systems (GEOSS) is improving the interoperability between the observational, modelling, data assimilation, and prediction systems contributed by member countries. The GEOSS Common Infrastructure (GCI) comprises the elements designed to enable discovery and access to these diverse data and information sources. But to what degree can the mechanisms for accessing these data, and the data themselves, be considered interoperable? Will the separate efforts by Communities of Practice within GEO to build their own portals, such as for Energy, Biodiversity, and Air Quality, lead to fragmentation or synergy? What communication and leadership do we need with these communities to improve interoperability both within and across such communities? The Standards and Interoperability Forum (SIF) of GEO's Architecture and Data Committee has assessed progress towards achieving the goal of global interoperability and made recommendations regarding evolution of the architecture and overall data strategy to ensure fulfillment of the GEOSS vision. This presentation will highlight the results of this study, and directions for further work.

  8. Framework research of semantic sharing and interoperability of geospatial information

    Science.gov (United States)

    Zhao, Hu; Li, Lin; Shi, Yunfei

    2008-12-01

    Knowledge sharing and semantic interoperability are significant research themes in Geographical Information Science (GIScience), because semantic heterogeneity has been identified by many researchers as the main obstacle to GIScience development. Interoperability issues can exist at three levels: syntactic, structural (also called systemic) and semantic. The former two, however, can be achieved by implementing international or domain standards proposed by several organizations, for example the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the International Organization for Standardization/Technical Committee for Geographic information/Geomatics (ISO/TC 211). In this paper we concentrate on semantic interoperability, the sort of topic that halts conversations and causes people's eyes to glaze over, from two aspects: data/information/knowledge and operation/processing. We present a service-centered architecture for semantic interoperability of geospatial data and processes. OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are employed as normative interfaces for analyzing requests, dividing them into smaller requests, and delivering them. Ontology is introduced to describe distributed resources, including various data and geo-processing operations. The role of interoperability, especially from a semantic perspective, is set out in the first section of this paper. As a foundation, the following section introduces the semantic web, web services and other related work in this direction. We present our service-based architecture in detail, together with a simple application, in part three. Conclusions and further directions are given in the last section.

  9. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

    Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of a WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
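
    The rule idea can be illustrated with a toy Python sketch; the metadata fields, thresholds and context labels below are hypothetical stand-ins, not the paper's actual model.

        # A minimal sketch of "contextualising rules" that infer a WSN node's
        # context from metadata elements. Fields and thresholds are hypothetical.
        def infer_node_context(metadata: dict) -> str:
            """Map node metadata to a coarse context label."""
            if metadata.get("battery_level", 1.0) < 0.15:
                return "node:low-power"          # node context
            if metadata.get("link_quality", 1.0) < 0.5:
                return "network:degraded"        # network context
            if metadata.get("sampling_rate_hz", 0) == 0:
                return "sensing:idle"            # sensing context
            return "organisational:nominal"      # organisational context

        status = infer_node_context(
            {"battery_level": 0.1, "link_quality": 0.9, "sampling_rate_hz": 2}
        )
        print(status)  # -> node:low-power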

  10. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When interoperability is achieved seamlessly, efficiency increases across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  11. Operational Interoperability Challenges on the Example of GEOSS and WIS

    Science.gov (United States)

    Heene, M.; Buesselberg, T.; Schroeder, D.; Brotzer, A.; Nativi, S.

    2015-12-01

    The following poster highlights the operational interoperability challenges on the example of the Global Earth Observation System of Systems (GEOSS) and the World Meteorological Organization Information System (WIS). At the heart of both systems is a catalogue of earth observation data, products and services, but with different metadata management concepts. While in WIS a strong governance with its own metadata profile exists for the hundreds of thousands of metadata records, GEOSS adopted a more open approach for its ten million records. Furthermore, the development of WIS - as an operational system - follows a roadmap with committed downwards compatibility, while the GEOSS development process is more agile. The poster discusses how interoperability can be achieved for the different metadata management concepts and how a proxy concept helps to couple two systems that follow different development methodologies. Furthermore, the poster highlights the importance of monitoring and backup concepts as a verification method for operational interoperability.

  12. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems, and the provision of solutions, for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering. A central contribution is the development of a STEP-based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems, extending the inter-system communication capabilities beyond the stage achieved up to now. This interface development

  13. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes ISO/IEEE11073 standard for the interoperability of the medical devices in the patient environment and EN13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for its further transfer to the healthcare system.

  14. 75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Science.gov (United States)

    2010-10-29

    ... Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Technical Conference... regulatory authorities that also are considering the adoption of Smart Grid Interoperability Standards.../FERC Collaborative on Smart Response (Collaborative), in the International D Ballroom at the Omni Hotel...

  15. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  16. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
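
    Although the abstract is truncated, the "RESTful endpoints" idea it names can be sketched in Python with Flask. The toy model, route and parameters below are hypothetical placeholders, not the actual system described.

        # A minimal sketch of exposing an environmental model as a RESTful
        # endpoint, in the spirit of "models as a service". The model itself
        # (a toy linear relation) and the route are hypothetical placeholders.
        from flask import Flask, jsonify, request

        app = Flask(__name__)

        def runoff_model(rainfall_mm: float, imperviousness: float) -> float:
            """Toy runoff estimate: a stand-in for a real environmental model."""
            return rainfall_mm * imperviousness

        @app.route("/models/runoff", methods=["GET"])
        def runoff():
            rainfall = float(request.args.get("rainfall_mm", 0.0))
            imperviousness = float(request.args.get("imperviousness", 0.5))
            return jsonify({
                "model": "toy-runoff",
                "inputs": {"rainfall_mm": rainfall,
                           "imperviousness": imperviousness},
                "runoff_mm": runoff_model(rainfall, imperviousness),
            })

        if __name__ == "__main__":
            app.run(port=8080)  # GET /models/runoff?rainfall_mm=10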

  17. Report on the IFIP WG5.8 International Workshop on Enterprise Interoperability (IWEI 2008)

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Johnson, P.; Kutvonen, L.

    2008-01-01

    Enterprise interoperability is a growing research topic, rooted in various sub-disciplines from computer science and business management. Enterprise interoperability addresses intra- and inter-organizational collaboration and is characterized by the objective of aligning business level and

  18. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  19. Exploring NASA GES DISC Data with Interoperable Services

    Science.gov (United States)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications through standardized interfaces and protocols. OGC (Open Geospatial Consortium) data services and specifications include: Web Coverage Service (WCS) - data; Web Map Service (WMS) - pictures of data; Web Map Tile Service (WMTS) - pictures of data tiles; Styled Layer Descriptors (SLD) - rendered styles.
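
    For instance, the "pictures of data" services take simple key-value requests; the sketch below builds one in Python. The endpoint URL and layer name are hypothetical placeholders, not actual GES DISC services.

        # A minimal sketch of fetching "pictures of data" from a WMS endpoint.
        # The endpoint URL and layer name are hypothetical placeholders.
        from urllib.parse import urlencode
        from urllib.request import urlretrieve

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "precipitation_daily",   # hypothetical layer
            "styles": "", "srs": "EPSG:4326",
            "bbox": "-180,-90,180,90", "width": 1024, "height": 512,
            "format": "image/png",
        }
        url = "https://example.org/wms?" + urlencode(params)
        urlretrieve(url, "precipitation.png")  # saves the rendered map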

  20. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  1. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included under a three-folded governance, to the benefit of interoperability.

  2. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources increase, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, as much in their creation as in their management. The reuse of similar organizational structures is therefore a necessary element in this context. The paper analyses experiences of KOS reuse and points out how the new standards bear on this aspect.

  3. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

    Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that we need effective interoperability between IT products and services to build a truly Digital Society. The Digital Agenda can only be effective if all the elements and applications are interoperable and based on open standards and platforms. In this context, I propose in this article a specific architecture for developing the Romanian National Interoperability Framework.

  4. 75 FR 63462 - Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid...

    Science.gov (United States)

    2010-10-15

    ... Energy Regulatory Commission Smart Grid Interoperability Standards; Notice of Docket Designation for Smart Grid Interoperability Standards October 7, 2010. 1. The Energy Independence and Security Act of... interoperability of smart grid devices and systems, including protocols and model standards for information...

  5. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    In a website, achieving the interoperability of a system is very important. The use of databases based on MySQL, SQL Server or Oracle is already very common in website-based systems. However, the use of such databases cannot guarantee that the interoperability of the system is achieved. Apart from data security, the implementation of the system is also quite difficult. One solution for achieving the interoperability of a website-based system is...

  6. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  7. Datacube Interoperability, Encoding Independence, and Analytics

    Science.gov (United States)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistical and OLAP datacubes, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally are included in the concept of coverages which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Specification Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable, yet format-independent, concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both OGC and INSPIRE Reference Implementation. In the global EarthServer initiative rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are values?), a range set (what are these values?), and range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled
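
    A flavour of WCPS, the datacube analytics language mentioned above, can be given with a short Python sketch; the endpoint, coverage name and subset bounds are hypothetical placeholders, though the query follows published WCPS style.

        # A minimal sketch of sending a WCPS query to a coverage server via the
        # WCS processing extension. Endpoint and coverage name are hypothetical.
        from urllib.parse import urlencode
        from urllib.request import urlopen

        # Subset a hypothetical temperature datacube in time, latitude, longitude.
        wcps_query = (
            'for c in (AvgLandTemp) '
            'return encode(c[ansi("2014-01":"2014-12"), '
            'Lat(40:60), Long(-10:30)], "csv")'
        )

        url = "https://example.org/rasdaman/ows?" + urlencode({
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps_query,
        })
        with urlopen(url) as resp:
            print(resp.read()[:200])  # first bytes of the encoded result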

  8. Metadata behind the interoperability of wireless sensor networks

    NARCIS (Netherlands)

    Ballari, D.E.; Wachowicz, M.; Manso-Callejo, M.A.

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to

  9. Interoperable transactions in business models: A structured approach

    NARCIS (Netherlands)

    Weigand, H.; Verharen, E.; Dignum, F.P.M.

    1996-01-01

    Recent database research has given much attention to the specification of "flexible" transactions that can be used in interoperable systems. Starting from a quite different angle, Business Process Modelling has approached the area of communication modelling as well (the Language/Action

  10. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  11. Proposed Specifications for International Interoperability on Repaired Bomb Damaged Runways

    Science.gov (United States)

    1981-01-01

    ESL-TR-81-03, Proposed Specifications for International Interoperability on Repaired Bomb Damaged Runways. Caldwell, Lapsley R., Lt Col, USAF; Gerardi, Anthony G. In-house report.

  12. Pragmatic Interoperability: A Systematic Review of Published Definitions

    NARCIS (Netherlands)

    Asuncion, C.H.; van Sinderen, Marten J.; Bernus, Peter; Doumeingts, Guy; Fox, Mark

    2010-01-01

    Enabling the interoperability between applications requires agreement in the format and meaning (syntax and semantics) of exchanged data including the ordering of message exchanges. However, today’s researchers argue that these are not enough to achieve a complete, effective and meaningful

  13. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone's or any organization's everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  14. Waveform Diversity and Design for Interoperating Radar Systems

    Science.gov (United States)

    2013-01-01

    University of Pisa, Dipartimento di Ingegneria dell'Informazione: Elettronica, Informatica, Telecomunicazioni, Via Girolamo Caruso 16, 56122 Pisa, Italy. Waveform Diversity and Design for Interoperating Radar Systems.

  15. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Science.gov (United States)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  16. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Many enterprises are still faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium-sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  17. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve McKeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years have seen the rise of very cheap digital storage, both on and off site. With the advent of the 'Internet of Things', people's expectations are for greater interconnectivity and seamless interoperability. The potential impacts these technologies have on healthcare are dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardization will enable.

  18. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  19. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  20. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION..., industry representatives, and service providers. [75 FR 28207, May 20, 2010] ...

  1. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations for the establishment and maintenance of mappings between multiple thesauri, or between thesauri and other types of vocabularies.

  2. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between ICD-11 (International Classification of Diseases, 11th revision, Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10, on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed with the compositional grammar of SCT, together with queries on axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.

  3. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and the Continua Alliance. In particular, we define the interface dependencies and functional requirements needed to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home networks, telehealth platforms, health support applications and software services. Finally, data mining techniques in relation to the proposed architecture are also proposed to enhance the overall AAL experience of the users.

  4. Modeling Reusable and Interoperable Faceted Browsing Systems with Category Theory

    OpenAIRE

    Harris, Daniel R.

    2015-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and lightweight ontol...

  5. Foundations of reusable and interoperable facet models using category theory

    OpenAIRE

    Harris, Daniel R.

    2016-01-01

    Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight onto...

  6. Smart home interoperability: the DomoEsi project approach

    OpenAIRE

    Maestre Torreblanca, José María; Camacho, Eduardo F.

    2009-01-01

    The home automation market is characterized by the great number of systems available to the end user. The recent bubble in the building industry made the situation even worse due to the birth of new proprietary systems. The success of the digital home concept depends on the ease of integration between home automation systems and other consumer electronic equipment pre-existing in the home. In this paper the interoperability issue is addressed and the approach followed in the pr...

  7. Internet of Things Heterogeneous Interoperable Network Architecture Design

    OpenAIRE

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art shows that no mature Internet of Things architecture is available. This thesis contributes the development of an abstract, generic IoT system reference architecture with specifications. The novelties of the thesis are proposed solutions and implementations for scalability, heterogeneous interoperability, security, and the extension of the IoT architecture for rural, poor and catastrophic (RPC) areas. VLC is proposed and proved as one of the suitable internetwork means to o...

  8. Improving interoperability by encouraging the sharing of interface specifications

    OpenAIRE

    Weston, Sally

    2017-01-01

    3D CAD software is vital to record design information. The industry is oligopolistic and despite standards has all the elements associated with a lack of interoperability, namely proprietary software, network effects and lock-in. Interfaces are similar to standards and their indirect effect amplifies their impact and value and distorts the intended intellectual property protection. The distributed machine code is not readable and the restrictions on reverse engineering are tantamount to makin...

  9. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures, forming different types of Networked Enterprises (NE), to match fast-changing market demands. Cloud Computing (CC) is an important, up-to-date computing concept for NE, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  10. Medical Device Plug-and-Play Interoperability Standards & Technology Leadership

    Science.gov (United States)

    2011-10-01

    records and will introduce error resistance into networked medical device systems. We are producing a standardization framework consisting of a...We have also begun collecting data on the issue related to device clock time errors and erroneous data time-stamps in preparation for a White House...advances in mind. We also recognize that, as in all technological advances, interoperability poses safety and medico-legal challenges as well. The

  11. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    [Briefing excerpts: waypoint management (suspend/resume waypoint follow); leader/follower mode and attributes (execution of leader/follower operation, following status, suspend...); V0 capabilities: leader management, leader/follower driver, communicator (i.e., radio messages), platform mode, health monitor, health reporter...; partial interoperable robot attributes selected: use the JAUS messages for 2 payload ports, use the "B" style connectors at 12 VDC, allow the use...]

  12. Data and Semantic Interoperability for the Oceans Sensor Web

    Science.gov (United States)

    Bermudez, L. E.; Bogden, P.; Bridger, E.; Conover, H.; Creager, G.; Forrest, D.; Gale, T.; Graybeal, J.; Howe, B.; Maskey, M.

    2007-12-01

    Ocean observing systems incorporate a spectrum of sensors and data. Making the data available to any interested scientist is important: data sharing and experimental reproducibility are hallmarks of the scientific process. However, different groups may represent, transport, store and distribute their data in different ways, leading to difficulties in sharing these data. OOSTethys, an open-source community effort with involvement from six regional associations and two major research institutes, is exploring the best mechanisms to make ocean data and metadata interoperable by advancing and influencing standards from the Open Geospatial Consortium (OGC), World Wide Web Consortium (W3C) and OASIS. Our strategy to address these challenges has been to envision a service oriented architecture (SOA) which comprises data providers, registries, semantic mediators, aggregators and visualizers. For each component of the system we select the most appropriate standard(s) and create cookbooks and tools to support its implementation. This improves accessibility for data providers with limited time and limited budgets for information technology projects. For example, we have created cookbooks and toolkits in Perl, Java and Python to facilitate implementation of OGC Sensor Observation Services (SOS). The implementation includes publishing metadata in SensorML, and making data available via Geographic Markup Language (GML) records conforming to the Observation and Measurements specifications. A semantic mediator implemented as a web service uses Semantic Web technologies to solve semantic incompatibilities, and enables proper categorization of the different services. Our initial results are positive: in addition to several national demonstrations of data interoperability, the cookbooks have been used to bring more than 60 oceanographic platforms online, and we have at least 7 data consumers relying on web services for their own oceanographic applications. OOSTethys work is an

  13. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    Science.gov (United States)

    2016-12-01

    ...Integration” at Society of Critical Care Medicine Annual Congress, San Francisco, CA, January 21-22, 2014 – Chaired Meetings for US TAG ISO TC 121 on... Award Number: W81XWH-12-C-0154. TITLE: “Enabling Medical Device Interoperability for the Integrated Clinical Environment”. PRINCIPAL INVESTIGATOR: Julian M. Goldman, MD. CONTRACTING ORGANIZATION: Massachusetts General Hospital, Boston, MA 02114. REPORT DATE: December 2016. TYPE OF REPORT: Final.

  14. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  15. Cloud-based Communications Planning Collaboration and Interoperability

    Science.gov (United States)

    2012-06-01

    Marine Expeditionary Force; SaaS: Software as a Service; SOA: Service Oriented Architecture; SPE: Systems Planning and Engineering; SPEED... Software as a Service (SaaS) application to improve processes and products in the field of Marine Corps communications planning through automation... collaboration, and interoperability. It introduces the idea of using the Software as a Service (SaaS) model to develop a cloud-based communications...

  16. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  17. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
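
    To make the idea of smoothing at the image and feature levels concrete, the following sketch applies Gaussian smoothing at both levels before matching. It illustrates the technique only, not the authors' pipeline; the filter widths and random stand-in images are assumptions:

```python
# Illustrative sketch of smoothing at the image and feature levels to reduce
# inter-device variability. Not the paper's exact pipeline; sigma values and
# the random stand-in images are arbitrary placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_filter1d

def smooth_image(hand_image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Image-level smoothing: suppress sensor-specific noise and texture."""
    return gaussian_filter(hand_image.astype(float), sigma=sigma)

def smooth_features(feature_vector: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Feature-level smoothing: damp device-dependent variation along the vector."""
    return gaussian_filter1d(feature_vector.astype(float), sigma=sigma)

# Example: compare acquisitions from two different devices after smoothing.
scanner_img = np.random.rand(128, 128)  # stand-in for a flat-scanner image
webcam_img = np.random.rand(128, 128)   # stand-in for a webcam image
f1 = smooth_features(smooth_image(scanner_img).ravel())
f2 = smooth_features(smooth_image(webcam_img).ravel())
score = float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2)))
print(f"cross-device similarity: {score:.3f}")
```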

  18. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  19. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  20. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    Full Text Available An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage their heart disease, diabetes, and other conditions by sending health parameters such as blood pressure, heart rate, glucose level, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here different medical devices are connected to the patient for monitoring. Each kind of device may be manufactured by a different vendor, and each device's information and communication requirements demand a different installation and network design. This causes design complexity and network overhead when patients are moved for diagnostic examinations; interoperability among devices solves this problem. ISO/IEEE 11073 is an international standard that provides an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the Intensive Care Unit (ICU). This paper presents the issues for an ICU monitoring system and a framework solution for it.
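
    As an illustration of the kind of vendor-neutral record such a framework standardizes, the sketch below defines a common observation structure and one vendor adapter. The field names are assumptions in the spirit of ISO/IEEE 11073, not the standard's actual nomenclature:

```python
# Hypothetical sketch of a vendor-neutral observation record, in the spirit of
# the ISO/IEEE 11073 device data model. Field names are illustrative only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class VitalObservation:
    device_id: str   # unique identifier of the bedside device
    metric: str      # e.g. "heart_rate", "blood_pressure_systolic"
    value: float
    unit: str        # e.g. "bpm", "mmHg"
    timestamp: datetime

def from_vendor_a(raw: dict) -> VitalObservation:
    """Adapter: map one vendor's proprietary payload onto the common record."""
    return VitalObservation(
        device_id=raw["dev"],
        metric=raw["param"].lower(),
        value=float(raw["val"]),
        unit=raw["u"],
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )

obs = from_vendor_a({"dev": "icu-03-monitor", "param": "HEART_RATE",
                     "val": "72", "u": "bpm", "ts": 1425200000})
print(obs)
```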

  1. Achieving interoperability for metadata registries using comparative object modeling.

    Science.gov (United States)

    Park, Yu Rang; Kim, Ju Han

    2010-01-01

    Achieving data interoperability between organizations relies upon agreed meaning and representation (metadata) of data. For managing and registering metadata, many organizations have built metadata registries (MDRs) in various domains based on the international standard for the MDR framework, ISO/IEC 11179. Following this trend, two public MDRs in the biomedical domain have been created, the United States Health Information Knowledgebase (USHIK) and the cancer Data Standards Registry and Repository (caDSR), from the U.S. Department of Health & Human Services and the National Cancer Institute (NCI), respectively. Most MDRs are implemented with ad hoc extensions to satisfy organization-specific needs and to work around semantic and structural limitations of ISO/IEC 11179. As a result, it is difficult to achieve interoperability among multiple MDRs. In this paper, we propose an integrated metadata object model for achieving interoperability among multiple MDRs. To evaluate this model, we developed an XML Schema Definition (XSD)-based metadata exchange format. We created an XSD-based metadata exporter, supporting both the integrated metadata object model and organization-specific MDR formats.
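
    A minimal sketch of such an XSD-style export, assuming hypothetical element names rather than the paper's actual schema, might serialize a registered data element as follows:

```python
# Illustrative sketch of exporting an ISO/IEC 11179-style data element to XML,
# analogous to the XSD-based metadata exchange format described above. The
# element and attribute names are assumptions, not the paper's actual schema.
import xml.etree.ElementTree as ET

def export_data_element(identifier: str, name: str, definition: str,
                        value_domain: str) -> bytes:
    root = ET.Element("DataElement", attrib={"identifier": identifier})
    ET.SubElement(root, "Name").text = name
    ET.SubElement(root, "Definition").text = definition
    ET.SubElement(root, "ValueDomain").text = value_domain
    return ET.tostring(root, encoding="utf-8")

xml_bytes = export_data_element(
    identifier="example-registry:1234",   # hypothetical registry identifier
    name="Patient Birth Date",
    definition="The date on which the patient was born.",
    value_domain="ISO 8601 date",
)
print(xml_bytes.decode())
```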

  2. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research recognized a valuable interoperability service bus (i.e., a set of standard models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. The paper discusses how the GEOSS common infrastructure implements a multidisciplinary and participatory research infrastructure, and introduces a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and to develop a research infrastructure for carrying out integrative research in its specific domain.

  3. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Our objectives are to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors, and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus information managers and IHE...

  4. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Science.gov (United States)

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction, such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology, the Unified Modeling Language (UML), has been used for this development. This research has demonstrated the feasibility of developing agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place.

  5. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are left free of standards, permitting the choice and innovation needed to balance flexibility and usability for interoperability in usage management systems.

  6. Interoperability, Data Control and Battlespace Visualization using XML, XSLT and X3D

    National Research Council Canada - National Science Library

    Neushul, James

    2003-01-01

    This work represents the realization of Network-Centric goals of interoperability, information management, systems integration and cohesive battlespace visualization using networked computer technology...

  7. Standards-based data interoperability in the climate sciences

    Science.gov (United States)

    Woolf, Andrew; Cramer, Ray; Gutierrez, Marta; Kleese van Dam, Kerstin; Kondapalli, Siva; Latham, Susan; Lawrence, Bryan; Lowry, Roy; O'Neill, Kevin

    2005-03-01

    Emerging developments in geographic information systems and distributed computing offer a roadmap towards an unprecedented spatial data infrastructure in the climate sciences. Key to this are the standards developments for digital geographic information being led by the International Organisation for Standardisation (ISO) technical committee on geographic information/geomatics (TC211) and the Open Geospatial Consortium (OGC). These, coupled with the evolution of standardised web services for applications on the internet by the World Wide Web Consortium (W3C), mean that opportunities for both new applications and increased interoperability exist. These are exemplified by the ability to construct ISO-compliant data models that expose legacy data sources through OGC web services. This paper concentrates on the applicability of these standards to climate data by introducing some examples and outlining the challenges ahead. An abstract data model is developed, based on ISO standards, and applied to a range of climate data both observational and modelled. An OGC Web Map Server interface is constructed for numerical weather prediction (NWP) data stored in legacy data files. A W3C web service for remotely accessing gridded climate data is illustrated. Challenges identified include the following: first, both the ISO and OGC specifications require extensions to support climate data. Secondly, OGC services need to fully comply with W3C web services, and support complex access control. Finally, to achieve real interoperability, broadly accepted community-based semantic data models are required across the range of climate data types. These challenges are being actively pursued, and broad data interoperability for the climate sciences appears within reach.
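
    As a rough illustration of the OGC Web Map Server interface mentioned above, the following sketch requests a rendered map of an NWP field; the endpoint and layer name are placeholders:

```python
# Minimal sketch of an OGC Web Map Service GetMap request of the kind the
# paper constructs for NWP data. The endpoint and layer name are hypothetical.
import requests

WMS_ENDPOINT = "http://example.org/wms"  # hypothetical server

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "nwp_surface_temperature",  # assumed layer name
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "800",
    "HEIGHT": "400",
    "FORMAT": "image/png",
}
resp = requests.get(WMS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
with open("nwp_temperature.png", "wb") as f:
    f.write(resp.content)  # save the rendered map image
```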

  8. UMTS network planning, optimization, and inter-operation with GSM

    CERN Document Server

    Rahnema, Moe

    2008-01-01

    UMTS Network Planning, Optimization, and Inter-Operation with GSM is an accessible, one-stop reference to help engineers effectively reduce the time and costs involved in UMTS deployment and optimization. Rahnema includes detailed coverage from both a theoretical and practical perspective on the planning and optimization aspects of UMTS, and a number of other new techniques to help operators get the most out of their networks. Provides an end-to-end perspective, from network design to optimization. Incorporates the hands-on experiences of numerous researchers. Single...

  9. Creating XML/PHP Interface for BAN Interoperability.

    Science.gov (United States)

    Fragkos, Vasileios; Katzis, Konstantinos; Despotou, Georgios

    2017-01-01

    Recent advances in medical and electronic technologies have introduced the use of Body Area Networks (BANs) as a part of e-health, for constant and accurate monitoring of patients and for the transmission and processing of the data to develop a holistic Electronic Health Record. The rising global population, different BAN manufacturers and a variety of medical systems raise the issue of interoperability between BANs and these systems, as well as the question of how to propagate medical data in an organized and efficient manner. In this paper, we describe BANs and propose the use of certain web technologies to address this issue.
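
    A minimal sketch of such an interface, assuming hypothetical element names and endpoint, might serialize a BAN reading as XML and post it to a PHP script:

```python
# Hypothetical sketch of serializing BAN sensor readings to XML for a PHP
# endpoint to ingest, in line with the XML/PHP interface the paper proposes.
# The element names and the endpoint are assumptions for illustration.
import xml.etree.ElementTree as ET
import requests

def ban_reading_to_xml(patient_id: str, sensor: str,
                       value: float, unit: str) -> bytes:
    root = ET.Element("banReading")
    ET.SubElement(root, "patientId").text = patient_id
    ET.SubElement(root, "sensor").text = sensor
    ET.SubElement(root, "value").text = str(value)
    ET.SubElement(root, "unit").text = unit
    return ET.tostring(root, encoding="utf-8")

payload = ban_reading_to_xml("patient-42", "spo2", 97.0, "%")
# A PHP script (e.g. receive.php) could parse this with simplexml_load_string().
requests.post("http://example.org/receive.php", data=payload,
              headers={"Content-Type": "application/xml"}, timeout=10)
```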

  10. Semantic Integration for Marine Science Interoperability Using Web Technologies

    Science.gov (United States)

    Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.

    2008-12-01

    The Marine Metadata Interoperability Project, MMI (http://marinemetadata.org) promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that allow performing important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example according to metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies. Various relationships can be established, for example...
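
    The Voc2RDF component described above is essentially a CSV-to-RDF conversion. A minimal sketch of that idea (not the MMI implementation; the namespace URI is a placeholder) using rdflib and SKOS:

```python
# Sketch of a Voc2RDF-style conversion: a comma-delimited term list becomes an
# RDF vocabulary. Uses rdflib and SKOS; the namespace URI is a placeholder.
import csv, io
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

CSV_TEXT = """term,definition
CTD,Conductivity-temperature-depth instrument
ADCP,Acoustic Doppler current profiler
"""

VOCAB = Namespace("http://example.org/vocab/")  # placeholder namespace

g = Graph()
g.bind("skos", SKOS)
for row in csv.DictReader(io.StringIO(CSV_TEXT)):
    concept = VOCAB[row["term"]]
    g.add((concept, RDF.type, SKOS.Concept))
    g.add((concept, SKOS.prefLabel, Literal(row["term"])))
    g.add((concept, SKOS.definition, Literal(row["definition"])))

print(g.serialize(format="turtle"))
```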

  11. The Interoperability Challenge for the Geosciences: Stepping up from Interoperability between Disciplinary Siloes to Creating Transdisciplinary Data Platforms.

    Science.gov (United States)

    Wyborn, L. A.; Evans, B. J. K.; Trenham, C.; Druken, K. A.; Wang, J.

    2015-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) has collocated over 10 PB of national and international data assets within a HPC facility to create the National Environmental Research Data Interoperability Platform (NERDIP). The data span a wide range of fields, from earth systems and the environment (climate, coasts, oceans, and geophysics) through to astronomy, bioinformatics, and the social sciences. These diverse data collections are collocated on a major data storage node that is linked to a petascale HPC and cloud facility. Users can search across all of the collections and either log in and access the data directly, or access the data via standards-based web services. These collocated petascale data collections are theoretically a massive resource for interdisciplinary science at scales and resolutions never hitherto possible. But once collocated, multiple barriers became apparent that make cross-domain data integration very difficult and often so time consuming that either less ambitious research goals are attempted or the project is abandoned. Incompatible content is only one half of the problem: other showstoppers are differing access models, licences, and issues of ownership of derived products. Brokers can enable interdisciplinary research, but in reality are we just delaying the inevitable? A call to action is required: adopt a transdisciplinary approach at the conception of development of new multi-disciplinary systems, whereby those across all the scientific domains, the humanities, the social sciences and beyond work together to create a unity of informatics platforms that interoperate horizontally across discipline boundaries, and also operate vertically to enable a diversity of people to access data, from high-end researchers to undergraduates, school students and the general public. Once we master such a transdisciplinary approach to our vast global information assets, we will then achieve...

  12. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    Science.gov (United States)

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  13. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO)-level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type of interoperability most directly enabled by an information model.

  14. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    Science.gov (United States)

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem addressed is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in the implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  15. Advancing telemedicine services for the aging population: The challenge of interoperability

    NARCIS (Netherlands)

    van Velsen, Lex Stefan; Solana, Javier; Oude Nijeweme-d'Hollosy, Wendeline; Garate-Barreiro, Francisco; Vollenbroek-Hutten, Miriam Marie Rosé; Sik-Lányi, Cecilia; Hoogerwerf, Evert-Jan; Miesenberger, Klaus; Cudd, Peter

    2015-01-01

    We reflect on our experiences in two projects in which we developed interoperable telemedicine applications for the aging population. While data exchange could be implemented technically, uptake was impeded by a lack of working procedures. We argue that development of interoperable...

  16. Towards Pragmatic Interoperability in the New Enterprise -- A Survey of Approaches

    NARCIS (Netherlands)

    Asuncion, C.H.; van Sinderen, Marten J.; Johnson, Pontus

    Pragmatic interoperability (PI) is the compatibility between the intended versus the actual effect of message exchange. This paper advances PI as a new research agenda within the gamut of enterprise interoperability research. PI is timely in today's new enterprises as it is increasingly important...

  17. CIRCULATING INTER-OPERATIONAL RESERVES: LONG-TIME ERRORS

    Directory of Open Access Journals (Sweden)

    G. A. Kalinkin

    2009-01-01

    Full Text Available The paper shows that, for decades, some researchers have wrongly presented the formula for calculating the change of the circulating inter-operational reserve within a working phase of a flow production line as a formula for the maximum change of that reserve, and it proposes an algorithm for calculating the circulating reserve at a boundary point between working phases of a straight flow line. Having calculated the maximum change of the circulating reserve in each working phase, one can obtain the algebraic sum of the reserve changes, and then the total algebraic sum of the circulating reserve changes at the boundary points from the initial phase to the final one. The boundary point of the phases where the reserve equals zero is found from the minimum value of the algebraic sum of the reserve changes across the phases. The reserve at any boundary point of the phases is then calculated by taking the absolute value of the algebraic sum of the reserve changes from the zero-reserve boundary point to the sought one.
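
    A short sketch of the described calculation, assuming the input is the maximum change of the circulating reserve in each working phase:

```python
# Sketch of the algorithm described above: cumulative sums of per-phase
# reserve changes locate the zero-reserve boundary point, and the reserve at
# any boundary point is the distance from that zero point.
def boundary_reserves(phase_changes):
    """Return the reserve at each boundary point of a straight flow line."""
    # Algebraic (cumulative) sums of reserve changes at boundary points.
    sums, total = [], 0.0
    for change in phase_changes:
        total += change
        sums.append(total)
    # The boundary point with the minimum cumulative sum is where the
    # reserve equals zero.
    zero_point = min(range(len(sums)), key=lambda i: sums[i])
    # Reserve at any boundary point: |cumulative sum - sum at zero point|.
    return [abs(s - sums[zero_point]) for s in sums]

print(boundary_reserves([4.0, -6.0, 3.0, -1.0]))  # -> [6.0, 0.0, 3.0, 2.0]
```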

  18. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

    Full Text Available High quality and comfortable online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes, as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in e-Government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers, etc.

  19. Interoperability prototype between hospitals and general practitioners in Switzerland.

    Science.gov (United States)

    Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar

    2010-01-01

    Interoperability in data exchange has the potential to improve care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, which makes it more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, medium-sized hospitals and general practitioners in particular were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging discharge letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.

  20. PyMOOSE: interoperable scripting in Python for MOOSE

    Directory of Open Access Journals (Sweden)

    Subhasis Ray

    2008-12-01

    Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE. MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.

  1. Interoperability and FAIRness through a novel combination of Web technologies

    Directory of Open Access Journals (Sweden)

    Mark D. Wilkinson

    2017-04-01

    Full Text Available Data in the life sciences are extremely diverse and are stored in a broad spectrum of repositories, ranging from those designed for particular data types (such as KEGG for pathway data or UniProt for protein data) to those that are general-purpose (such as FigShare, Zenodo, Dataverse or EUDAT). These data have widely different levels of sensitivity and security considerations. For example, clinical observations about genetic mutations in patients are highly sensitive, while observations of species diversity are generally not. The lack of uniformity in data models from one repository to another, and in the richness and availability of metadata descriptions, makes integration and analysis of these data a manual, time-consuming task with no scalability. Here we explore a set of resource-oriented Web design patterns for data discovery, accessibility, transformation, and integration that can be implemented by any general- or special-purpose repository as a means to assist users in finding and reusing their data holdings. We show that by using off-the-shelf technologies, interoperability can be achieved at the level of an individual spreadsheet cell. We note that the behaviours of this architecture compare favourably to the desiderata defined by the FAIR Data Principles, and can therefore represent an exemplar implementation of those principles. The proposed interoperability design patterns may be used to improve discovery and integration of both new and legacy data, maximizing the utility of all scholarly outputs.
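
    A core pattern in this architecture is that one resource identifier serves multiple representations through HTTP content negotiation. A minimal sketch, with a hypothetical resource URL:

```python
# Minimal sketch of the resource-oriented pattern: one identifier, multiple
# representations selected by HTTP content negotiation. The URL is hypothetical.
import requests

RESOURCE = "http://example.org/dataset/42"  # hypothetical resource identifier

# A human-facing client asks for HTML; a machine client asks for RDF metadata.
html = requests.get(RESOURCE, headers={"Accept": "text/html"}, timeout=30)
rdf = requests.get(RESOURCE, headers={"Accept": "text/turtle"}, timeout=30)

print(html.headers.get("Content-Type"))  # e.g. an HTML landing page
print(rdf.headers.get("Content-Type"))   # e.g. machine-readable metadata
```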

  2. Designing for Change: Interoperability in a scaling and adapting environment

    Science.gov (United States)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  3. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    R. Roller

    2015-08-01

    Full Text Available Floods represent a permanent risk to the Netherlands in general and to her power supply in particular. Data sharing is essential within this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power-cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.

  4. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  5. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited from many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design, working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  6. Code lists for interoperability - Principles and best practices in INSPIRE

    Science.gov (United States)

    Lutz, M.; Portele, C.; Cox, S.; Murray, K.

    2012-04-01

    Using free text for attribute values when exchanging geoscience data can lead to a number of problems, e.g. because different data providers and consumers use different languages, terminology or spellings. To overcome these issues, well-defined schemes of codes or concepts, known as code lists, are preferred to free text in defining the value domain of an attribute. The "code list" concept is well established in geospatial modelling standards (e.g. ISO 19103); however, it has been used in many different ways. Here we present some considerations relating to code lists and related interoperability requirements in spatial data infrastructures (SDIs), in particular as discussed in the INSPIRE data specifications working groups. These will form the basis for the specification of code list requirements in the INSPIRE Implementing Rules on interoperability of spatial data sets and services, which provide binding legal obligations for EU Member States for the interoperable provision of data related to the environment. Requirements or recommendations for code lists should address the following aspects: Governance: When modeling an application domain, for each feature attribute whose value is a 'term', should we re-use an existing code list or specify a new code list for the SDI initiative? Use of existing code lists is likely to maximize cross-initiative interoperability. Level of obligation: For each use of a code list, what is the level of obligation? Is use of a specified code list(s) mandatory or just recommended? This is particularly important where the specifications carry a legal mandate (as in the case of INSPIRE). Extensibility: Must data providers use only the specified values or may they extend the code list? Are arbitrary extensions allowed or do additional values have to be specialisations of existing values? Specifying values: For each code list, the allowed values have to be specified, either directly in the specification, or by reference to an existing...
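
    The governance, obligation, and extensibility questions above can be made concrete with a small sketch; the class design is purely illustrative, not an INSPIRE artefact:

```python
# Sketch of an extensible code list with a level of obligation, reflecting the
# governance questions raised above. The design is an assumption for
# illustration only.
class CodeList:
    def __init__(self, name, values, extensible=True):
        self.name = name
        self.values = set(values)     # centrally specified values
        self.extensible = extensible  # may data providers add values?

    def register(self, value):
        """Provider-side extension, permitted only if the list is extensible."""
        if not self.extensible:
            raise ValueError(f"{self.name} is closed; '{value}' not allowed")
        self.values.add(value)

    def validate(self, value):
        return value in self.values

# A closed list: only the specified values are legal.
crs = CodeList("CoordinateReferenceSystem", {"EPSG:4326", "EPSG:3857"},
               extensible=False)
assert crs.validate("EPSG:4326") and not crs.validate("EPSG:9999")

# An open list: providers may register specialised values.
rock_type = CodeList("RockType", {"igneous", "sedimentary", "metamorphic"})
rock_type.register("pyroclastic")
```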

  7. The Influence of Information Systems Interoperability on Economic Activity in Poland

    Directory of Open Access Journals (Sweden)

    Ganczar Małgorzata

    2017-12-01

    Full Text Available In this text, I discuss the capabilities and challenges of information systems interoperability. The anticipated and expected result of interoperability is improved provision of public utility services to citizens and companies, by facilitating the provision of such services on the basis of a "single window" principle and by reducing the costs incurred by public administrations, companies, and citizens through more efficient provision of public utility services. In the article, the conceptual framework of interoperability is elaborated upon. Moreover, information systems and public registers for entrepreneurs in Poland serve as examples of whether interoperability can be applied and, if so, whether it fulfils its aims with respect to e-Government services for entrepreneurs.

  8. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    Science.gov (United States)

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains: why has interoperability not been achieved in health care? This paper examines issues encountered and successes achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government-funded interoperable patient management system. The goal of this paper is to provide lessons learned and to propose one possible road map for health care interoperability within private industry, and to show how government can help.

  9. 75 FR 59290 - In the Matter of Certain Liquid Crystal Display Devices and Products Interoperable With the Same...

    Science.gov (United States)

    2010-09-27

    ... COMMISSION In the Matter of Certain Liquid Crystal Display Devices and Products Interoperable With the Same... States after importation of certain liquid crystal display devices and products interoperable with the same...

  10. A web services choreography scenario for interoperating bioinformatics applications.

    Science.gov (United States)

    de Knikker, Remko; Guo, Youjun; Li, Jin-Long; Kwan, Albert K H; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-03-10

    Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web services using a web
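
    A document-style web service exchanges whole XML documents rather than HTML pages. The sketch below imitates such an exchange; the endpoint and message schema are hypothetical, not the actual HAPI services:

```python
# Sketch of a document-style (XML-in, XML-out) web service call like the ones
# in the HAPI workflow. The endpoint and message schema are hypothetical.
import requests
import xml.etree.ElementTree as ET

# Build the request document: a list of microarray spot IDs.
request_doc = ET.Element("analyzeRequest")
spots = ET.SubElement(request_doc, "spotIds")
for spot_id in ["spot-001", "spot-002"]:
    ET.SubElement(spots, "spotId").text = spot_id
payload = ET.tostring(request_doc, encoding="utf-8")

resp = requests.post("http://example.org/services/hapi",  # hypothetical
                     data=payload,
                     headers={"Content-Type": "text/xml"}, timeout=60)
resp.raise_for_status()

# The XML reply can be parsed semantically, unlike the original HTML output.
reply = ET.fromstring(resp.content)
for kw in reply.iter("meshKeyword"):
    print(kw.get("category"), kw.text)
```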

  11. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates...

  12. Test Protocols for Advanced Inverter Interoperability Functions – Main Document

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already
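
    As an illustration of what testing one such function involves, the sketch below checks measured reactive power against a piecewise-linear volt-var curve; the curve points and tolerance are illustrative, not values from the SNL protocols:

```python
# Sketch of a compliance check for one advanced inverter function, volt-var
# control, in the spirit of the test protocols described above. Curve points
# and tolerance are illustrative assumptions.
import numpy as np

# Piecewise-linear volt-var curve: grid voltage (per unit) -> reactive power
# command (fraction of rated VAr capacity).
V_POINTS = np.array([0.90, 0.97, 1.03, 1.10])
Q_POINTS = np.array([1.00, 0.00, 0.00, -1.00])

def expected_q(voltage_pu: float) -> float:
    """Reactive power the curve commands at a given voltage."""
    return float(np.interp(voltage_pu, V_POINTS, Q_POINTS))

def check_point(voltage_pu: float, measured_q: float, tol: float = 0.05) -> bool:
    """Pass if the measured reactive power is within tolerance of the curve."""
    return abs(measured_q - expected_q(voltage_pu)) <= tol

print(expected_q(1.065), check_point(1.065, -0.48))  # -> -0.5 True
```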

  13. A web services choreography scenario for interoperating bioinformatics applications

    Science.gov (United States)

    de Knikker, Remko; Guo, Youjun; Li, Jin-long; Kwan, Albert KH; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-01-01

    Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web

  14. Test Protocols for Advanced Inverter Interoperability Functions - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as

  15. Connectivity, interoperability and manageability challenges in internet of things

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities to enhance people's lives through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and management in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper concludes that IoT architecture and network infrastructure need to be reengineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  16. Internet of Things Heterogeneous Interoperable Network Architecture Design

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art shows that no mature IoT architecture is yet available. The thesis contributes an abstract, generic IoT system reference architecture development with specifications. Novelties of the thesis are the proposed solutions and implementations.... It is proved that reduction of data at a source will result in huge vertical scalability, and indirectly horizontal scalability as well. A second non-functional feature contributes a heterogeneous interoperable network architecture for constrained Things. To eliminate an increasing number of gateways, a Wi-Fi access point...... with Bluetooth and Zigbee (the new access point is called BZ-Fi) is proposed. Co-existence of the Wi-Fi, Bluetooth, and Zigbee network technologies results in interference. To reduce the interference, orthogonal frequency division multiplexing (OFDM) is proposed to be implemented in Bluetooth and Zigbee. The proposed...

  17. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose different providers' services and resources in heterogeneous domains can cooperate. The model is domain-based: it groups each cloud provider's resource nodes into a domain and assigns a trust agent. It distinguishes two roles, cloud customer and cloud server, and designs different strategies for each. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.
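
    One way to read the domain-based idea described above is sketched below in Python: each provider's nodes form a domain whose trust agent blends identity trust with behavior trust observed from past transactions. The weighting scheme and neutral prior are our assumptions for illustration, not the authors' published model.

```python
# Illustrative sketch (not the authors' code) of a per-domain trust agent.
from dataclasses import dataclass, field

@dataclass
class TrustAgent:
    domain: str
    behavior: dict = field(default_factory=dict)  # node -> list of ratings in [0, 1]

    def record(self, node: str, rating: float) -> None:
        self.behavior.setdefault(node, []).append(rating)

    def trust(self, node: str, identity_ok: bool, w: float = 0.6) -> float:
        """Weighted blend of behavior and identity trust (weights assumed)."""
        if not identity_ok:            # identity authentication failed
            return 0.0
        ratings = self.behavior.get(node, [])
        behav = sum(ratings) / len(ratings) if ratings else 0.5  # neutral prior
        return w * behav + (1 - w) * 1.0

agent = TrustAgent("cloud-provider-A")
agent.record("node-1", 0.9)
agent.record("node-1", 0.7)
print(round(agent.trust("node-1", identity_ok=True), 2))  # 0.88
```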

  18. Web services for distributed and interoperable hydro-information systems

    Science.gov (United States)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, a prototype customised for web-based hydro-information systems. T-DSS provides mapping services, database-related services and access to remote components, with special emphasis placed on output flexibility (e.g. multilingualism); SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions) and by modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.

  19. Military Interoperable Digital Hospital Testbed (MIDHT) Phase III

    Science.gov (United States)

    2014-10-01

    nurses in order to protect their identity prevented a sufficient sample size to execute a pairwise analysis. The Sidak formula was used to adjust...

  20. CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability

    Science.gov (United States)

    Claus, Russell; Weitzer, Ilan

    2002-01-01

    Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meet these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one united representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and its proposed application.

  1. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  2. Middleware Interoperability for Robotics: A ROS-YARP Framework

    Directory of Open Access Journals (Sweden)

    Plinio Moreno

    2016-10-01

    Full Text Available Middlewares are fundamental tools for progress in research and applications in robotics. They enable the integration of multiple heterogeneous sensing and actuation devices, as well as providing general purpose modules for key robotics functions (kinematics, navigation, planning). However, no existing middleware yet provides a complete set of functionalities for all robotics applications, and many robots may need to rely on more than one framework. This paper focuses on the interoperability between two of the most prevalent middlewares in robotics: YARP and ROS. Interoperability between middlewares should ideally allow users to execute existing software without the necessity of: (i) changing the existing code, and (ii) writing hand-coded "bridges" for each use-case. We propose a framework enabling the communication between existing YARP modules and ROS nodes for robotics applications in an automated way. Our approach generates the "bridging gap" code from a configuration file, connecting YARP ports and ROS topics through code-generated YARP Bottles. The configuration file must describe: (i) the sender entities, (ii) the way to group and convert the information read from the sender, (iii) the structure of the output message and (iv) the receiving entity. Our choice of many inputs to one output reflects the most common use-case in robotics applications, where examples include filtering, decision making and visualization. We support YARP/ROS and ROS/YARP sender/receiver configurations, which are demonstrated in a humanoid-on-wheels robot that uses YARP for upper body motor control and visual perception, and ROS for mobile base control and navigation algorithms.
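
    The paper's actual configuration format is not reproduced in the abstract, so the sketch below mimics the described many-inputs-to-one-output bridge in plain Python: the configuration keys are invented, and injected callables stand in for YARP ports and ROS topics so the sketch runs anywhere.

```python
# Schematic many-inputs-to-one-output bridge; keys and names are invented.
config = {
    "senders": ["yarp:/robot/camera/left", "yarp:/robot/camera/right"],
    "convert": lambda left, right: {"stereo_pair": (left, right)},
    "receiver": "ros:/perception/stereo",
}

def make_bridge(config, read_port, publish_topic):
    """Build a callable that reads every sender, converts, publishes once."""
    def bridge_once():
        inputs = [read_port(name) for name in config["senders"]]
        message = config["convert"](*inputs)
        publish_topic(config["receiver"], message)
    return bridge_once

# Stand-ins for middleware I/O so the sketch is self-contained.
bridge = make_bridge(config,
                     read_port=lambda name: f"frame-from-{name}",
                     publish_topic=lambda topic, msg: print(topic, msg))
bridge()
```

    Generating the bridge from a declarative description, rather than hand-coding it per use-case, is the design point the paper emphasizes.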

  3. Using ontologies to improve semantic interoperability in health data

    Directory of Open Access Journals (Sweden)

    Harshana Liyanage

    2015-07-01

    Full Text Available The present-day health data ecosystem comprises a wide array of complex heterogeneous data sources. A wide range of clinical, health care, social and other clinically relevant information is stored in these data sources. These data exist either as structured data or as free text. These data are generally individual person-based records, but social care data are generally case-based, and less formal data sources may be shared by groups. The structured data may be organised in a proprietary way or be coded using one of many coding, classification or terminology systems that have often evolved in isolation and were designed to meet the needs of the context in which they were developed. This has resulted in a wide range of semantic interoperability issues that make the integration of data held on these different systems challenging. We present semantic interoperability challenges and describe a classification of these. We propose a four-step process and a toolkit for those wishing to work more ontologically, progressing from the identification and specification of concepts to validating a final ontology. The four steps are: (1) the identification and specification of data sources; (2) the conceptualisation of semantic meaning; (3) defining to what extent routine data can be used as a measure of the process or outcome of care required in a particular study or audit; and (4) the formalisation and validation of the final ontology. The toolkit is an extension of a previous schema created to formalise the development of ontologies related to chronic disease management. The extensions are focused on facilitating rapid building of ontologies for time-critical research studies.

  4. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    Science.gov (United States)

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.

  5. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2009-05-01

    Full Text Available Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; they thus cannot coexist in isolation but require their data to be shared so as to increase productivity. Such systems' interoperability is normally accomplished through mark-up standards, query languages and web services. The literature contains work related to software system interoperability; however, it reports some difficulties, such as the need to use the same platforms and different programming languages, the use of read-only languages and deficiencies in the formalisms used to achieve it. This paper presents a critical review of the advances made regarding heterogeneous software systems' interoperability.

  6. Public Key Infrastructure (PKI) Interoperability: A Security Services Approach to Support Transfer of Trust

    National Research Council Canada - National Science Library

    Hansen, Anthony

    1999-01-01

    .... This thesis defines interoperability as the capacity to support trust through retention of security services across PKI domains at a defined level of assurance and examines the elements of PKI...

  7. Interoperability and future internet for next generation enterprises - editorial and state of the art

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Johnson, Pontus; Doumeingts, Guy

    2013-01-01

    Today's global markets drive enterprises towards closer collaboration with customers, suppliers and partners. Interoperability problems constitute fundamental barriers to such collaboration. A characteristic of modern economic life is the requirement for continuous and rapid change and innovation.

  8. Interoperability and Network-Centric Warfare: US Army Future Force and German Army in 2015

    National Research Council Canada - National Science Library

    Alme, Thorsten

    2005-01-01

    .... Special consideration is given to technical and behavioral interoperability. The monograph also assesses the projected capabilities of the German Bundeswehr in the year 2015 with regard to Network-Centric Warfare (NCW...

  9. 78 FR 50075 - Statewide Communication Interoperability Plan Template and Annual Progress Report

    Science.gov (United States)

    2013-08-16

    ... Collection Request should be forwarded to DHS/NPPD/CS&C/OEC, 245 Murray Lane SW., Mail Stop 0640, Arlington... will assist states in their strategic planning for interoperable and emergency communications while...

  10. Analysis of Jordan's Proposed Emergency Communication Interoperability Plan (JECIP) for Disaster Response

    National Research Council Canada - National Science Library

    Alzaghal, Mohamad H

    2008-01-01

    ... country. It is essential to build a robust and interoperable Information and Communication Technology (ICT) infrastructure before the disaster, which will facilitate patching, restoring and reconstructing it during and after the disaster...

  11. A cloud-based approach for interoperable electronic health records (EHRs).

    Science.gov (United States)

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

    We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system, the cloud health information systems technology architecture (CHISTAR), that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach, which comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
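
    A hedged sketch of the dual-model idea described above: a generic reference-model record constrained by an archetype that defines which clinical attributes are valid. The attribute names and constraint rules are invented for illustration and are not CHISTAR's actual models.

```python
# Toy archetype: constraints over a generic dict-based reference model.
archetype_blood_pressure = {
    "systolic_mmHg": {"type": (int, float), "min": 0, "max": 300},
    "diastolic_mmHg": {"type": (int, float), "min": 0, "max": 200},
}

def validate(record: dict, archetype: dict) -> list:
    """Return a list of archetype violations (empty list means valid)."""
    errors = []
    for attr, rule in archetype.items():
        if attr not in record:
            errors.append(f"missing attribute: {attr}")
            continue
        value = record[attr]
        if not isinstance(value, rule["type"]):
            errors.append(f"{attr}: wrong type")
        elif not rule["min"] <= value <= rule["max"]:
            errors.append(f"{attr}: out of range")
    return errors

print(validate({"systolic_mmHg": 120, "diastolic_mmHg": 80},
               archetype_blood_pressure))   # [] -> conforms to the archetype
print(validate({"systolic_mmHg": 1200}, archetype_blood_pressure))
```

    Keeping the clinical constraints outside the generic data structures is what lets the reference model stay stable while archetypes evolve.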

  12. An application of ETICS Co-Scheduling Mechanism to Interoperability and Compliance Validation of Grid Services

    CERN Document Server

    Ronchieri, Elisabetta; Diez-andino Sancho, Guillermo; DI Meglio, Alberto; Marzolla, Moreno

    2008-01-01

    Grid software projects require infrastructures in order to evaluate interoperability with other projects and compliance with predefined standards. Interoperability and compliance are quality attributes that are expected from all distributed projects. ETICS is designed to automate the investigation of problems of this kind. It integrates well-established procedures, tools and resources in a coherent framework and adapts them to the special needs of these projects. Interoperability and compliance to standards are important quality attributes of software developed for Grid environments where many different parts of an interconnected system have to interact. Compliance to standards is one of the major factors in making sure that interoperating parts of a distributed system can actually interconnect and exchange information. Taking the case of the Grid environment (Foster and Kesselman, 2003), most of the projects that are developing software have not reached the maturity level of other communities yet and have di...

  13. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2013-10-01

    will enable the creation of complete electronic health records and will introduce error resistance into networked medical device systems. We are...technological advances, interoperability poses safety and medico-legal challenges as well. The development of standards and production of

  14. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    National Research Council Canada - National Science Library

    Barlos, Fotis; Hunter, Dan; Krikeles, Basil; McDonough, James

    2007-01-01

    .... Semantic Interoperability (SI) encompasses a broad range of technologies such as data mediation and schema matching, ontology alignment, and context representation that attempt to enable systems to understand each other's semantics...

  15. The ISO/EN 13606 Standard for the Interoperable Exchange of Electronic Health Records

    Directory of Open Access Journals (Sweden)

    Pilar Muñoz

    2011-01-01

    Full Text Available The standardization of Electronic Health Records (EHR) is a crucial factor for ensuring interoperable sharing of health data. During recent decades, a plethora of initiatives, driven by international organizations, has emerged to define the required models describing the exchange of information between EHRs. These models cover different essential characteristics for building interoperable EHRs, such as architecture, methodology, communication, safety or terminology, among others. In this context, the European reference frame for the standardized exchange of EHR is the recently approved ISO/EN 13606 standard. This multi-part standard provides the syntactic and semantic capabilities (through a dual model approach) as well as terminology, security and interface considerations for the standardized exchange of EHR. This paper provides (a) an introduction to the different standardization efforts related to the interoperable exchange of EHR around the world, and (b) a description of how the ISO/EN 13606 standard provides interoperable sharing of clinical information.

  16. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition, and the results therefore sometimes fall short of user expectations. It seems, therefore, that organizations are left with a series of perceived orthogonal requirements when they want to pursue interoperability. They want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted.

  17. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    OpenAIRE

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured on all levels of the interoperability pyramid. This includes common agreements about the visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the included business processes and rules, the types and roles of the documents, a commonly understandable vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  18. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Science.gov (United States)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  19. Proposed Interoperability Readiness Level Assessment for Mission Critical Interfaces During Navy Acquisition

    Science.gov (United States)

    2010-12-01

    ...time-compressed test events and a high cost for OT&E, it is mandatory that OTDs know exactly what parameters of their system must be examined to... considering interoperability at each milestone review, one can gain a sense of the level of interoperability of the system. This will allow the

  20. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2015-10-01

    evaluating a variety of security technologies ranging from RFID access control to security testing products. Recently, there has been a dramatic... (Award Number: W81XWH-09-1-0705)

  1. Federative approach of interoperability at the design/manufacturing interface using ontologies

    OpenAIRE

    Fortineau, Virginie; Paviot, Thomas; Lamouri, Samir

    2011-01-01

    In production enterprises, interoperability between the information systems used is the key to a successful Product Lifecycle Management approach. Despite many research works, the preservation of the information flow along the product life is still problematic because of existing scientific and technological locks. These locks are identified in this paper and a new federative approach to interoperability is proposed, based upon the use of ontologies and semantic web tools.

  2. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  3. The National Flood Interoperability Experiment: Bridging Research and Operations

    Science.gov (United States)

    Salas, F. R.

    2015-12-01

    The National Weather Service's new National Water Center, located on the University of Alabama campus in Tuscaloosa, will become the nation's hub for comprehensive water resources forecasting. In conjunction with its federal partners, the US Geological Survey, Army Corps of Engineers and Federal Emergency Management Agency, the National Weather Service will operationally support both short term flood prediction and long term seasonal forecasting of water resource conditions. By summer 2016, the National Water Center will begin evaluating four streamflow data products at the scale of the NHDPlus river reaches (approximately 2.67 million). In preparation for the release of these products, from September 2014 to August 2015, the National Weather Service partnered with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. to support the National Flood Interoperability Experiment, which included a seven week in-residence Summer Institute in Tuscaloosa for university students interested in learning about operational hydrology and flood forecasting. As part of the experiment, 15-hour forecasts from the operational High Resolution Rapid Refresh atmospheric model were used to drive a three kilometer Noah-MP land surface model loosely coupled to a RAPID river routing model operating on the NHDPlus dataset. This workflow was run every three hours during the Summer Institute and the results were made available to participants pursuing a range of research topics focused on flood forecasting (e.g. reservoir operations, ensemble forecasting, probabilistic flood inundation mapping, rainfall product evaluation, etc.). Although the National Flood Interoperability Experiment was finite in length, it provided a platform through which the academic community could engage federal agencies and vice versa to narrow the gap between research and operations and demonstrate how state of the art research infrastructure, models, services, datasets etc. could be utilized
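
    The following Python schematic shows only the loose coupling of the three-hourly cycle described above (HRRR forcing driving Noah-MP, whose runoff feeds RAPID routing on NHDPlus); the model runners are placeholders, not the experiment's actual code.

```python
# Schematic forecast cycle; each stage sees only the previous stage's output.
import datetime as dt

def fetch_hrrr_forcing(cycle: dt.datetime):
    return f"forcing@{cycle:%Y-%m-%dT%H}"   # stand-in for the forcing download

def run_noah_mp(forcing):
    return f"runoff({forcing})"              # stand-in for the land surface run

def run_rapid(runoff):
    return f"streamflow({runoff})"           # stand-in for routing on NHDPlus

def forecast_cycle(cycle: dt.datetime):
    """One loosely coupled cycle: forcing -> land surface -> routing."""
    return run_rapid(run_noah_mp(fetch_hrrr_forcing(cycle)))

start = dt.datetime(2015, 6, 1, 0)
for k in range(3):                           # run every 3 hours, as in the NFIE
    print(forecast_cycle(start + dt.timedelta(hours=3 * k)))
```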

  4. Author identities an interoperability problem solved by a collaborative solution

    Science.gov (United States)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is crowded, and the right choice is getting more and more complicated. Even though there are more than 15 different systems available, some are still under development and proposed to launch by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute, at the scale of a scientific site including a university with a student education program, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the quite high frequency of position changes during a scientist's life. The required system needed to already contain preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author ID marketplace revealed a high risk of additional workload for the researchers themselves or for the administration, because individuals need to register an ID for themselves, or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they have high-quality catalogs with person identities already available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to match 60% of all scientists on the first pass. An additional advantage is that librarians can finalize the identity system in a kind of background process. The Kiel Data Management Infrastructure initiated a web service

  5. Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)

    Science.gov (United States)

    Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.

    2013-12-01

    Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface but annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe trades at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138-1128 billion €, which represents 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the aboveground and belowground compartments in tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by the University of Ca'Foscari in Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot, phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a habitat suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the expected average coverage). The presentation discusses such a framework, explaining how the input data is discovered, accessed and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize results.

  6. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between clinical care and research domains, this paper introduces a unified methodology and supporting framework that bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
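
    The rdflib sketch below illustrates how a Common Data Element might be published as a Linked Open Data resource with semantic links into a terminology system and to another CDE, as the approach above implies; the URIs and predicates are placeholders of ours, not the SALUS vocabulary.

```python
# Sketch of a CDE as a Linked Open Data resource (URIs are hypothetical).
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS, SKOS

MDR = Namespace("http://example.org/mdr/")
SCT = Namespace("http://snomed.info/id/")

g = Graph()
cde = MDR["cde/systolic-blood-pressure"]
g.add((cde, RDFS.label, Literal("Systolic blood pressure")))
# Semantic link into a terminology system (a SNOMED CT concept).
g.add((cde, SKOS.exactMatch, SCT["271649006"]))
# Semantic link to a related CDE, possibly held in another registry.
g.add((cde, SKOS.related, MDR["cde/diastolic-blood-pressure"]))

print(g.serialize(format="turtle"))
```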

  7. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    Science.gov (United States)

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as well as those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  8. Meeting people's needs in a fully interoperable domotic environment.

    Science.gov (United States)

    Miori, Vittorio; Russo, Dario; Concordia, Cesare

    2012-01-01

    The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes 'invisible', as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users' quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today's incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.

  9. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  10. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independently of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  11. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
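
    A minimal sketch of the kind of remote, standards-based access described above, using the netCDF4 library's OPeNDAP support; the URL and variable names are hypothetical placeholders, not a real testbed endpoint.

```python
# Remote access to a CF-compliant dataset via OPeNDAP (URL is hypothetical).
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/ocean_model_aggregation"
ds = Dataset(url)                        # opens the remote dataset lazily

# CF conventions let tools find variables by standard_name rather than
# by model-specific variable names.
for name, var in ds.variables.items():
    if getattr(var, "standard_name", "") == "sea_water_temperature":
        print(name, var.shape, getattr(var, "units", "?"))
        subset = var[0, 0, :10, :10]     # only this slice crosses the network
        print(subset.mean())
ds.close()
```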

  12. Providing trust and interoperability to federate distributed biobanks.

    Science.gov (United States)

    Lablans, Martin; Bartholomäus, Sebastian; Uckert, Frank

    2011-01-01

    Biomedical research requires large numbers of well annotated, quality-assessed samples which often cannot be provided by a single biobank. Connecting biobanks, researchers and service providers raises numerous challenges, including trust among partners and towards the infrastructure, as well as interoperability problems. Therefore we develop a holistic, open-source and easy-to-use IT infrastructure. Our federated approach allows partners to reflect their organizational structures and protect their data sovereignty. The search service and the contact arrangement processes increase data sovereignty without stigmatizing partners who reject a specific cooperation. The infrastructure supports daily processes with an integrated basic sample manager and user-definable electronic case report forms. Interfaces for existing IT systems avoid re-entering data. Moreover, resource virtualization is supported to make underutilized resources of some partners accessible to those with insufficient equipment, for mutual benefit. The functionality of the resulting infrastructure is outlined in a use case to demonstrate collaboration within a translational research network. Compared to other existing or upcoming infrastructures, our approach ultimately has the same goals, but relies on gentle incentives rather than top-down imposed progress.

  13. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology, in its many forms, during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.

  14. Health level seven interoperability strategy: big data, incrementally structured.

    Science.gov (United States)

    Dolin, R H; Rogers, B; Jaffe, C

    2015-01-01

    Describe how the HL7 Clinical Document Architecture (CDA), a foundational standard in US Meaningful Use, contributes to a "big data, incrementally structured" interoperability strategy, whereby incrementally structuring data gets large amounts of data flowing faster. We present cases showing how this approach is leveraged for big data analysis. To support the assertion that semi-structured narrative in CDA format can be a useful adjunct in an overall big data analytic approach, we present two case studies. The first assesses an organization's ability to generate clinical quality reports using coded data alone vs. coded data supplemented by CDA narrative. The second leverages CDA to construct a network model for referral management, from which additional observations can be gleaned. The first case shows that coded data supplemented by CDA narrative resulted in significant variances in calculated performance scores. In the second case, we found that the constructed network model enables the identification of differences in patient characteristics among different referral workflows. The CDA approach goes after data indirectly, by focusing first on the flow of narrative, which is then incrementally structured. A quantitative assessment of whether this approach will lead to a greater flow of data and ultimately a greater flow of structured data vs. other approaches is planned as a future exercise. Along with growing adoption of CDA, we are now seeing the big data community explore the standard, particularly given its potential to supply analytic engines with volumes of data previously not possible.
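
    A toy illustration of the "incrementally structured" idea: coded entries and free-text narrative travel together in a CDA-style section, so coded data can be analyzed immediately while the narrative is structured later. The snippet below is schematic, not a conformant CDA document.

```python
# Parse a CDA-like section: structured entries now, narrative later.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<section>
  <text>Patient reports improved exercise tolerance since last visit.</text>
  <entry>
    <code code="8480-6" codeSystem="LOINC" displayName="Systolic BP"/>
    <value value="128" unit="mmHg"/>
  </entry>
</section>
""")

narrative = doc.findtext("text")
for entry in doc.findall("entry"):
    code = entry.find("code").get("displayName")
    value = entry.find("value")
    print(code, value.get("value"), value.get("unit"))  # analyzable today
print("narrative to structure later:", narrative)       # big data tomorrow
```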

  15. Adaptation of interoperability standards for cross domain usage

    Science.gov (United States)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, the challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural as well as historical influences. Most of the time, information is produced and stored digitally, and one of the biggest challenges is to extract relevant, readable information applicable to a specific problem from a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions such as specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is making civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapting standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  16. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Science.gov (United States)

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.

  17. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Directory of Open Access Journals (Sweden)

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.
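
    Since the abstract does not spell out the exact indices used, here is a minimal sketch of two textbook HRV statistics (SDNN and RMSSD) of the kind a complex-domain module might compute from Holter RR intervals; the sample values are invented.

```python
# Two standard time-domain HRV indices computed from RR intervals (ms).
from statistics import mean, stdev

def sdnn(rr_intervals_ms):
    """SDNN: standard deviation of normal-to-normal RR intervals (ms)."""
    return stdev(rr_intervals_ms)

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR differences (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return mean(d * d for d in diffs) ** 0.5

rr = [812, 790, 804, 830, 818, 795]   # example RR intervals from a Holter ECG
print(f"SDNN = {sdnn(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms")
```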

  18. Standardized headings as a foundation for semantic interoperability in EHR

    Directory of Open Access Journals (Sweden)

    Halilovic Amra

    2016-01-01

    Full Text Available The new Swedish Patient Act, which allows patients to choose health care in county councils other than their own, creates the need to be able to share health-related information contained in electronic health records (EHRs) across county councils. This demands interoperability in terms of structured and standardized data. Headings in EHRs could also be part of structured and standardized data. The aim was to study to what extent terminology is shared and standardized across county councils in Sweden. Headings from three county councils were analyzed to see to what extent they were shared and to what extent they corresponded to concepts in SNOMED CT and the National Board of Health and Welfare's term dictionary (NBHW's TD). In total, 41% of the headings were shared across two or three county councils. A third of the shared headings corresponded to concepts in SNOMED CT. Further, an eighth of the shared headings corresponded to concepts in NBHW's TD. The results showed that the extent of shared and standardized terminology in terms of headings across the three studied county councils was negligible.

  19. Modular analytics management architecture for interoperability and decision support

    Science.gov (United States)

    Marotta, Stephen; Metzger, Max; Gorman, Joe; Sliva, Amy

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture is a new approach to information fusion and decision support systems. By combining cognitive systems engineering organizational analysis tools, such as decision trees, with the Dual Node Network (DNN) technical architecture for information fusion, the DNDW can align relevant data and information products with an organization's decision-making processes. In this paper, we present the Compositional Inference and Machine Learning Environment (CIMLE), a prototype framework based on the principles of the DNDW architecture. CIMLE provides a flexible environment in which heterogeneous data sources, messaging frameworks, and analytic processes can interoperate to provide the specific information required for situation understanding and decision making. It was designed to support the creation of modular, distributed solutions rather than large monolithic systems. With CIMLE, users can repurpose individual analytics to address evolving decision-making requirements or to adapt to new mission contexts; CIMLE's modular design simplifies integration with new host operating environments. CIMLE's configurable system design enables model developers to build analytical systems that closely align with organizational structures and processes and support the organization's information needs.
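
    A minimal sketch of the modular principle described above, assuming nothing about CIMLE's real interfaces: analytics are registered as interchangeable stages in a pipeline, so individual modules can be swapped or repurposed as decision-making requirements evolve.

        from typing import Callable, Dict, List

        class AnalyticPipeline:
            """Registry of independent analytics applied in sequence."""
            def __init__(self) -> None:
                self._analytics: List[Callable[[Dict], Dict]] = []

            def register(self, analytic: Callable[[Dict], Dict]) -> None:
                self._analytics.append(analytic)   # modules stay independent

            def run(self, message: Dict) -> Dict:
                for analytic in self._analytics:   # each stage enriches the message
                    message = analytic(message)
                return message

        def geolocate(msg: Dict) -> Dict:
            msg["location"] = "grid-7A"            # placeholder enrichment
            return msg

        def score_threat(msg: Dict) -> Dict:
            msg["threat_score"] = 0.42             # placeholder model output
            return msg

        pipeline = AnalyticPipeline()
        pipeline.register(geolocate)               # stages can be reordered or
        pipeline.register(score_threat)            # replaced per mission context
        print(pipeline.run({"report": "sensor contact"}))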

  20. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garica, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Introduction: Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.
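
    The "black box" composition can be sketched as follows, with every component honoring one standardized interface so that any conforming implementation can be plugged in; the class and message names are illustrative, not taken from the cited architecture.

        from abc import ABC, abstractmethod

        class TelemedicineComponent(ABC):
            @abstractmethod
            def handle(self, message: dict) -> dict:
                """Consume a standardized message and return a standardized reply."""

        class VitalSignsMonitor(TelemedicineComponent):
            def handle(self, message: dict) -> dict:
                return {"type": "vitals", "heart_rate_bpm": 72}  # placeholder reading

        class RecordStore(TelemedicineComponent):
            def handle(self, message: dict) -> dict:
                return {"type": "ack", "stored": message}        # placeholder persistence

        # Components compose 'lego-like' because each honors the same interface.
        monitor, store = VitalSignsMonitor(), RecordStore()
        print(store.handle(monitor.handle({"type": "poll"})))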

  1. Interoperable Solar Data and Metadata via LISIRD 3

    Science.gov (United States)

    Wilson, A.; Lindholm, D. M.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2015-12-01

    LISIRD 3 is a major upgrade of the LASP Interactive Solar Irradiance Data Center (LISIRD), which serves several dozen space-based solar irradiance and related data products to the public. Through interactive plots, LISIRD 3 provides data browsing supported by data subsetting and aggregation. Incorporating a semantically enabled metadata repository, LISIRD 3 users see current, vetted, consistent information about the datasets offered. Users can now also search for datasets based on metadata fields such as dataset type and/or spectral or temporal range. This semantic database enables metadata browsing, so users can discover the relationships between datasets, instruments, spacecraft, missions and PIs. The database also enables creation and publication of metadata records in a variety of formats, such as SPASE or ISO, making these datasets more discoverable. The database also enables the possibility of a public SPARQL endpoint, making the metadata browsable in an automated fashion. LISIRD 3's data access middleware, LaTiS, provides dynamic, on-demand reformatting of data and timestamps, subsetting and aggregation, and other server-side functionality via a RESTful OPeNDAP-compliant API, enabling interoperability between LASP datasets and many common tools. LISIRD 3's templated front-end design, coupled with the uniform data interface offered by LaTiS, allows easy integration of new datasets. Consequently the number and variety of datasets offered by LISIRD has grown to encompass several dozen, with many more to come. This poster will discuss the design and implementation of LISIRD 3, including tools used, capabilities enabled, and issues encountered.
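
    A RESTful, OPeNDAP-style interface of this kind can be exercised with a plain HTTP request. The sketch below assumes a LaTiS-like URL pattern in which the dataset name, output suffix and time constraints are expressed in the URL; the base URL and dataset identifier shown are placeholders, not guaranteed endpoints.

        import requests

        BASE = "https://lasp.colorado.edu/lisird/latis/dap"  # assumed endpoint layout
        DATASET = "example_irradiance"                       # hypothetical dataset id

        # The suffix selects the output format; constraints subset on the server.
        url = f"{BASE}/{DATASET}.csv?time>=2015-01-01&time<2015-02-01"
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        print(resp.text.splitlines()[:5])  # header plus the first few records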

  2. MPEG-4 IPMP Extension for Interoperable Protection of Multimedia Content

    Directory of Open Access Journals (Sweden)

    Zeng Wenjun

    2004-01-01

    Full Text Available To ensure secure content delivery, the Motion Picture Experts Group (MPEG) has dedicated significant effort to digital rights management (DRM) issues. MPEG is now moving from defining only hooks to proprietary systems (e.g., in MPEG-2, MPEG-4 Version 1) to specifying a more encompassing standard in intellectual property management and protection (IPMP). MPEG feels that this is necessary in order to achieve MPEG's most important goal: interoperability. The design of the IPMP Extension framework also considers the complexity of the MPEG-4 standard and the diversity of its applications. This architecture leaves the details of the design of IPMP tools in the hands of application developers, while ensuring maximum flexibility and security. This paper first briefly describes the background of the development of the MPEG-4 IPMP Extension. It then presents an overview of the MPEG-4 IPMP Extension, including its architecture, the flexible protection signaling, and the secure messaging framework for the communication between the terminal and the tools. Two sample usage scenarios are also provided to illustrate how an MPEG-4 IPMP Extension compliant system works.

  3. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  4. Multispectral iris fusion for enhancement, interoperability, and cross wavelength matching

    Science.gov (United States)

    Burge, Mark J.; Monaco, Matthew K.

    2009-05-01

    Traditionally, only a narrow band of the Near-Infrared (NIR) spectrum (700-900 nm) is utilized for iris recognition since this alleviates any physical discomfort from illumination, reduces specular reflections and increases the amount of texture captured for some iris colors. However, previous research has shown that matching performance is not invariant to iris color and can be improved by imaging outside of the NIR spectrum. Building on this research, we demonstrate that iris texture increases with the frequency of the illumination for lighter-colored sections of the iris and decreases for darker sections. Using registered visible light and NIR iris images captured using a single-lens multispectral camera, we illustrate how physiological properties of the iris (e.g., the amount and distribution of melanin) impact the transmission, absorbance, and reflectance of different portions of the electromagnetic spectrum and consequently affect the quality of the imaged iris texture. We introduce a novel iris code, Multispectral Enhanced irisCode (MEC), which uses pixel-level fusion algorithms to exploit texture variations elicited by illuminating the iris at different frequencies, to improve iris matcher performance and reduce Failure-To-Enroll (FTE) rates. Finally, we present a model for approximating an NIR iris image using features derived from the color and structure of a visible light iris image. The simulated NIR images generated by this model are designed to improve the interoperability between legacy NIR iris images and those acquired under visible light by enabling cross wavelength matching of NIR and visible light iris images.
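
    The pixel-level fusion idea can be sketched with a per-pixel weight derived from local texture, so that whichever band carries richer iris texture at a given location dominates the fused image. This toy stand-in (plain NumPy, placeholder images assumed pre-registered) only illustrates the principle, not the MEC algorithm itself.

        import numpy as np

        def local_texture(img: np.ndarray, k: int = 5) -> np.ndarray:
            """Local variance in a k x k window as a crude texture score."""
            pad = np.pad(img.astype(float), k // 2, mode="edge")
            out = np.empty(img.shape, dtype=float)
            for i in range(img.shape[0]):
                for j in range(img.shape[1]):
                    out[i, j] = pad[i:i + k, j:j + k].var()
            return out

        def fuse(visible: np.ndarray, nir: np.ndarray) -> np.ndarray:
            wv, wn = local_texture(visible), local_texture(nir)
            w = wv / (wv + wn + 1e-9)      # weight toward the better-textured band
            return (w * visible + (1.0 - w) * nir).astype(np.uint8)

        rng = np.random.default_rng(0)     # placeholder images, assumed registered
        vis = rng.integers(0, 256, (64, 64), dtype=np.uint8)
        nir = rng.integers(0, 256, (64, 64), dtype=np.uint8)
        print(fuse(vis, nir).shape)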

  5. Efficiency Analysis of an Interoperable Healthcare Operations Platform.

    Science.gov (United States)

    Osborne, Thomas F; Clark, Reese H; Blackowiak, Jason; Williamson, Patrick J; Werb, Shannon M; Strong, Benjamin W

    2017-04-01

    (1) Develop an enterprise platform to unify isolated information, software applications and team members. (2) Assess the efficiency of one benefit of the platform through comparative testing of employee document retrieval times. (3) Evaluate the level of satisfaction among our target audience. We developed an infrastructure to integrate information throughout our practice and make it available on a unified, secure, and remotely accessible platform. We solicited our practice for volunteers to test the new system. All interested volunteers participated. Thirteen employees searched for the same four items in both the new system and our legacy systems. Testing was performed in the pre-deployment stage. In our evaluation, we introduced an innovative method to precisely and objectively obtain data through the use of a widely available tool, which could be leveraged for a variety of other studies. On average, it took our participants 7 min and 48 s to find the four assigned items in our legacy systems. It took our volunteers only 1 min and 1 s to find the same items with the new platform (p = 0.002). On a 10-point satisfaction scale, participants rated the new system at 8.7, compared with 6.3 for the traditional system. An overarching enterprise platform is critical due to its ability to unify otherwise isolated applications, people and documents. Because navigating a new system would be expected to take longer than a familiar one, we were surprised by the dramatically improved efficiency and satisfaction of our new interoperable platform compared to the status quo. Since this platform was evaluated in the pre-deployment stage, we expect results to improve with employee experience as well as ongoing enhancements.
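
    The paired comparison behind such a result can be reproduced with a few lines of analysis code. The per-participant times below are invented placeholders (the abstract reports only aggregate results), and a Wilcoxon signed-rank test is one reasonable choice for thirteen paired measurements.

        from scipy import stats

        # Hypothetical per-participant retrieval times in seconds (13 employees).
        legacy_s = [512, 430, 470, 389, 505, 460, 498, 472, 445, 530, 488, 410, 455]
        new_s    = [ 58,  62,  70,  49,  66,  55,  61,  64,  57,  72,  60,  52,  63]

        stat, p = stats.wilcoxon(legacy_s, new_s)  # paired, non-parametric test
        print(f"Wilcoxon statistic={stat}, p={p:.4f}")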

  6. Capturing Sensor Metadata for Cross-Domain Interoperability

    Science.gov (United States)

    Fredericks, J.

    2015-12-01

    capabilities will form the foundation for interoperable architectures designed to integrate and document observational data, thereby fostering their reproducibility.

  7. Interoperable Access to NCAR Research Data Archive Collections

    Science.gov (United States)

    Schuster, D.; Ji, Z.; Worley, S. J.; Manross, K.

    2014-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA) provides free access to 600+ observational and gridded dataset collections. The RDA is designed to support atmospheric and related sciences research, updated frequently where datasets have ongoing production, and serves data to 10,000 unique users annually. The traditional data access options include web-based direct archive file downloads, user-selected data subsets and format conversions produced by server-side computations, and client and cURL-based APIs for routine scripted data retrieval. To enhance user experience and utility, the RDA now also offers THREDDS Data Server (TDS) access for many highly valued dataset collections. TDS-offered datasets are presented as aggregations, enabling users to access an entire dataset collection, which may comprise thousands of files, through a single virtual file. The OPeNDAP protocol, supported by the TDS, allows compatible tools to open and access these virtual files remotely, and makes the native data file format transparent to the end user. The combined functionality (TDS/OPeNDAP) gives users the ability to browse, select, visualize, and download data from a complete dataset collection without having to transfer archive files to a local host. This presentation will review the TDS basics and describe the specific TDS implementation on the RDA's diverse archive of GRIB-1, GRIB-2, and gridded NetCDF formatted dataset collections. Potential future TDS implementation on in-situ observational dataset collections will be discussed. Illustrative sample cases will be used to highlight the end-user benefits from this interoperable data access to the RDA.
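
    In practice, TDS/OPeNDAP access means a client opens one aggregation URL and subsets it lazily, so only the requested slice crosses the network. The sketch below uses xarray against a placeholder URL and variable name; the RDA's actual catalog paths are not reproduced here.

        import xarray as xr

        # Placeholder aggregation URL; a real endpoint would be found by browsing
        # the server's THREDDS catalog.
        url = "https://rda.example.edu/thredds/dodsC/aggregations/example_dataset"

        ds = xr.open_dataset(url)                    # lazy: reads metadata only
        subset = ds["air_temperature"].sel(          # hypothetical variable name
            time=slice("2010-01-01", "2010-01-31"),  # only this slice is fetched
            lat=slice(30, 60),
            lon=slice(-130, -60),
        )
        print(float(subset.mean()))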

  8. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study regarding a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends the current knowledge on business interoperability and addresses an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the network of companies that the two companies belong to (the network effect). In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and better understand how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.

  9. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    Science.gov (United States)

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats for achieving data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant to different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in converting between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
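
    Conceptually, mediation through stored mappings can be sketched as a bridge table that pairs source-standard element paths with target-standard paths. The paths and values below are simplified placeholders, not actual CDA or vMR structures, and the real MBO is an ontology rather than a flat dictionary.

        # Hypothetical CDA -> vMR element mappings standing in for the MBO.
        MBO_MAPPINGS = {
            "recordTarget/patient/name": "patient/demographics/name",
            "component/observation/code": "observationResult/code",
            "component/observation/value": "observationResult/value",
        }

        def mediate(record: dict, mappings: dict) -> dict:
            """Rewrite a flat {path: value} record from source to target standard."""
            out = {}
            for src_path, value in record.items():
                tgt_path = mappings.get(src_path)
                if tgt_path is not None:      # unmapped elements would be flagged
                    out[tgt_path] = value
            return out

        cda_record = {
            "recordTarget/patient/name": "Jane Doe",
            "component/observation/code": "8867-4",   # LOINC code for heart rate
            "component/observation/value": "72 bpm",
        }
        print(mediate(cda_record, MBO_MAPPINGS))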

  10. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its four associated principles. These XOD principles call for reusing existing terms and semantic relations from reliable ontologies, developing and applying well-established ontology design patterns (ODPs), and involving community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data being Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  11. Towards a Holistic Approach to Policy Interoperability in Digital Libraries and Digital Repositories

    Directory of Open Access Journals (Sweden)

    Perla Innocenti

    2011-03-01

    Full Text Available Underpinning every digital library and digital repository there is a policy framework, which makes the digital library viable; without a policy framework a digital library is little more than a container for content. Policy governs how a digital library is instantiated and run. It is therefore a meta-domain which is situated both outside the digital library and any technologies used to deliver it, and within the digital library itself. Policy is also a key aspect of digital library and digital repository interoperability in a common and integrated information space. Policy interoperability - that is, the exchange and reuse of policies - is a step beyond policy standardisation. Furthermore, effective and efficient policy frameworks are also one of the Digital Curation Centre (DCC), DigitalPreservationEurope (DPE), nestor and Center for Research Libraries (CRL) core criteria for digital repositories. In this article, we share our research on policy interoperability levels and the experimental survey on policy interoperability conducted with real-life digital libraries, as a contribution towards the definition of a Policy Interoperability Framework.

  12. The long road to semantic interoperability in support of public health: experiences from two states.

    Science.gov (United States)

    Dixon, Brian E; Vreeman, Daniel J; Grannis, Shaun J

    2014-06-01

    Proliferation of health information technologies creates opportunities to improve clinical and public health, including high quality, safer care and lower costs. To maximize such potential benefits, health information technologies must readily and reliably exchange information with other systems. However, evidence from public health surveillance programs in two states suggests that operational clinical information systems often fail to use available standards, a barrier to semantic interoperability. Furthermore, analysis of existing policies incentivizing semantic interoperability suggests they have limited impact and are fragmented. In this essay, we discuss three approaches for increasing semantic interoperability to support national goals for using health information technologies. A clear, comprehensive strategy requiring collaborative efforts by clinical and public health stakeholders is suggested as a guide for the long road towards better population health data and outcomes. Published by Elsevier Inc.

  13. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
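
    The described pipeline (extract, populate interoperability data objects, transform via a Model View Definition mapping, emit the target file) can be sketched as below; the entity, attribute and field names are illustrative only, not taken from the patent.

        from dataclasses import dataclass

        @dataclass
        class InteropObject:
            """Holds extracted building data plus its attributes."""
            name: str
            attributes: dict

        def extract(ifc_entities: list) -> list:
            return [InteropObject(e["name"], e["attrs"]) for e in ifc_entities]

        def transform(obj: InteropObject, mvd_template: dict) -> dict:
            # The MVD template maps each source attribute to a target field name.
            return {mvd_template[k]: v
                    for k, v in obj.attributes.items() if k in mvd_template}

        ifc = [{"name": "Wall-01", "attrs": {"Height": 3.0, "Area": 12.5}}]
        mvd = {"Height": "wall_height_m", "Area": "wall_area_m2"}  # hypothetical map
        for obj in extract(ifc):
            print(obj.name, transform(obj, mvd))  # rows for the simulation input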

  14. A Cloud Interoperability Broker (CIB) for data migration in SaaS

    Directory of Open Access Journals (Sweden)

    Hassan Ali

    2016-12-01

    Full Text Available Cloud computing is becoming increasingly popular. Information technology market leaders, e.g., Microsoft, Google, and Amazon, are extensively shifting toward cloud-based solutions. However, there is isolation in the cloud implementations provided by the cloud vendors. Limited interoperability can lock a user into a single cloud provider; thus, a required migration of an application or data from one cloud provider to another may necessitate a significant effort and/or full-cycle redevelopment to fit the new provider's standards and implementation. The ability to move from one cloud vendor to another would be a step toward advancing cloud computing interoperability and increasing customer trust. This study proposes a cloud broker solution to fill the interoperability gap between different software-as-a-service providers. The proposed cloud broker was implemented and tested on a real enterprise application dataset. The migration process completed and worked correctly according to a specified mapping model.
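
    The broker pattern can be sketched as an adapter layer: each provider is wrapped behind a common export/import contract, and a mapping model translates field names, so the broker can move data between any two providers. All class, method and field names below are illustrative, not from the cited system.

        from abc import ABC, abstractmethod

        class ProviderAdapter(ABC):
            @abstractmethod
            def export_records(self) -> list: ...
            @abstractmethod
            def import_records(self, records: list) -> None: ...

        class SourceSaaS(ProviderAdapter):
            def export_records(self) -> list:
                return [{"CustName": "Acme", "CustCity": "Lisbon"}]  # placeholder
            def import_records(self, records: list) -> None:
                raise NotImplementedError("source is export-only in this sketch")

        class TargetSaaS(ProviderAdapter):
            def __init__(self) -> None:
                self.store: list = []
            def export_records(self) -> list:
                return self.store
            def import_records(self, records: list) -> None:
                self.store.extend(records)

        FIELD_MAP = {"CustName": "customer_name", "CustCity": "city"}  # mapping model

        def migrate(src: ProviderAdapter, dst: ProviderAdapter) -> None:
            dst.import_records([{FIELD_MAP[k]: v for k, v in r.items()}
                                for r in src.export_records()])

        a, b = SourceSaaS(), TargetSaaS()
        migrate(a, b)
        print(b.store)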

  15. Interoperable Medical Instrument Networking and Access System with Security Considerations for Critical Care

    Directory of Open Access Journals (Sweden)

    Deniz Gurkan

    2010-01-01

    Full Text Available The recent influx of electronic medical records in the health care field, coupled with the need to provide continuous care to patients in the critical care environment, has driven the need for interoperability of medical devices. Open standards are needed to support flexible processes and interoperability of medical devices, especially in intensive care units. In this paper, we present an interoperable networking and access architecture based on the CAN protocol. Predictability of the delay of medical data reports is a desirable attribute that can be realized using a tightly-coupled system architecture. Our simulations of the network architecture demonstrate that a bounded delay for event reports offers predictability. In addition, we address security issues related to the storage of electronic medical records. We present a set of open source tools and tests to identify security breaches, and appropriate measures that can be implemented to be compliant with the HIPAA rules.
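
    For a flavor of what CAN-based instrument messaging looks like, the sketch below publishes a single vital-sign reading with the python-can package (4.x API). The arbitration ID, the one-byte payload layout and the Linux virtual 'vcan0' channel are all assumptions for illustration.

        import can

        HEART_RATE_ID = 0x101  # hypothetical arbitration ID for heart-rate reports

        def send_heart_rate(bus: can.BusABC, bpm: int) -> None:
            msg = can.Message(arbitration_id=HEART_RATE_ID,
                              data=[bpm & 0xFF],   # device-defined 1-byte payload
                              is_extended_id=False)
            bus.send(msg)  # CAN's priority arbitration is what bounds the delay

        with can.interface.Bus(channel="vcan0", interface="socketcan") as bus:
            send_heart_rate(bus, 72)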

  16. Assessment of collaboration and interoperability in an information management system to support bioscience research.

    Science.gov (United States)

    Myneni, Sahiti; Patel, Vimla L

    2009-11-14

    Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the levels of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the "must-have" attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation.

  17. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantical aspects of 3D city models, its structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has been adopted worldwide: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. Furthermore, its

  18. Cross-domain Collaborative Research and People Interoperability: Beyond Knowledge Representation Frameworks

    Science.gov (United States)

    Fox, P. A.; Diviacco, P.; Busato, A.

    2016-12-01

    Geo-scientific research collaboration commonly faces complex systems where multiple skills and competences are needed at the same time. The efficacy of such collaboration among researchers then becomes of paramount importance. Multidisciplinary studies draw from domains that are far from each other. Researchers also need to understand how to extract the data they need and eventually produce something that can be used by others. The management of information and knowledge in this perspective is non-trivial. Interoperability is frequently sought in computer-to-computer environments, so as to overcome mismatches in vocabulary, data formats, coordinate reference systems and so on. Successful researcher collaboration also relies on interoperability of the people! Smaller, synchronous and face-to-face settings for researchers are known to enhance people interoperability. However, changing settings, whether geographically, temporally, or by increasing the team size, diversity, and expertise, requires people-computer-people-computer (...) interoperability. To date, knowledge representation frameworks have been proposed but not proven as necessary and sufficient to achieve multi-way interoperability. In this contribution, we address the epistemology and sociology of science, advocating for a fluid perspective where science is mostly a social construct, conditioned by cognitive issues, especially cognitive bias. Bias cannot be obliterated; on the contrary, it must be carefully taken into consideration. Information-centric interfaces built from different perspectives and ways of thinking by actors with different points of view, approaches and aims are proposed as a means for enhancing people interoperability in computer-based settings. The contribution will provide details on the approach of augmenting and interfacing knowledge representation frameworks with the cognitive-conceptual frameworks for people that are needed to meet and exceed collaborative research goals in the 21st

  19. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Science.gov (United States)

    2011-10-25

    ... National Institute of Standards and Technology NIST Framework and Roadmap for Smart Grid Interoperability... and Technology (NIST) seeks comments on the draft NIST Framework and Roadmap for Smart Grid..., 2011. The entire draft version of the NIST Framework and Roadmap for Smart Grid Interoperability...

  20. Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI

    NARCIS (Netherlands)

    Miao, Yongwu; Boon, Jo; Van der Klink, Marcel; Sloep, Peter; Koper, Rob

    2009-01-01

    Miao, Y., Boon, J., Van der Klink, M., Sloep, P. B., & Koper, R. (2011). Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI. In F. Lazarinis, S. Green, & E. Pearson (Eds.), E-Learning Standards and Interoperability: Frameworks

  1. Collaborative Solution Architecture for Developing a National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2010-01-01

    Full Text Available An interoperability framework is a set of standards and guidelines that describes how organizations have agreed, or will agree, to interact. The framework is not static, but one that adapts to changes in standards, administrative requirements and technology. It can be adapted to socio-economic, political, cultural, linguistic, historical and geographical purposes and to a specific context or situation. The article aims to clarify the essential concepts necessary for outlining the Romanian national interoperability framework and to propose a collaborative solution architecture for its development, updating and maintenance.

  2. C3I and Modelling and Simulation (M&S) Interoperability

    Science.gov (United States)

    2004-03-01

    applications within the RNLA [2]. The C3I Framework uses commercial off-the-shelf publish/subscribe services (Tibco Rendezvous) and a tailored information...support interoperability within their own domain. The C2WS system uses the 'C3I Framework' middleware, which is based on Tibco Rendezvous. The...simulation systems use the HLA interoperability standard. We have developed a 'Tibco-HLA gateway' to connect TIB/RV on one side to HLA on the other side (see

  3. Use of Annotations for Component and Framework Interoperability

    Science.gov (United States)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
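
    OMS 3.0 components are annotated in Java; a Python analogue of the same idea is sketched below, with decorators attaching input/output metadata to a component so that a framework could wire dataflow and generate documentation from it. The decorator and component names are invented for this illustration, not OMS identifiers.

        IN_META, OUT_META = {}, {}

        def in_var(name, units):
            def deco(cls):
                IN_META.setdefault(cls.__name__, {})[name] = units
                return cls
            return deco

        def out_var(name, units):
            def deco(cls):
                OUT_META.setdefault(cls.__name__, {})[name] = units
                return cls
            return deco

        @in_var("precip", "mm/day")
        @in_var("temp", "degC")
        @out_var("runoff", "mm/day")
        class RunoffComponent:
            def execute(self, precip, temp):
                return {"runoff": max(0.0, 0.3 * precip - 0.1 * temp)}  # toy model

        # A framework can introspect the metadata for wiring and documentation:
        print("inputs:", IN_META["RunoffComponent"])
        print("outputs:", OUT_META["RunoffComponent"])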

  4. Towards Data Repository Interoperability: The Data Conservancy Data Packaging Specification

    Science.gov (United States)

    DiLauro, T.; Duerr, R.; Thessen, A. E.; Rippin, M.; Pralle, B.; Choudhury, G. S.

    2013-12-01

    description, the DCS instance will be able to provide default mappings for the directories and files within the package payload and enable support for deposited content at a lower level of service. Internally, the DCS will map these hybrid package serializations to its own internal business objects and their properties. Thus, this approach is highly extensible, as other packaging formats could be mapped in a similar manner. In addition, this scheme supports establishing the fixity of the payload while still supporting update of the semantic overlay data. This allows a data producer with scarce resources or an archivist who acquires a researcher's data to package the data for deposit with the intention of augmenting the resource description in the future. The Data Conservancy is partnering with the Sustainable Environment Actionable Data[4] project to test the interoperability of this new packaging mechanism. [1] Data Conservancy: http://dataconservancy.org/ [2] BagIt: https://datatracker.ietf.org/doc/draft-kunze-bagit/ [3] OAI-ORE: http://www.openarchives.org/ore/1.0/ [4] SEAD: http://sead-data.net/
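
    Since the packaging builds on BagIt [2], the deposit side can be sketched with the Library of Congress bagit package: the payload directory gains manifests that fix its checksums, while overlay description (e.g., an ORE resource map [3]) can be added later without disturbing payload fixity. The directory name and metadata below are placeholders.

        import bagit

        bag = bagit.make_bag("my_dataset_dir",               # payload directory
                             {"Contact-Name": "Data Producer"},
                             checksums=["sha256"])           # fixity at deposit time
        print(bag.is_valid())                                # manifests vs. payload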

  5. A versatile and interoperable network sensors for water resources monitoring

    Science.gov (United States)

    Ortolani, Alberto; Brandini, Carlo; Costantini, Roberto; Costanza, Letizia; Innocenti, Lucia; Sabatini, Francesco; Gozzini, Bernardo

    2010-05-01

    Monitoring systems to assess water resource quantity and quality require extensive use of in-situ measurements, which have great limitations, such as difficulty in accessing and sharing data and in customising and easily reconfiguring the sensor network to fulfil end-user needs during monitoring or crisis phases. In order to address such limitations, Sensor Web Enablement technologies for sensor management have been developed and applied to different environmental contexts under the EU-funded OSIRIS project (Open architecture for Smart and Interoperable networks in Risk management based on In-situ Sensors, www.osiris-fp6.eu). The main objective of OSIRIS was to create a monitoring system to manage different environmental crisis situations, through an efficient data processing chain where in-situ sensors are connected via an intelligent and versatile network infrastructure (based on web technologies) that enables end-users to remotely access multi-domain sensor information. Among the project applications, one focused on underground fresh-water monitoring and management. With this aim, a monitoring system to continuously and automatically check water quality and quantity was designed and built in a pilot test, identified as a portion of the Amiata aquifer feeding the Santa Fiora springs (Grosseto, Italy). This aquifer presents some characteristics that make it highly vulnerable under certain conditions. It is a volcanic aquifer with a fractured structure. The volcanic nature of the Santa Fiora aquifer causes arsenic concentrations that are normally very close to the threshold stated by law, but that sometimes exceed this threshold for reasons still not fully understood. The presence of fractures makes the infiltration rate very inhomogeneous from place to place and very high in correspondence with big fractures. In case of liquid-pollutant spills (typically hydrocarbon spills from tanker accidents or leakage from house tanks containing fuel for heating), these fractures can act

  6. An ontology for regulating eHealth interoperability in developing African countries

    CSIR Research Space (South Africa)

    Moodley, D

    2013-08-01

    Full Text Available eHealth governance and regulation are necessary in low resource African countries to ensure effective and equitable use of health information technology and to realize national eHealth goals such as interoperability, adoption of standards and data...

  7. US Army and US Marine Corps Interoperability: A Bottom-up Series of Experiments

    National Research Council Canada - National Science Library

    Lynch, Rick

    2000-01-01

    In 1999-2000, an ad hoc Interoperability Team, composed of individuals from two Services and Joint organizations, was formed to help develop and coordinate a series of joint experiments between the U.S. Army and the U.S...

  8. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2017-10-01

    reflect our vision of progressing medical device interoperability standards, whether specifically ICE-related or more generally applicable, and...White Coat Notes,” the Boston Globe online, June 2007. http://www.boston.com/yourlife/health/blog/2007/06/getting_medical_1.html 6. Carr S, “Plug and

  9. An Approach to Semantic Interoperability for Improved Capability Exchanges in Federations of Systems

    Science.gov (United States)

    Moschoglou, Georgios

    2013-01-01

    This study seeks an affirmative answer to the question whether a knowledge-based approach to system of systems interoperation using semantic web standards and technologies can provide the centralized control of the capability for exchanging data and services lacking in a federation of systems. Given the need to collect and share real-time…

  10. A Proposed Engineering Process and Prototype Toolset for Developing C2-to-Simulation Interoperability Solutions

    NARCIS (Netherlands)

    Gautreau, B.; Khimeche, L.; Reus, N.M. de; Heffner, K.; Mevassvik, O.M.

    2014-01-01

    The Coalition Battle Management Language (C-BML) is an open standard being developed for the exchange of digitized military information among command and control (C2), simulation and autonomous systems by the Simulation Interoperability Standards Organization (SISO). As the first phase of the C-BML

  11. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  12. Integrating IMS Learning Design and IMS Question and Test Interoperability using CopperCore Service Integration

    NARCIS (Netherlands)

    Vogten, Hubert; Martens, Harrie; Nadolski, Rob; Tattersall, Colin; Van Rosmalen, Peter; Koper, Rob

    2006-01-01

    Please, cite this publication as: Vogten, H., Martens, H., Nadolski, R., Tattersall, C., van Rosmalen, P., & Koper, R. (2006). Integrating IMS Learning Design and IMS Question and Test Interoperability using CopperCore Service Integration. Proceedings of International Workshop in Learning Networks

  13. CopperCore Service Integration, Integrating IMS Learning Design and IMS Question and Test Interoperability

    NARCIS (Netherlands)

    Vogten, Hubert; Martens, Harrie; Nadolski, Rob; Tattersall, Colin; Van Rosmalen, Peter; Koper, Rob

    2006-01-01

    Vogten, H., Martens, H., Nadolski, R., Tattersall, C., Rosmalen, van, P., Koper, R., (2006). CopperCore Service Integration, Integrating IMS Learning Design and IMS Question and Test Interoperability. Proceedings of the 6th IEEE International Conference on Advanced Learning Technologies (pp.

  14. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  15. Educational Modelling Language: modelling reusable, interoperable, rich and personalised units of learning

    NARCIS (Netherlands)

    Koper, Rob; Manderveld, Jocelyn

    2003-01-01

    Published:
    Koper, E. J. R., & Manderveld, J. M. (2004). Educational modelling language: modelling reusable, interoperable, rich and personalised units of learning. British Journal of Educational Technology, 35(5), 537-552.
    Please refer to the printed version of the article. Rob Koper and

  16. Globalization of the International Arms Industry: A Step Towards ABCA and NATO Interoperability?

    Science.gov (United States)

    2009-04-01

    industry should pay dividends in the technical interoperability of military forces. U.S. Congress, Office of Technology Assessment, Global Arms...many other high technology industries established that the development of integrated global linkages and operations was crucial to survival. What...contraction, technological advances, compositional integration and continuity of first-tier producers within the global arms market is forcing these

  17. Preface - Enterprise Interoperability: Proceedings of the Workshops of the Third International IFIP Working Conference IWEI 2011

    NARCIS (Netherlands)

    Johnson, Pontus; Zelm, Martin; van Sinderen, Marten J.; Doumeingts, Guy; Johnson, Pontus

    2011-01-01

    This book contains the short papers of the Third International IFIP Working Conference on Enterprise Interoperability, IWEI 2011, held March 22-23, 2011, in Stockholm, Sweden, and the papers of the co-located IWEI Workshops, held on March 21, 2011. The IWEI Working Conference highlighted

  18. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    Science.gov (United States)

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities, or to resolve implementation difficulties.

  19. Interoperability Issues for Formal Authoring Processes, Community Efforts, and the Creation of Mashup PLE

    NARCIS (Netherlands)

    Klemke, Roland; Schmitz, Birgit

    2009-01-01

    Klemke, R., & Schmitz, B. (2009). Interoperability Issues for Formal Authoring Processes, Community Efforts, and the Creation of Mashup PLE. In F. Wild, M. Kalz, M. Palmér & D. Müller (Eds.), Proceedings of 2nd Workshop Mash-Up Personal Learning Environments (MUPPLE'09). Workshop in conjunction with

  20. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    Science.gov (United States)

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

    The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims at presenting the challenges of building quality-labeled clinical archetypes and the challenges of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.

  1. Enterprise Interoperability - Proceedings of the Workshops of the Third International IFIP Working Conference IWEI 2011

    NARCIS (Netherlands)

    Zelm, Martin; van Sinderen, Marten J.; Doumeingts, Guy; Johnson, Pontus

    This book contains the short papers of the Third International IFIP Working Conference on Enterprise Interoperability, IWEI 2011, held March 22-23, 2011, in Stockholm, Sweden, and the papers of the co-located IWEI Workshops, held on March 21, 2011. The IWEI Working Conference highlighted

  2. REAL TIME SEMANTIC INTEROPERABILITY IN AD HOC NETWORKS OF GEOSPATIAL DATA SOURCES: CHALLENGES, ACHIEVEMENTS AND PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    M. A. Mostafavi

    2012-07-01

    Full Text Available Recent advances in geospatial technologies have made available large amounts of geospatial data. Meanwhile, new developments in Internet and communication technologies have created a shift from isolated geospatial databases to ad hoc networks of geospatial data sources, where data sources can join or leave the network and form groups to share data and services. However, effective integration and sharing of geospatial data among these data sources and their users are hampered by semantic heterogeneities. These heterogeneities affect the spatial, temporal and thematic aspects of geospatial concepts. There have been many efforts to address semantic interoperability issues in the geospatial domain. These efforts were mainly focused on resolving heterogeneities caused by different and implicit representations of the concepts. However, many approaches have focused on the thematic aspects, leaving aside the explicit representation of spatial and temporal aspects. Also, most semantic interoperability approaches for networks have focused on automating the semantic mapping process. However, the ad hoc network structure is continuously modified by source addition or removal, formation of groups, etc. This dynamic aspect is often neglected in those approaches. This paper proposes a conceptual framework for real-time semantic interoperability in ad hoc networks of geospatial data sources. The conceptual framework presents the fundamental elements of real-time semantic interoperability through a hierarchy of interrelated semantic states and processes. We then use the conceptual framework to frame the discussion of the achievements that have already been made, the challenges that remain to be addressed, and the perspectives with respect to these challenges.

  3. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities, or to resolve implementation difficulties.

  4. A Review of Interoperability Standards in E-health and Imperatives for their Adoption in Africa

    Directory of Open Access Journals (Sweden)

    Funmi Adebesin

    2013-07-01

    Full Text Available The ability of healthcare information systems to share and exchange information (interoperate) is essential to facilitate the quality and effectiveness of healthcare services. Although standardization is considered key to addressing the fragmentation currently challenging the healthcare environment, e-health standardization can be difficult for many reasons, one of which is making sense of the e-health interoperability standards landscape. Specifically aimed at the African health informatics community, this paper aims to provide an overview of e-health interoperability and the significance of standardization in its achievement. We conducted a literature study of e-health standards, their development, and the degree of participation by African countries in the process. We also provide a review of a selection of prominent e-health interoperability standards that have been widely adopted especially by developed countries, look at some of the factors that affect their adoption in Africa, and provide an overview of ongoing global initiatives to address the identified barriers. Although the paper is specifically aimed at the African community, its findings would be equally applicable to many other developing countries.

  5. Chief Information Officer's Role in Adopting an Interoperable Electronic Health Record System for Medical Data Exchange

    Science.gov (United States)

    Akpabio, Akpabio Enebong Ema

    2013-01-01

    Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrators' perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…

  6. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    Science.gov (United States)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surfaces tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them interoperable altogether is starting, including automated workflows to process related data from different sources.
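
    Interoperable access of the VESPA/IVOA kind typically means standard protocols such as TAP queried with ADQL. The sketch below uses the pyvo package against a placeholder service URL; EPN-TAP services expose tables named *.epn_core with columns such as granule_uid and dataproduct_type.

        import pyvo

        service = pyvo.dal.TAPService("http://vo.example.org/tap")  # placeholder URL
        results = service.search(
            "SELECT TOP 5 granule_uid, dataproduct_type FROM mydb.epn_core"
        )
        for row in results:
            print(row["granule_uid"], row["dataproduct_type"])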

  7. Interoperable web applications for sharing data and products of the International DORIS Service

    Science.gov (United States)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth rotation and Reference systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with the users. Metadata and interoperable web applications are proposed to explore, visualize and download the key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their website. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables or to get information about the tracking ground stations and the tracked satellites. We discuss the future plans for IDS to meet the recommendations of GGOS. The presentation also addresses the needs for the IAG Services to adopt common metadata thesaurus to describe data and products, and interoperability standards to share them.

  8. Proceedings of the 1st Interoperability of Enterprise Software and Applications conference

    NARCIS (Netherlands)

    Konstantas, D.; Bourrieres, J-P.; Leonard, M.; Boudjlida, N.; Unknown, [Unknown

    2005-01-01

    Interoperability, the ability of a system or a product to work with other systems or products without special effort from the user, is a key issue in manufacturing and in industrial enterprise generally. It is fundamental to producing goods and services both quickly and at low cost.

  9. Radio Interoperability: Addressing the Real Reasons We Don’t Communicate Well During Emergencies

    Science.gov (United States)

    2006-03-01

    Communications, First Responder Communications, Intergovernmental Relations, Procedures and Training... recognition of the need for improved human interoperability... Y2K refers to millennial change of... reflexive reaction than a conscious contemplation of a range of options to be acted upon at a later date. This is sometimes referred to as intuitive

  10. Developing data interoperability using standards: A wheat community use case [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Esther Dzale Yeumo

    2017-12-01

    Full Text Available In this article, we present a joint effort of the wheat research community, along with data and ontology experts, to develop wheat data interoperability guidelines. Interoperability is the ability of two or more systems and devices to cooperate and exchange data, and interpret that shared information. Interoperability is a growing concern to the wheat scientific community, and agriculture in general, as the need to interpret the deluge of data obtained through high-throughput technologies grows. Agreeing on common data formats, metadata, and vocabulary standards is an important step to obtain the required data interoperability level in order to add value by encouraging data sharing, and subsequently facilitate the extraction of new information from existing and new datasets. During a period of more than 18 months, the RDA Wheat Data Interoperability Working Group (WDI-WG) surveyed the wheat research community about the use of data standards, then discussed and selected a set of recommendations based on consensual criteria. The recommendations promote standards for data types identified by the wheat research community as the most important for the coming years: nucleotide sequence variants, genome annotations, phenotypes, germplasm data, gene expression experiments, and physical maps. For each of these data types, the guidelines recommend best practices in terms of use of data formats, metadata standards and ontologies. In addition to the best practices, the guidelines provide examples of tools and implementations that are likely to facilitate the adoption of the recommendations. To maximize the adoption of the recommendations, the WDI-WG used a community-driven approach that involved the wheat research community from the start, took into account their needs and practices, and provided them with a framework to keep the recommendations up to date. We also report this approach's potential to be generalizable to other (agricultural) domains.

  11. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Rasi, Teemu; Mattila, Jouni [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Siuko, Mikko [VTT, Technical Research Centre of Finland, Tampere (Finland); Esque, Salvador [F4E, Fusion for Energy, Torres Diagonal Litoral B3, Josep Pla2, 08019, Barcelona (Spain); Hamilton, David [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.
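
    The record does not include code, but the topic-based, broker-less decoupling that DDS provides can be illustrated with a small stand-in. The sketch below is plain Python, not the OMG DDS API; QoS is reduced to two illustrative policies (reliability and history depth), and all names are invented.

    ```python
    # Minimal, illustrative stand-in for DDS-style topic decoupling with
    # per-topic QoS metadata. Real systems would use an implementation such
    # as OpenDDS or a vendor library.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class QoS:
        reliable: bool = True      # prefer delivery guarantees over latency
        history_depth: int = 1     # samples replayed to late-joining readers

    @dataclass
    class Topic:
        name: str
        qos: QoS
        history: List[object] = field(default_factory=list)
        readers: List[Callable[[object], None]] = field(default_factory=list)

    class Bus:
        """Broker-less registry: writers and readers meet only via topic names."""
        def __init__(self) -> None:
            self.topics: Dict[str, Topic] = {}

        def topic(self, name: str, qos: QoS) -> Topic:
            return self.topics.setdefault(name, Topic(name, qos))

        def publish(self, name: str, sample: object) -> None:
            t = self.topics[name]
            t.history = (t.history + [sample])[-t.qos.history_depth:]
            for reader in t.readers:
                reader(sample)

        def subscribe(self, name: str, reader: Callable[[object], None]) -> None:
            t = self.topics[name]
            t.readers.append(reader)
            for sample in t.history:   # replay history, as DDS durability would
                reader(sample)

    bus = Bus()
    bus.topic("manipulator/joint_state", QoS(reliable=False, history_depth=5))
    bus.publish("manipulator/joint_state", {"joint1": 0.42})
    bus.subscribe("manipulator/joint_state", lambda s: print("received", s))
    ```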

  12. Definition and implementation of a SAML-XACML profile for authorization interoperability across grid middleware in OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele; Alderman, Ian; Altunay, Mine; Anathakrishnan, Rachana; Bester, Joe; Chadwick, Keith; Ciaschini, Vincenzo; Demchenko, Yuri; Ferraro, Andrea; Forti, Alberto; Groep, David; /Fermilab /NIKHEF, Amsterdam /Brookhaven /Amsterdam U. /SWITCH, Zurich /Bergen U. /INFN, CNAF /Argonne /Wisconsin U., Madison

    2009-04-01

    In order to ensure interoperability between middleware and authorization infrastructures used in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects, an Authorization Interoperability activity was initiated in 2006. The interoperability goal was met in two phases: first, agreeing on a common authorization query interface and protocol with an associated profile that ensures standardized use of attributes and obligations; and second, implementing, testing, and deploying, on OSG and EGEE, middleware that supports the interoperability protocol and profile. The activity has involved people from OSG, EGEE, the Globus Toolkit project, and the Condor project. This paper presents a summary of the agreed-upon protocol, profile and the software components involved.
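
    As a rough illustration of the query pattern the profile standardizes, the sketch below models an authorization request carrying subject attributes, and a policy decision point returning a decision plus obligations. All names and attributes are invented; the real profile is expressed in SAML-XACML messages, not Python.

    ```python
    # Illustrative sketch of the attribute/obligation pattern behind a
    # SAML-XACML authorization query; not the actual OSG/EGEE profile.
    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    @dataclass
    class Request:
        subject: Dict[str, str]   # e.g. DN and VO attributes from a proxy cert
        resource: str
        action: str

    def pdp_decide(req: Request) -> Tuple[str, List[Dict[str, str]]]:
        """Policy decision point: returns (decision, obligations)."""
        if req.subject.get("vo") == "cms" and req.action == "execute-now":
            # Obligation: the enforcement point must map the grid identity
            # to a local account before running the job.
            return "Permit", [{"id": "map-to-local-account", "username": "cms001"}]
        return "Deny", []

    decision, obligations = pdp_decide(
        Request(subject={"dn": "/DC=org/CN=alice", "vo": "cms"},
                resource="ce.example.org", action="execute-now"))
    print(decision, obligations)
    ```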

  13. A Network-Centric Enterprise Service for Mediation and Interoperability: The Dynamic Operational Object Registration Service (DOORS)

    National Research Council Canada - National Science Library

    Bollers, Jonathan C

    2004-01-01

    .... To achieve information superiority while engaged in such operations, commanders must transform component C2/C4I system data into interoperable information and shared knowledge, making the result...

  14. An architecture and reference implementation of an open health information mediator: enabling interoperability in the Rwandan health information exchange

    CSIR Research Space (South Africa)

    Crichton, R

    2013-06-01

    Full Text Available Foundations of Health Information Engineering and Systems (FHIES 2012), June 2013, Volume 7789, pp. 87-104. An Architecture and Reference Implementation of an Open Health Information Mediator: Enabling Interoperability in the Rwandan Health Information...

  15. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Science.gov (United States)

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  16. Common business objects: Demonstrating interoperability in the oil and gas industry

    International Nuclear Information System (INIS)

    McLellan, S.G.; Abusalbi, N.; Brown, J.; Quinlivan, W.F.

    1997-01-01

    The PetroTechnical Open Software Corp. (POSC) was organized in 1990 to define technical methods to make it easier to design interoperable data solutions for oil and gas companies. When POSC rolls out seed implementations, oilfield service members must validate them, correct any errors or ambiguities, and champion these corrections into the original specifications before full integration into POSC-compliant, commercial products. Organizations like POSC are assuming a new role of promoting formation of projects where E and P companies and vendors jointly test their pieces of the migration puzzle on small subsets of the whole problem. The authors describe three such joint projects. While confirming the value of such open cross-company cooperation, these cases also help to redefine interoperability in terms of business objects that will be common across oilfield companies, their applications, access software, data, or data stores

  17. NASA and Industry Benefits of ACTS High Speed Network Interoperability Experiments

    Science.gov (United States)

    Zernic, M. J.; Beering, D. R.; Brooks, D. E.

    2000-01-01

    This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context to NASA missions, as well as to the broader communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost effective communications for space operations.

  18. Enabling Technologies for Smart Grid Integration and Interoperability of Electric Vehicles

    DEFF Research Database (Denmark)

    Martinenas, Sergejus

    Conventional, centralized power plants are being replaced by intermittent, distributed renewable energy sources, thus raising the concern about the stability of the power grid in its current state. All the while, electrification of all forms of transportation is increasing the load on the transfo... ...of interoperability in the field of e-mobility, investigated in the COTEVOS project, is explored. It is concluded that collective testing of the OEM equipment in testing symposiums is the best way to ensure interoperability between different OEMs, and to discuss as well as fix the issues in the standard itself...

  19. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, like the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
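
    A minimal sketch of the dual-model idea applied to FHIR data follows: archetype-like node paths are bound to extraction rules over a FHIR resource. The paths and bindings are invented for illustration and do not reproduce the authors' tooling.

    ```python
    # Toy dual-model-style transformation: pulling data instances out of a
    # FHIR resource into a flat, archetype-like structure.
    fhir_patient = {
        "resourceType": "Patient",
        "name": [{"family": "Garcia", "given": ["Ana"]}],
        "birthDate": "1980-04-02",
    }

    # Archetype-style bindings: target node -> extraction rule over FHIR.
    bindings = {
        "person/name/family": lambda r: r["name"][0]["family"],
        "person/name/given": lambda r: r["name"][0]["given"][0],
        "person/birth_date": lambda r: r["birthDate"],
    }

    archetype_instance = {node: extract(fhir_patient)
                          for node, extract in bindings.items()}
    print(archetype_instance)
    # {'person/name/family': 'Garcia', 'person/name/given': 'Ana',
    #  'person/birth_date': '1980-04-02'}
    ```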

  20. Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling

    Science.gov (United States)

    Tommasi, C.; Achille, C.

    2017-02-01

    Nowadays, the adoption of BIM processes in the AEC (Architecture, Engineering and Construction) industry means being oriented towards synergistic workflows, based on informative instruments capable of realizing a virtual model of the building. The aim of this article is to address the issue of interoperability, approaching the subject through a theoretical part and a practical example, in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where difficulties can arise in both the modelling and sharing phases because of the complexity of shapes and elements. Focusing on the interoperability between different software packages, the questions are: What kinds of information, and how much, can be shared? Given that this process also leads to a standardization of the modelled parts, is there a risk of accuracy loss?

  1. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    End-to-end interoperability and workflows from building architecture design to one or more simulations may, in one aspect, comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
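
    One step described above, deriving a table schema from the data model via a data definition language, can be sketched as follows. The entities and columns are invented examples, not the patented platform's actual model.

    ```python
    # Sketch: generating SQL DDL from a small data model of entities and
    # relationships, in the spirit of the record's "data definition language".
    entities = {
        "building": {"id": "INTEGER PRIMARY KEY", "name": "TEXT",
                     "gross_area_m2": "REAL"},
        "simulation_run": {"id": "INTEGER PRIMARY KEY",
                           "building_id": "INTEGER REFERENCES building(id)",
                           "tool": "TEXT", "started_at": "TEXT"},
    }

    def to_ddl(model: dict) -> str:
        stmts = []
        for table, cols in model.items():
            col_defs = ",\n  ".join(f"{name} {sqltype}"
                                    for name, sqltype in cols.items())
            stmts.append(f"CREATE TABLE {table} (\n  {col_defs}\n);")
        return "\n\n".join(stmts)

    print(to_ddl(entities))
    ```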

  2. Semantic Interoperability in Czech Healthcare Environment Supported by HL7 Version 3

    Czech Academy of Sciences Publication Activity Database

    Nagy, Miroslav; Hanzlíček, Petr; Přečková, Petra; Říha, Antonín; Dioszegi, Matěj; Seidl, Libor; Zvárová, Jana

    2010-01-01

    Roč. 49, č. 2 (2010), s. 186-195 ISSN 0026-1270 R&D Projects: GA MŠk(CZ) 1M06014; GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : information storage and retrieval * electronic health record * HL7 * semantic interoperability * communication standards Subject RIV: IN - Informatics, Computer Science Impact factor: 1.472, year: 2010

  3. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  4. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2010-10-01

    ...software tools) and should position us to apply eventually for a center grant. We collaborated with the University of Pennsylvania on an NSF... Informatics, Philips Research North America. FiO2 Control in Preterm Infants – A Case for Device Interoperability, Dale Wiggins, Vice President and CTO

  5. Analysis of the proposed Jordan's Emergency Communication Interoperability Plan (JECIP) for disaster response

    OpenAIRE

    Alzaghal, Mohamad H.

    2008-01-01

    Recently, the world has been affected by man-made and natural disasters of a scale not seen before, which underscores the importance of communication for an efficient and rapid response by First Responder Community (FRC) members in the field. The resilience of communication infrastructure is vital for the well-being of any country. It is essential to build a robust and interoperable Information and Communication Technology (ICT) infrastructure before the disaster, which will facilitate patch...

  6. Electronic Health Records: VA and DOD Need to Establish Goals and Metrics for Their Interoperability Efforts

    Science.gov (United States)

    2015-10-27

    systems. What GAO Recommends: In its August 2015 report, GAO recommended that VA and DOD, working with the IPO, establish a time frame for... Veterans Affairs (VA) and Defense (DOD), with guidance from the Interagency Program Office (IPO) tasked with facilitating the departments' efforts to... In accordance with its responsibilities, the IPO issued guidance outlining the technical approach for achieving interoperability between the

  7. Fundamental Data Standards for Science Data System Interoperability and Data Correlation

    Science.gov (United States)

    Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel

    The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promises of correlated data and interoperable systems. However there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly-derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. However, at a more fundamental level, data correlation and system interoperability are dependent on relatively few shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS Data Standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that

  8. U.S. Navy Interoperability with its High-End Allies

    Science.gov (United States)

    2000-10-01

    Frederic Ruiz-Ramon, “Is There an Interoperability Gap,” Seguridad y Communicaciones, Volume 11 (May... significant assets to support operations in this arena. For example, the French Navy will soon be putting a nuclear carrier, Charles de Gaulle, into... are out of step with many of its allied submarine forces. Many allied submarines use HF communications and do not have the hardware for SATCOM, which

  9. Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges

    Science.gov (United States)

    Ryan, B. J.

    2015-12-01

    Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets for a number of society's key environmental issues. It was recognized that having ready access to observations from multiple systems was a prerequisite for both environmental decision-making, as well as economic development. From the very start, it was also recognized that the sheer complexity of the Earth's system cannot be captured by any single observation system, and that a federated, interoperable approach was necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015), and approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations

  10. Autonomous Underwater Vehicle Data Management and Metadata Interoperability for Coastal Ocean Studies

    Science.gov (United States)

    McCann, M. P.; Ryan, J. P.; Chavez, F. P.; Rienecker, E.

    2004-12-01

    Data from over 1000 km of Autonomous Underwater Vehicle (AUV) surveys of Monterey Bay have been collected and cataloged in an ocean observatory data management system. The Monterey Bay Aquarium Research Institute's AUV is equipped with a suite of instruments that include a conductivity, temperature, depth (CTD) instrument, transmissometers, a fluorometer, a nitrate sensor, and an inertial navigation system. Data are logged on the vehicle and, upon completion of a survey, XML descriptions of the data are submitted to the Shore Side Data System (SSDS). Instrument data are then processed on shore to apply calibrations and produce scientifically useful data products. The SSDS employs a data model that tracks data from the instrument that created it through all the consuming processes that generate derived products. SSDS employs OPeNDAP and netCDF to provide data set interoperability at the data level. The core of SSDS is the metadata that is the catalog of these data sets and their relation to all other relevant data. The metadata is managed in a relational database and governed by an Enterprise JavaBeans (EJB) server application. Cross-platform Java applications have been written to manage and visualize these data. A Java Swing application - the Hierarchical Ocean Observatory Visualization and Editing System (HOOVES) - has been developed to provide visualization of data set pedigree and data set variables. Because the SSDS data model is generalized according to "Data Producers" and "Data Containers", many different types of data can be represented in SSDS, allowing for interoperability at a metadata level. Comparisons of appropriate data sets, whether they are from an autonomous underwater vehicle or from a fixed mooring, are easily made using SSDS. The authors will present the SSDS data model and show examples of how the model helps organize data set metadata allowing for data discovery and interoperability. With improved discovery and interoperability the system is helping us

  11. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose: The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods: Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, as evaluated by average discrepancies of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Results: Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed this, revealing significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). Conclusion: The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
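
    The best-fit superimposition step (least-squares alignment of the two operators' datasets) can be sketched with the Kabsch algorithm, assuming known point correspondences; production tools such as PolyWorks align dense meshes, typically with ICP-style matching.

    ```python
    import numpy as np

    def kabsch_align(P, Q):
        """Least-squares rotation/translation mapping point set P onto Q
        (both N x 3 arrays with corresponding rows)."""
        p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
        H = (P - p_mean).T @ (Q - q_mean)        # covariance of centered sets
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = q_mean - R @ p_mean
        return R, t

    # Synthetic check: Q is a rotated + shifted copy of P.
    P = np.random.rand(100, 3)
    R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    Q = P @ R_true.T + np.array([0.1, 0.2, 0.3])
    R, t = kabsch_align(P, Q)
    aligned = P @ R.T + t
    print("mean residual:", np.linalg.norm(aligned - Q, axis=1).mean())  # ~0
    ```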

  12. Building a portable data and information interoperability infrastructure-framework for a standard Taiwan Electronic Medical Record Template.

    Science.gov (United States)

    Jian, Wen-Shan; Hsu, Chien-Yeh; Hao, Te-Hui; Wen, Hsyien-Chia; Hsu, Min-Huei; Lee, Yen-Liang; Li, Yu-Chuan; Chang, Polun

    2007-11-01

    Traditional electronic health record (EHR) data are produced from various hospital information systems. They could not have existed independently of an information system until the advent of XML technology. The interoperability of a healthcare system can be divided into two dimensions: functional interoperability and semantic interoperability. Currently, no single EHR standard exists that provides complete EHR interoperability. In order to establish a national EHR standard, we developed a set of local EHR templates. The Taiwan Electronic Medical Record Template (TMT) is a standard that aims to achieve semantic interoperability in EHR exchanges nationally. The TMT architecture is basically composed of forms, components, sections, and elements. Data are stored in the elements, which can be referenced by code set, data type, and narrative block. The TMT was established with the following requirements in mind: (1) transformable to international standards; (2) having a minimal impact on the existing healthcare system; (3) easy to implement and deploy; and (4) compliant with Taiwan's current laws and regulations. The TMT provides a basis for building a portable, interoperable information infrastructure for EHR exchange in Taiwan.
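
    A toy illustration of the form/section/element layering follows, built with Python's standard xml.etree. The tag and attribute names are invented and are not the actual TMT schema.

    ```python
    # Composing a TMT-like element hierarchy (form -> section -> element);
    # 8480-6 is the LOINC code for systolic blood pressure.
    import xml.etree.ElementTree as ET

    form = ET.Element("form", id="outpatient-note")
    section = ET.SubElement(form, "section", name="vital-signs")
    element = ET.SubElement(section, "element", code="8480-6",
                            codeSystem="LOINC", dataType="PQ")
    element.text = "120"
    element.set("unit", "mm[Hg]")

    print(ET.tostring(form, encoding="unicode"))
    ```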

  13. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    Science.gov (United States)

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.

  14. Breaking barriers to interoperability: assigning spatially and temporally unique identifiers to spaces and buildings.

    Science.gov (United States)

    Pyke, Christopher R; Madan, Isaac

    2013-08-01

    The real estate industry routinely uses specialized information systems for functions, including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers for whole buildings and interior spaces for the purpose of interoperability, data exchange, and integration. This paper describes a systematic process to identify activities occurring at buildings or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
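
    One simple way to realize such identifiers, sketched below under invented naming conventions, is a deterministic UUID derived from location, space, and a validity date, so that independent systems mint the same identifier without central coordination. This illustrates the idea only; it is not the authors' prototype.

    ```python
    import uuid

    # Hypothetical namespace for space identifiers.
    SPACE_NS = uuid.uuid5(uuid.NAMESPACE_URL, "https://example.org/space-ids")

    def space_id(lat: float, lon: float, floor: str, room: str,
                 valid_from: str) -> uuid.UUID:
        # Same inputs always produce the same ID, so two databases can
        # agree on an identifier for a space without a central broker.
        key = f"{lat:.6f}/{lon:.6f}/{floor}/{room}/{valid_from}"
        return uuid.uuid5(SPACE_NS, key)

    print(space_id(38.8977, -77.0365, "02", "210", "2013-01-01"))
    ```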

  15. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations' specific needs. As such, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify their access. Early in the nineties, the development of interoperability of geographic information was undertaken to solve syntactic, structural, and semantic heterogeneities as well as spatial and temporal heterogeneities to facilitate sharing and integration of such data. Recently, we have proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.

  16. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR XML files and one CCR+ XML file were evaluated. In total, 188 metadata items were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
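
    The multilayered validation idea can be caricatured in a few lines: a metadata registry describes both standard and extension fields, and records are checked against it. The field names and rules below are invented examples, not CCR+ content.

    ```python
    # Metadata-driven validation sketch in the spirit of CCR+: the registry
    # entry for each (possibly extended) field drives the checks.
    metadata_registry = {
        "patient.name":      {"type": str, "required": True,  "extension": False},
        "patient.birthDate": {"type": str, "required": True,  "extension": False},
        "patient.bloodType": {"type": str, "required": False, "extension": True},
    }

    def validate(record: dict) -> list:
        errors = []
        for field, meta in metadata_registry.items():
            if field not in record:
                if meta["required"]:
                    errors.append(f"missing required field: {field}")
                continue
            if not isinstance(record[field], meta["type"]):
                errors.append(f"wrong type for {field}")
        return errors

    print(validate({"patient.name": "Kim", "patient.bloodType": "A+"}))
    # ['missing required field: patient.birthDate']
    ```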

  17. The e-MapScholar project—an example of interoperability in GIScience education

    Science.gov (United States)

    Purves, R. S.; Medyckyj-Scott, D. J.; Mackaness, W. A.

    2005-03-01

    The proliferation of the use of digital spatial data in learning and teaching provides a set of opportunities and challenges for the development of e-learning materials suitable for use by a broad spectrum of disciplines in Higher Education. Effective e-learning materials must both provide engaging materials with which the learner can interact and be relevant to the learners' disciplinary and background knowledge. Interoperability aims to allow sharing of data and materials through the use of common agreements and specifications. Shared learning materials can take advantage of interoperable components to provide customisable components, and must consider issues in sharing data across institutional borders. The e-MapScholar project delivers teaching materials related to spatial data, which are customisable with respect to both context and location. Issues in the provision of such interoperable materials are discussed, including suitable levels of granularity of materials, the provision of tools to facilitate customisation and mechanisms to deliver multiple data sets and the metadata issues related to such materials. The examples shown make extensive use of the OpenGIS consortium specifications in the delivery of spatial data.

  18. A Tale of Two Observing Systems: Interoperability in the World of Microsoft Windows

    Science.gov (United States)

    Babin, B. L.; Hu, L.

    2008-12-01

    Louisiana Universities Marine Consortium's (LUMCON) and Dauphin Island Sea Lab's (DISL) Environmental Monitoring Systems provide a unified coastal ocean observing system. These two systems are mirrored to maintain autonomy while offering an integrated data sharing environment. Both systems collect data via Campbell Scientific data loggers, store the data in Microsoft SQL servers, and disseminate the data in real time on the World Wide Web via Microsoft Internet Information Servers and Active Server Pages (ASP). The utilization of Microsoft Windows technologies has presented many challenges to these observing systems as open-source tools for interoperability grow. The current open-source tools often require the installation of additional software. In order to make data available through common standards formats, "home-grown" software has been developed. One example of this is the development of software to generate XML files for transmission to the National Data Buoy Center (NDBC). OOSTethys partners develop, test and implement easy-to-use, open-source, OGC-compliant software, and have created a working prototype of networked, semantically interoperable, real-time data systems. Partnering with OOSTethys, we are developing a cookbook to implement OGC web services. The implementation will be written in ASP, will run in a Microsoft operating system environment, and will serve data via Sensor Observation Services (SOS). This cookbook will give observing systems running Microsoft Windows the tools to easily participate in the Open Geospatial Consortium (OGC) Oceans Interoperability Experiment (OCEANS IE).

  19. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor to utilize immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were in accordance with immunization guidelines and the calculations of next visit times were accurate. Interoperability is rare or nonexistent between IISs. Since inter-state data exchange is rare in the United States, this approach could be a good prototype to achieve interoperability of immunization information.
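
    The synchronous exchange pattern the record evaluates can be sketched as follows: one IIS queries another for an infant's immunization history and merges the results for CDS use. The message fields are invented stand-ins, not the HL7 v3 payloads the authors used.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Immunization:
        vaccine_code: str   # e.g. a CVX-like vaccine code
        date_given: str

    class RegistryService:
        """Stand-in for a SOA endpoint exposing an immunization-history call."""
        def __init__(self, records: dict):
            self._records = records

        def get_immunization_history(self, patient_id: str) -> List[Immunization]:
            return self._records.get(patient_id, [])

    local = RegistryService({"inf-001": [Immunization("HepB", "2014-01-10")]})
    remote = RegistryService({"inf-001": [Immunization("OPV", "2014-02-15")]})

    # Merge both sources; a CDS engine would then compute the next due visit.
    history = {(i.vaccine_code, i.date_given)
               for svc in (local, remote)
               for i in svc.get_immunization_history("inf-001")}
    print(sorted(history))
    ```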

  20. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

    Full Text Available Objective: to show how standards-based approaches for content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but also are of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221 "Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale", which was adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements, based on advanced terminological principles, were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: Content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given the fact that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of capability for multilinguality and multimodality, based on open standards, makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content, as well as the methods, tools and services applied, can be subject to new kinds of certification schemes which also should be based on standards. Conclusions: Content must be totally reliable in some

  1. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    Science.gov (United States)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of

  2. Interoperability Trends in Extravehicular Activity (EVA) Space Operations for the 21st Century

    Science.gov (United States)

    Miller, Gerald E.

    1999-01-01

    No other space operations in the 21st century more comprehensively embody the challenges and dependencies of interoperability than EVA. This discipline is already functioning at an unparalleled level of interagency, inter-organizational and international cooperation. This trend will only increase as space programs endeavor to expand in the face of shrinking budgets. Among the topics examined in this paper are hardware-oriented issues. Differences in design standards among various space participants dictate differences in the EVA tools that must be manufactured, flown and maintained on-orbit. Presently only two types of functional space suits exist in the world. However, three versions of functional airlocks are in operation. Of the three airlocks, only the International Space Station (ISS) Joint Airlock can accommodate both types of suits. Due to functional differences in the suits, completely different operating protocols are required for each. Should additional space suit or airlock designs become available, the complexity will increase. The lessons learned as a result of designing and operating within such a system are explored. This paper also examines the non-hardware challenges presented by interoperability for a discipline that is as uniquely dependent upon the individual as EVA. Operation of space suits (essentially single-person spacecraft) by persons whose native language is not that of the suits' designers is explored. The intricacies of shared mission planning, shared control and shared execution of joint EVAs are explained. For example, once ISS is fully functional, the potential exists for two crewmembers of different nationality to be wearing suits manufactured and controlled by a third nation, while operating within an airlock manufactured and controlled by a fourth nation, in an effort to perform tasks upon hardware belonging to a fifth nation. Everything from training issues, to procedures development and writing, to real-time operations is

  3. Launching an EarthCube Interoperability Workbench for Constructing Workflows and Employing Service Interfaces

    Science.gov (United States)

    Fulker, D. W.; Pearlman, F.; Pearlman, J.; Arctur, D. K.; Signell, R. P.

    2016-12-01

    A major challenge for geoscientists—and a key motivation for the National Science Foundation's EarthCube initiative—is to integrate data across disciplines, as is necessary for complex Earth-system studies such as climate change. The attendant technical and social complexities have led EarthCube participants to devise a system-of-systems architectural concept. Its centerpiece is a (virtual) interoperability workbench, around which a learning community can coalesce, supported in their evolving quests to join data from diverse sources, to synthesize new forms of data depicting Earth phenomena, and to overcome immense obstacles that arise, for example, from mismatched nomenclatures, projections, mesh geometries and spatial-temporal scales. The full architectural concept will require significant time and resources to implement, but this presentation describes a (minimal) starter kit. With a keep-it-simple mantra this workbench starter kit can fulfill the following four objectives: 1) demonstrate the feasibility of an interoperability workbench by mid-2017; 2) showcase scientifically useful examples of cross-domain interoperability, drawn, e.g., from funded EarthCube projects; 3) highlight selected aspects of EarthCube's architectural concept, such as a system of systems (SoS) linked via service interfaces; 4) demonstrate how workflows can be designed and used in a manner that enables sharing, promotes collaboration and fosters learning. The outcome, despite its simplicity, will embody service interfaces sufficient to construct—from extant components—data-integration and data-synthesis workflows involving multiple geoscience domains. Tentatively, the starter kit will build on the Jupyter Notebook web application, augmented with libraries for interfacing current services (at data centers involved in EarthCube's Council of Data Facilities, e.g.) and services developed specifically for EarthCube and spanning most geoscience domains.
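
    For a flavor of the kind of notebook cell such a workbench might contain, the sketch below opens a remote dataset through an OPeNDAP service interface with xarray. The URL points at a public OPeNDAP test dataset and is illustrative; any Hyrax or THREDDS endpoint would work the same way, with variable and dimension names depending on the dataset.

    ```python
    import xarray as xr

    # A public OPeNDAP test dataset (assumed still available).
    url = "http://test.opendap.org/opendap/data/nc/coads_climatology.nc"
    ds = xr.open_dataset(url)      # lazy: only metadata is fetched here
    print(ds.data_vars)            # discover what the service offers
    sst = ds["SST"].isel(TIME=0)   # subsetting triggers a ranged DAP request
    print(float(sst.mean()))       # NaNs (land points) are skipped by default
    ```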

  4. INTEROPERABILITY AND STANDARDISATION IN THE DEPARTMENT OF DEFENCE: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    J. De Waal

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The political changes in South Africa have extended its international obligations by actively involving it in the social wellbeing of troubled African states. Under the auspices of the United Nations, this role is manifested in peacekeeping operations and other standard international practices. The ability of African allied forces to train, exercise, and operate efficiently, effectively, and economically together depends on the interoperability of their operational procedures, doctrine, administration, materiel and technology. This implies that all parties must have the same interpretation of ‘interoperability’. In this study, a conceptual model that explains interoperability and standardisation in terms of a systems hierarchy and the systems engineering process is developed. The study also explores the level of understanding of interoperability in the South African Department of Defence in terms of the levels of standardisation and its relationship to the concepts of systems, systems hierarchy, and systems engineering.

    AFRIKAANSE OPSOMMING: The political changes in South Africa have led to further international obligations being imposed on the country. South Africa, in cooperation with fellow African countries and under the supervision of the United Nations, must become involved in unstable African states through peace operations. The ability to participate jointly in peace training, peace exercises and peace operations in an effective, efficient and economical manner requires compatibility between the parties' operational procedures, doctrine, administration, materiel and technology. This means that all parties must agree on the concept of 'interoperability'. In this study, a conceptual model explaining interoperability and standardisation in terms of the systems hierarchy and the systems engineering process was developed. This study also explored the level of understanding and

  5. Interoperability in healthcare: major challenges in the creation of the enterprise environment

    Science.gov (United States)

    Lindsköld, L.; Wintell, M.; Lundberg, N.

    2009-02-01

    There is today a lack of interoperability in healthcare although the need for it is obvious. A new healthcare enterprise environment has been deployed for secure healthcare interoperability in the Western Region in Sweden (WRS). This paper is an empirical overview of the new enterprise environment supporting regional shared and transparent radiology domain information in the WRS. The enterprise environment comprises 17 radiology departments serving 1.5 million inhabitants, using different RIS and PACS in a joint work-oriented network, and additional cardiology, dentistry and clinical physiology departments. More than 160 terabytes of information are stored in the enterprise repository. Interoperability is developed according to the IHE mission, i.e. applying standards such as Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) to address specific clinical communication needs and support optimal patient care. The entire enterprise environment is implemented and used daily in WRS. The central prerequisites in the development of the enterprise environment in the western region of Sweden were: 1) information harmonization, 2) reuse of standardized messages, e.g. HL7 v2.x and v3.x, 3) development of a holistic information domain including both text and images, and 4) creation of a continuous and dynamic update functionality. The central challenges in this project were: 1) the many different vendors acting in the region and the negotiations with them to apply communication roles/profiles such as HL7 (CDA, CCR), DICOM, and XML, 2) the question of who owns the data, and 3) incomplete technical standards. This study concludes that to create a workflow that runs within an enterprise environment there are a number of central prerequisites and challenges that need to be in place. This calls for negotiations on an international, national and regional level with standardization organizations, vendors, health management and health personnel.

  6. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    Science.gov (United States)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms led to the development of huge, foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially true in a domain as diverse as geosciences, as it is virtually impossible to reach an agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, modular ontology architecture has emerged in recent years, fueled, among others, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules would have improved understandability over large, monolithic ontologies. Semantic interoperability in the aforementioned architecture is not achieved by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture. Building the solution upon semantic technologies such as Linked Data and the Web Ontology

  7. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It gathers all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction) providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency

  8. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Access and Interoperability

    Science.gov (United States)

    Fan, D.; He, B.; Xiao, J.; Li, S.; Li, C.; Cui, C.; Yu, C.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Mi, L.; Wan, W.; Wang, J.

    2015-09-01

    The data access and interoperability module connects observation proposals, data, virtual machines and software. Using the unique identifier of a PI (principal investigator), an email address or an internal ID, data can be collected by the PI's proposals or through the search interfaces, e.g., cone search. Files associated with the search results can easily be transferred to cloud storage, including the storage attached to virtual machines, or to commercial platforms such as Dropbox. Benefiting from the standards of the IVOA (International Virtual Observatory Alliance), VOTable-formatted search results can be sent to various kinds of VO software. Later efforts will integrate more data and connect archives and other astronomical resources.
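
    For illustration, an IVOA Simple Cone Search of the kind mentioned above reduces to an HTTP GET carrying RA, DEC and SR parameters and returning a VOTable. The sketch below (Python; the service URL is a placeholder, not an AstroCloud endpoint) shows the pattern.

        # A hedged sketch of an IVOA Simple Cone Search request; the endpoint is
        # hypothetical, but RA, DEC and SR are the parameters the standard defines.
        import io

        import requests
        from astropy.io.votable import parse_single_table

        SCS_URL = "http://example.org/conesearch"  # placeholder service endpoint

        resp = requests.get(
            SCS_URL, params={"RA": 180.0, "DEC": -30.0, "SR": 0.5}, timeout=30
        )
        resp.raise_for_status()

        # Cone-search responses are VOTable documents; astropy can parse them.
        table = parse_single_table(io.BytesIO(resp.content)).to_table()
        print(table.colnames)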

  9. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    Science.gov (United States)

    Ivancic, William D.

    1998-01-01

    Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) Quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) Observation and discussion of compressed video tests over ATM; 5) Digital video over satellites status; 6) Satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 9) MPEG-2 over ATM over emulated satellites; 10) MPEG-2 transport stream with errors; and 11) a dual decoder test.

  10. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  11. The 'PEARL' Data Warehouse: Initial Challenges Faced with Semantic and Syntactic Interoperability.

    Science.gov (United States)

    Mahmoud, Samhar; Boyd, Andy; Curcin, Vasa; Bache, Richard; Ali, Asad; Miles, Simon; Taweel, Adel; Delaney, Brendan; Macleod, John

    2017-01-01

    Data about patients are available from diverse sources, including those routinely collected as individuals interact with service providers, and those provided directly by individuals through surveys. Linking these data can lead to a more complete picture about the individual, to inform either care decision making or research investigations. However, post-linkage, differences in data recording systems and formats present barriers to achieving these aims. This paper describes an approach to combine linked GP records with study observations, and reports initial challenges related to semantic and syntactic interoperability issues.

  12. Scalability, Interoperability, and Security at the Data Discovery Level: A System Administrator's Perspective

    Science.gov (United States)

    2006-01-01

    The Global Change Master Directory (GCMD) has been one of the best known Earth science and global change data discovery online resources throughout its extended operational history. The growing popularity of the system since its introduction on the World Wide Web in 1994 has created an environment where resolving issues of scalability, security, and interoperability has been critical to providing the best available service to the users and partners of the GCMD. Innovative approaches developed at the GCMD in these areas will be presented with a focus on how they relate to current and future GO-ESSP community needs.

  13. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems. The potential impact of implementing these solutions for

  14. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification

    Science.gov (United States)

    Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and
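
    The integration pattern described here amounts to posting a BioC XML collection to a remote NER endpoint over HTTP and reading the annotated collection back. The sketch below is illustrative only: the endpoint URL and document content are invented, not the actual CTD services.

        # Illustrative sketch of calling a BioC-compliant NER Web service.
        import requests

        NER_SERVICE = "http://example.org/ner/chemicals"  # hypothetical endpoint

        bioc_request = """<?xml version="1.0" encoding="UTF-8"?>
        <collection>
          <source>CTD</source><date>2014-01-01</date><key>example.key</key>
          <document>
            <id>PMID-0000001</id>
            <passage>
              <offset>0</offset>
              <text>Aspirin reduced the hepatic expression of Cyp2b10.</text>
            </passage>
          </document>
        </collection>"""

        resp = requests.post(
            NER_SERVICE,
            data=bioc_request.encode("utf-8"),
            headers={"Content-Type": "application/xml"},
            timeout=60,
        )
        resp.raise_for_status()
        print(resp.text)  # annotated BioC collection returned by the service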

  15. Interoperability of the CDPP tools and databases through the EPN-TAP protocol

    Science.gov (United States)

    Gangloff, M.; Génot, V.; André, N.; Erard, S.; Cecconi, B.; Jourdane, N.; Indurain, M.; Bouchemit, M.; Blelly, P.-L.; Rouillard, A. P.; Marchaudon, A.; Beigbeder, L.; Budnik, E.; Glorian, J.-M.

    2017-09-01

    The French Plasma Physics Data Centre (CDPP, http://cdpp.eu) has distributed and valorized natural plasma data for nearly 20 years. The CDPP has been involved for many years in the development and implementation of interoperability standards like SPASE, IVOA and IPDA. In the frame of the VESPA work package of Europlanet H2020, the CDPP has developed an EPN-TAP compatible server which distributes observational time series from the AMDA database, illumination maps of comet 67P, and simulation results from the IPIM model. An EPN-TAP compatible interface was also added to AMDA, 3DView and PropagationTool.
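
    EPN-TAP builds on the IVOA Table Access Protocol, whose synchronous mode is a plain HTTP request carrying an ADQL query. The sketch below uses a placeholder endpoint and an assumed schema name (EPN-TAP services conventionally expose a <schema>.epn_core table), not the CDPP's actual identifiers.

        # Sketch of a synchronous TAP query, the mechanism underlying EPN-TAP.
        import requests

        TAP_SYNC = "http://example.org/tap/sync"  # hypothetical EPN-TAP endpoint

        params = {
            "REQUEST": "doQuery",
            "LANG": "ADQL",
            "FORMAT": "votable",
            # Table name assumed for illustration; EPN-TAP mandates an epn_core view.
            "QUERY": "SELECT TOP 10 * FROM amda.epn_core WHERE target_name = 'Mars'",
        }
        resp = requests.get(TAP_SYNC, params=params, timeout=60)
        resp.raise_for_status()
        print(resp.text[:500])  # beginning of the returned VOTable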

  16. Inter-operator Variability in Defining Uterine Position Using Three-dimensional Ultrasound Imaging

    DEFF Research Database (Denmark)

    Baker, Mariwan; Jensen, Jørgen Arendt; Behrens, Claus F.

    2013-01-01

    to displacement by applied operator-pressure that mimics an actual GYN patient. The transabdominal scanning was performed using a 3D-US system (Clarity® Model 310C00, Elekta, Montreal, Canada). It consists of a US acquisition-station, workstation, and a 128-element 1D array curved probe. The iterated US-scans were performed in four subsequent sessions (21 US-scans in total) over a period of four weeks to investigate the randomness of the inter-operator variability. An additional US-scan was performed as a reference target volume for the consecutive scans. At first, the phantom was marked with ball bearings

  17. Position paper: cognitive radio networking for multiple sensor network interoperability in mines

    CSIR Research Space (South Africa)

    Kagize, BM

    2008-01-01

    Full Text Available ... These commercially available networks are purported to be self-organizing and self-correcting, though the software behind these networks is proprietary, with the caveat of inter-operability difficulties with other networks [5]. There is a non-proprietary and open...-layer communication. This is done to allow for a smooth technology transition. (Figure 6: The proposed cross-layer model.) Artificial Intelligence is generally accepted as a means to realise full-scale cognitive radio networking. There are several AI...

  18. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: 1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; 2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page, and these RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter; 3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will be used to populate a SPARQL endpoint providing expressive querying over our data files; 4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, as described more fully in a companion poster; and 5) making published Linked Data
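
    As a rough illustration of the vocabulary-mapping tasks listed above (the URIs are placeholders, not the actual R2R or SAMOS identifiers), a local term can be published as a SKOS concept and aligned to an externally served concept with a SKOS mapping property.

        # Minimal sketch: publish a local vocabulary term as RDF and map it to an
        # internationally served concept. All URIs are illustrative.
        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import RDF, SKOS

        LOCAL = Namespace("http://example.org/samos/parameter/")

        g = Graph()
        g.bind("skos", SKOS)

        term = LOCAL["air_temperature"]
        g.add((term, RDF.type, SKOS.Concept))
        g.add((term, SKOS.prefLabel, Literal("air temperature", lang="en")))
        # Align the local term to an externally governed concept (placeholder URI).
        g.add((term, SKOS.exactMatch,
               URIRef("http://vocab.example.org/collection/P07/AIRTEMP")))

        print(g.serialize(format="turtle"))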

  19. Reducing barriers to interoperability through collaborative development of standards for Earth science information systems

    Science.gov (United States)

    Percivall, G. S.; Arctur, D. K.

    2010-12-01

    Increasingly, Earth science research must make effective use of interdisciplinary data sources and processes. Non-interoperability impedes the sharing of data and computing resources. Standards from the Open Geospatial Consortium (OGC) and other organizations are the basis for successfully deploying a seamless, distributed information infrastructure for the geosciences. Collaborative development of the standards has proven effective in reducing barriers to standards adoption. Standards are the basis for the success of the Internet and the World Wide Web. A standard describes a set of rules that have been agreed to in some consensus forum, such as the Internet Engineering Task Force (IETF), the International Organization for Standardization (ISO), or the OGC. As described in The Importance of Going Open, “non-interoperability causes organizations to spend much more than necessary on geospatial information technology development”. In the context of e-Science, the National Science Foundation’s Cyberinfrastructure Council argues that “The use of standards creates economies of scale and scope for developing and deploying common resources, tools, software, and services that enhance the use of cyberinfrastructure in multiple science and engineering communities.” Barriers to adoption include misperceptions and misuse of standards. “Adhering to standards costs more” - typically this statement is made when a research program considers implementing standards as a one-time modification to an existing system; multiple economic studies have shown lower development costs when using standards over the life of a project. “Standards stifle innovation” - yet a key decision in research is which assumptions to treat as fixed and which to challenge, and the long history of standards in research, e.g., SI units, is fundamental to assessing repeatable results by independent researchers. A similar need for common standards exists in the information systems used for Earth

  20. Semantic Gateway as a Service architecture for IoT Interoperability

    OpenAIRE

    Desai, Pratikkumar; Sheth, Amit; Anantharam, Pramod

    2014-01-01

    The Internet of Things (IoT) is set to occupy a substantial component of the future Internet. The IoT connects sensors and devices that record physical observations to applications and services of the Internet. As a successor to technologies such as RFID and Wireless Sensor Networks (WSN), the IoT has stumbled into vertical silos of proprietary systems, providing little or no interoperability with similar systems. As the IoT represents the future state of the Internet, an intelligent and scalable arc...

  1. Bringing Health and Fitness Data Together for Connected Health Care: Mobile Apps as Enablers of Interoperability.

    Science.gov (United States)

    Gay, Valerie; Leijdekkers, Peter

    2015-11-18

    A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in this aggregation since they can tap into both the official and the informal health data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users worldwide use the app to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to exchange data with servers, and the lack of use of standards, such as Health Level Seven, for medical information exchange. By limiting the negative effects of health data silos

  2. tmBioC: improving interoperability of text-mining tools with BioC.

    Science.gov (United States)

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

    The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial effort and time owing to the heterogeneity and variety of data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and it also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modified these tools to enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC-wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%. The tmBioC toolkit
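
    Because BioC's structure (collection, document, passage, annotation) is deliberately minimalistic XML, even the standard library suffices for a toy reader. The sketch below, with invented content, walks such a collection; real pipelines would use a dedicated BioC library instead.

        # Self-contained illustration of reading a BioC collection; the document
        # content is invented for the example.
        import xml.etree.ElementTree as ET

        BIOC_XML = """<collection>
          <source>example</source><date>2014-01-01</date><key>example.key</key>
          <document>
            <id>12345</id>
            <passage>
              <offset>0</offset>
              <text>BRCA1 mutations increase cancer risk.</text>
              <annotation id="T1">
                <infon key="type">Gene</infon>
                <location offset="0" length="5"/>
                <text>BRCA1</text>
              </annotation>
            </passage>
          </document>
        </collection>"""

        root = ET.fromstring(BIOC_XML)
        for document in root.iter("document"):
            doc_id = document.findtext("id")
            for annotation in document.iter("annotation"):
                # The first infon carries the annotation type in this example.
                print(doc_id, annotation.findtext("infon"), annotation.findtext("text"))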

  3. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    Science.gov (United States)

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and

  4. METHODS FOR DESCRIPTION OF EDUCATION AND SCIENTIFIC SERVICES IN INFORMATION AND EDUCATION ON THE BASIS OF INTEROPERABILITY STACK EIF

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Pavlova

    2015-01-01

    Full Text Available The article presents a methodology for describing scientific and educational services in information and education environments on the basis of the EIF (European Interoperability Framework) interoperability stack. It describes the factors used to characterise services at every level of the methodology, the tools used to describe the services, and their content. We also link the description methodology to the life span of the service. The article presents an example of a service description produced according to the methodology, taking into account current educational and professional standards, ITIL recommendations, an OWL-based ontology and a WSDL description.

  5. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    Science.gov (United States)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.

  6. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Directory of Open Access Journals (Sweden)

    Brígida Teixeira

    2018-02-01

    Full Text Available Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm that arises from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems that are capable of dealing with the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that allows interoperability between heterogeneous energy and power simulation systems through the use of ontologies, enabling the simulation of scenarios with a high degree of complexity through the cooperation of the individual capacities of each system. A case study based on real data is presented in order to demonstrate the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid of a real university campus, from the perspective of the network manager and also of its consumers/producers, in a projection for a typical winter day in 2050.

  7. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high-performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  8. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. A feature that has better discriminating ability on images derived from a certain sensor may not adapt to images derived from other sensors, which degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation features, which refers to a feature’s ability to adapt to raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, comprising a first-level feature evaluation based on segmentation error rate and a second-level feature evaluation based on a decision tree. The proposed method is applied to a number of fingerprint databases obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and that features with good evaluation results achieve better segmentation accuracy on images originating from different sensors.

  9. Interoperability at ESA Heliophysics Science Archives: IVOA, HAPI and other implementations

    Science.gov (United States)

    Martinez-Garcia, B.; Cook, J. P.; Perez, H.; Fernandez, M.; De Teodoro, P.; Osuna, P.; Arnaud, M.; Arviset, C.

    2017-12-01

    The data of ESA heliophysics science missions are preserved at the ESAC Science Data Centre (ESDC). The ESDC aims for the long-term preservation of those data, which include missions such as Ulysses, SOHO, Proba-2, Cluster, Double Star and, in the future, Solar Orbiter. Scientists have access to these data through web services, command-line and graphical user interfaces for each of the corresponding science mission archives. The International Virtual Observatory Alliance (IVOA) provides technical standards that allow interoperability among the different systems that implement them. By adopting some IVOA standards, the ESA heliophysics archives are able to share their data with those tools and services that are VO-compatible. Implementations of those standards can be found in the existing archives: the Ulysses Final Archive (UFA) and the SOHO Science Archive (SSA) already make use of the VOTable format definition and the Simple Application Messaging Protocol (SAMP). For re-engineered or new archives, the implementation of services through the Table Access Protocol (TAP) or the Universal Worker Service (UWS) will extend this interoperability. This will be the case for the Proba-2 Science Archive (P2SA) and the Solar Orbiter Archive (SOAR). We present here the IVOA standards already used by the ESA heliophysics archives and the work ongoing.

  10. Usability and Interoperability in Wireless Sensor Networks for Patient Telemonitoring in Chronic Disease Management.

    Science.gov (United States)

    Jiménez-Fernández, Silvia; de Toledo, Paula; del Pozo, Francisco

    2013-12-01

    This paper addresses two key technological barriers to the wider adoption of patient telemonitoring systems for chronic disease management, namely usability and sensor device interoperability. As a great percentage of chronic patients are also elderly patients, the usability of the system has to be adapted to their needs. This paper identifies (from previous research) a set of design criteria to address these challenges, and describes the resulting system, which is based on a wireless sensor network and includes a custom-made interface node that follows the stated usability design criteria. This system has been tested with 22 users (mean age 65) and evaluated with a validated usability questionnaire. Results are good and improve on those of other systems based on TVs or smartphones. Our results suggest that user interfaces alternative to TVs and smartphones could play an important role in the usability of sensor networks for patient monitoring. Regarding interoperability, only very recently has a standard been published (in 2010, the ISO/IEEE 11073 Personal Health Devices standard) that can support the needs of the limited-computational-power environments typical of patient monitoring sensor networks.

  11. Electronic Toll Collection Systems and their Interoperability: The State of Art

    Energy Technology Data Exchange (ETDEWEB)

    Heras Molina, J. de la; Gomez Sanchez, J.; Vassallo Magro, J.M.

    2016-07-01

    The European Electronic Toll Service (EETS) was created in 2004 with the aim of ensuring interoperability among the existing electronic toll collection (ETC) systems in Europe. However, owing to the lack of cooperation between groups of stakeholders, this goal had still not been achieved ten years later. The purpose of this research is to determine the best way to achieve interoperability among the different ETC systems in Europe. Our study reviews the six main ETC systems available worldwide: Automatic Number Plate Recognition (ANPR), Dedicated Short-Range Communications (DSRC), Radio Frequency Identification (RFID), satellite systems (GNSS), the tachograph, and mobile communications tolling systems. The research also provides some insight into different emerging technologies. By focusing on the different operational and strategic aspects offered by each technology, we identify their main strengths, weaknesses, opportunities and threats, and make recommendations to improve the current framework. The research concludes that, given the diversity of advantages and inconveniences offered by each system, the selection of a certain ETC technology should also take into account its potential to overcome the weaknesses of the current ETC framework. In this line, different policy recommendations are proposed to improve the present ETC strategy in the EU. (Author)

  12. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

    This study evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and, in part, a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, and the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  13. Measures for interoperability of phenotypic data: minimum information requirements and formatting

    Directory of Open Access Journals (Sweden)

    Hanna Ćwiek-Kupczyńska

    2016-11-01

    Full Text Available Abstract Background Plant phenotypic data shrouds a wealth of information which, when accurately analysed and linked to other data types, brings to light knowledge about the mechanisms of life. As phenotyping is a field of research comprising manifold, diverse and time-consuming experiments, the findings can be fostered by reusing and combining existing datasets. Their correct interpretation, and thus replicability, comparability and interoperability, is possible provided that the collected observations are equipped with an adequate set of metadata. So far there have been no common standards governing phenotypic data description, which has hampered data exchange and reuse. Results In this paper we propose guidelines for the proper handling of information about plant phenotyping experiments, in terms of both the recommended content of the description and its formatting. We provide a document called “Minimum Information About a Plant Phenotyping Experiment”, which specifies what information about each experiment should be given, and a Phenotyping Configuration for the ISA-Tab format, which makes it possible to organise this information practically within a dataset. We provide examples of ISA-Tab-formatted phenotypic data, and a general description of a few systems where the recommendations have been implemented. Conclusions Acceptance of the rules described in this paper by the plant phenotyping community will help to achieve findable, accessible, interoperable and reusable data.
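
    Since ISA-Tab is a tab-delimited format, a study table can be produced with very little code. The sketch below writes a minimal, illustrative study file; the column selection is indicative only, not the full set the MIAPPE recommendations call for.

        # Minimal sketch of a study table in ISA-Tab's tab-delimited layout.
        import csv

        rows = [
            ["Source Name", "Characteristics[Organism]",
             "Characteristics[Genotype]", "Sample Name"],
            ["plant_001", "Arabidopsis thaliana", "Col-0", "sample_001"],
            ["plant_002", "Arabidopsis thaliana", "Ler-0", "sample_002"],
        ]

        with open("s_phenotyping_study.txt", "w", newline="") as handle:
            csv.writer(handle, delimiter="\t").writerows(rows)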

  14. Harmonising phenomics information for a better interoperability in the rare disease field.

    Science.gov (United States)

    Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana

    2018-02-07

    HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project which started in 2016, funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology) and HPO (the Human Phenotype Ontology), as well as the PhenoTips software for the capture and sharing of structured phenotypic data for RD patients. Our project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together and made available through commonly used software repositories. The project workplan follows three main objectives. The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving diagnostics and the delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view on human pathobiology. Achievements in Year 1. Copyright © 2018. Published by Elsevier Masson SAS.

  15. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims to propose an interoperable solution called the tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759
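
    As a rough sketch of what such a uniform web service interface looks like in practice, a client might read sensing entities and submit a task as follows. The base URL is a placeholder, and the Tasks payload follows the paper's "Extended SensorThings API" idea rather than a finalised interface.

        # Hedged sketch of addressing a SensorThings-style service.
        import requests

        BASE = "http://example.org/SensorThings/v1.0"  # hypothetical endpoint

        # Sensing capability: entities are navigable REST collections.
        things = requests.get(f"{BASE}/Things", timeout=30).json()
        for thing in things.get("value", []):
            print(thing.get("@iot.id"), thing.get("name"))

        # Tasking capability: a uniform task submission for a connected device
        # (payload shape assumed for illustration).
        task = {"taskingParameters": {"switch": "on"}}
        resp = requests.post(f"{BASE}/Tasks", json=task, timeout=30)
        print(resp.status_code)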

  16. A semantic interoperability approach to support integration of gene expression and clinical data in breast cancer.

    Science.gov (United States)

    Alonso-Calvo, Raul; Paraiso-Medina, Sergio; Perez-Rey, David; Alonso-Oset, Enrique; van Stiphout, Ruud; Yu, Sheng; Taylor, Marian; Buffa, Francesca; Fernandez-Lozano, Carlos; Pazos, Alejandro; Maojo, Victor

    2017-08-01

    The introduction of omics data and advances in technologies involved in clinical treatment have led to a broad range of approaches to represent clinical information. Within this context, patient stratification across health institutions due to omic profiling presents a complex scenario for carrying out multi-center clinical trials. This paper presents a standards-based approach to ensure the semantic integration required to facilitate the analysis of clinico-genomic clinical trials. To ensure interoperability across different institutions, we have developed a Semantic Interoperability Layer (SIL) to facilitate homogeneous access to clinical and genetic information, based on different well-established biomedical standards and following Integrating the Healthcare Enterprise (IHE) recommendations. The SIL has shown suitability for integrating biomedical knowledge and technologies to match the latest clinical advances in healthcare and the use of genomic information. This genomic data integration in the SIL has been tested with a diagnostic classifier tool that takes advantage of harmonized multi-center clinico-genomic data for training statistical predictive models. The SIL has been adopted in national and international research initiatives, such as the EURECA-EU research project and the CIMED collaborative Spanish project, where the proposed solution has been applied and evaluated by clinical experts focused on clinico-genomic studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims to propose an interoperable solution called the tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.

  18. Intelligent semantic interoperability: Integrating knowledge, terminology and information models to support stroke care.

    Science.gov (United States)

    Goossen, William T F

    2006-01-01

    Electronic patient record (EPR) systems for the continuity of care for stroke patients are under development. These systems are based on standards for clinical practice, vocabularies, and the HL7 information model. In order to achieve intelligent semantic interoperability, knowledge about evidence-based patient care, vocabulary and information models needs to be integrated. A format was developed in which clinical knowledge, clinical terminology, and standard information models are integrated as a specification for the technical implementation of electronic health systems and electronic messages. This format is verified by clinicians and technicians. The document structure consists of meta-information such as version control and changes, the purpose of the clinical content, evidence from the literature, variables and values, the terminology used, guidelines for application and interpretation, HL7 message models, coding, and technical data specifications. Further, XML message excerpts, archetypes and screen designs are developed from these documents to facilitate implementation. The combination of these aspects in one document creates valuable content for intelligent semantic interoperability by means of the development of messages and systems.

  19. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things

    Directory of Open Access Journals (Sweden)

    Alexander Kotsev

    2016-03-01

    Full Text Available The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.

  20. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to the original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model, at the core of this set, acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR data warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods, which lack this background information.

  1. Inter-operator and inter-device agreement and reliability of the SEM Scanner.

    Science.gov (United States)

    Clendenin, Marta; Jaradeh, Kindah; Shamirian, Anasheh; Rhodes, Shannon L

    2015-02-01

    The SEM Scanner is a medical device designed for use by healthcare providers as part of pressure ulcer prevention programs. The objective of this study was to evaluate the inter-rater and inter-device agreement and reliability of the SEM Scanner. Thirty-one (31) volunteers free of pressure ulcers or broken skin at the sternum, sacrum, and heels were assessed with the SEM Scanner. Each of three operators utilized each of three devices to collect readings from four anatomical sites (sternum, sacrum, left and right heels) on each subject, for a total of 108 readings per subject collected over approximately 30 min. For each combination of operator, device and anatomical site, three SEM readings were collected. Inter-operator and inter-device agreement and reliability were estimated. Over the course of this study, more than 3000 SEM Scanner readings were collected. Agreement between operators was good, with mean differences ranging from -0.01 to 0.11. Inter-operator and inter-device reliability exceeded 0.80 at all anatomical sites assessed. The results of this study demonstrate the high reliability and good agreement of the SEM Scanner across different operators and different devices. Given the limitations of current methods to prevent and detect pressure ulcers, the SEM Scanner shows promise as an objective, reliable tool for assessing the presence or absence of pressure-induced tissue damage such as pressure ulcers. Copyright © 2015 Bruin Biometrics, LLC. Published by Elsevier Ltd. All rights reserved.

  2. Self-describing schemes for interoperable MPEG-7 multimedia content descriptions

    Science.gov (United States)

    Paek, Seungyup; Benitez, Ana B.; Chang, Shih-Fu

    1998-12-01

    In this paper, we present the self-describing schemes for interoperable image/video content descriptions, which are being developed as part of our proposal to the MPEG-7 standard. MPEG-7 aims to standardize content descriptions for multimedia data. The objective of this standard is to facilitate content-focused applications like multimedia searching, filtering, browsing, and summarization. To ensure maximum interoperability and flexibility, our descriptions are defined using the eXtensible Markup Language (XML), developed by the World Wide Web Consortium. We demonstrate the feasibility and efficiency of our self-describing schemes in our MPEG-7 testbed. First, we show how our scheme can accommodate image and video descriptions that are generated by a wide variety of systems. Then, we present two systems being developed that are enabled and enhanced by the proposed approach for multimedia content descriptions. The first system is an intelligent search engine with an associated expressive query interface. The second system is a new version of MetaSEEk, a metasearch system for mediation among multiple search engines for audio-visual information.

  3. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things.

    Science.gov (United States)

    Kotsev, Alexander; Schade, Sven; Craglia, Massimo; Gerboles, Michel; Spinelle, Laurent; Signorini, Marco

    2016-03-18

    The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.

  4. Interoperability Measurement

    Science.gov (United States)

    2008-08-01

    …1999), the Treasury Enterprise Architecture Framework (TEAF) (Department of the Treasury, 2000), the Open Group Architecture Framework (TOGAF) (The… some of their titles (the Zachman framework and TOGAF excepted), each of these frameworks was developed for a specific government agency or… The Open Group. "Welcome to TOGAF Version 8.1.1 Enterprise Edition." The Open Group. 2008. June 25, 2008 <http://www.opengroup.org/togaf>

  5. Publication, discovery and interoperability of Clinical Decision Support Systems: A Linked Data approach.

    Science.gov (United States)

    Marco-Ruiz, Luis; Pedrinaci, Carlos; Maldonado, J A; Panziera, Luca; Chen, Rong; Bellika, J Gustav

    2016-08-01

    The high costs involved in the development of Clinical Decision Support Systems (CDSS) make it necessary to share their functionality across different systems and organizations. Service Oriented Architectures (SOA) have been proposed to allow reusing CDSS by encapsulating them in a Web service. However, strong barriers in sharing CDS functionality are still present as a consequence of lack of expressiveness of services' interfaces. Linked Services are the evolution of the Semantic Web Services paradigm to process Linked Data. They aim to provide semantic descriptions over SOA implementations to overcome the limitations derived from the syntactic nature of Web services technologies. To facilitate the publication, discovery and interoperability of CDS services by evolving them into Linked Services that expose their interfaces as Linked Data. We developed methods and models to enhance CDS SOA as Linked Services that define a rich semantic layer based on machine interpretable ontologies that powers their interoperability and reuse. These ontologies provided unambiguous descriptions of CDS services properties to expose them to the Web of Data. We developed models compliant with Linked Data principles to create a semantic representation of the components that compose CDS services. To evaluate our approach we implemented a set of CDS Linked Services using a Web service definition ontology. The definitions of Web services were linked to the models developed in order to attach unambiguous semantics to the service components. All models were bound to SNOMED-CT and public ontologies (e.g. Dublin Core) in order to count on a lingua franca to explore them. Discovery and analysis of CDS services based on machine interpretable models was performed reasoning over the ontologies built. Linked Services can be used effectively to expose CDS services to the Web of Data by building on current CDS standards. This allows building shared Linked Knowledge Bases to provide machine
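
    A minimal sketch of how a CDS service description could be published as Linked Data with the rdflib library; the service vocabulary and URIs are hypothetical placeholders, and only Dublin Core and the SNOMED CT URI pattern stand for the public vocabularies mentioned above.

    ```python
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, DCTERMS

    # Hypothetical namespace standing in for the paper's service ontologies;
    # DCTERMS (Dublin Core) is a real, public vocabulary.
    SVC = Namespace("http://example.org/cds-service#")
    SCT = Namespace("http://snomed.info/id/")

    g = Graph()
    g.bind("svc", SVC)
    g.bind("dcterms", DCTERMS)

    service = URIRef("http://example.org/services/diabetes-screening")
    g.add((service, RDF.type, SVC.ClinicalDecisionSupportService))
    g.add((service, DCTERMS.title, Literal("Diabetes screening rule")))
    # Bind the service's clinical topic to a terminology concept so machine
    # agents can discover it by meaning rather than by string matching.
    g.add((service, SVC.clinicalTopic, SCT["73211009"]))  # placeholder concept id

    print(g.serialize(format="turtle"))
    ```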

  6. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    Science.gov (United States)

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing the various scales of biodiversity, in particular from the individual and species levels to the ecosystem level, has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks are a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow

  7. GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures

    Science.gov (United States)

    Nativi, Stefano; Santoro, Mattia

    2010-05-01

    In recent years the scientific community has been making great efforts to study the effects of climate change on life on Earth. In this general framework, a key role is played by the impact of climate change on biodiversity. To assess this, several use scenarios require modelling the impact of climatological change on the regional distribution of biodiversity species. Designing and developing interoperability infrastructures that enable scientists to search, discover, access and use multi-disciplinary resources (i.e. datasets, services, models, etc.) is currently one of the main research fields for Earth and Space Science Informatics. This presentation introduces and discusses an interoperability infrastructure which implements the discovery, access, and chaining of loosely-coupled resources in the climatology and biodiversity domains, making it possible to set up and run forecast and processing models. The presented framework was successfully developed and tested in the context of the GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot - Phase 2) Climate Change & Biodiversity thematic Working Group. This interoperability infrastructure comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find and access the services needed for the scenario execution; b) Graphical User Interface (GUI): this component provides user-interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process (i.e. a typical climate change and biodiversity projection scenario); d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components, exposing them through a single CSW standard interface. Federated components
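
    As a sketch of how a client might query the Service Broker's federated catalogue through its CSW interface, the following uses the OWSLib library against a placeholder endpoint; the broker URL and the keyword constraint are assumptions for illustration.

    ```python
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    # Placeholder endpoint standing in for the broker's federated CSW interface.
    csw = CatalogueServiceWeb("http://example.org/broker/csw")

    # Discover climate/biodiversity resources by keyword across the federation.
    query = PropertyIsLike("csw:AnyText", "%biodiversity%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for rec_id, rec in csw.records.items():
        print(rec_id, "-", rec.title)
    ```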

  8. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    Science.gov (United States)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems, from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use sensor web technology to develop monitoring and early warning systems based on real-time, up-to-date air quality information gathered by spatially distributed sensors. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The system presented here uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data, and to disseminate air quality status in real time; interoperability challenges are overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposed an
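
    A minimal sketch of how a client could pull observations from an SWE-conformant Sensor Observation Service using its key-value-pair binding; the endpoint URL, offering and observed-property identifiers are placeholders, not those of the presented system.

    ```python
    import requests

    # Placeholder SOS endpoint and identifiers; a real deployment advertises
    # its own offerings, procedures and observed properties.
    SOS_URL = "http://example.org/sos/service"

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "urn:example:offering:air-quality",
        "observedProperty": "http://example.org/property/NO2",
        "responseFormat": "http://www.opengis.net/om/2.0",
    }

    # Key-value-pair (KVP) binding of the Sensor Observation Service interface.
    response = requests.get(SOS_URL, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:500])  # O&M-encoded observations (XML)
    ```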

  9. AN INTEROPERABLE ARCHITECTURE FOR AIR POLLUTION EARLY WARNING SYSTEM BASED ON SENSOR WEB

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Full Text Available Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems, from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in megacities is to use sensor web technology to develop monitoring and early warning systems based on real-time, up-to-date air quality information gathered by spatially distributed sensors. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The system presented here uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data, and to disseminate air quality status in real time; interoperability challenges are overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research

  10. U.K. MoD Land Open Systems Architecture and coalition interoperability with the U.S.

    Science.gov (United States)

    Pearson, Gavin; Kolodny, Mike

    2013-05-01

    The UK Land Open System Architecture (LOSA) is an open, service-based architecture for systems integration and interoperability in the land environment. It is being developed in order to deliver coherent and agile force elements at readiness to operations. LOSA affects planning, delivery and force generation, and supports Future Force 2020. This paper reviews the objectives of LOSA and the progress made to date, before focusing on an approach to achieve plug-and-play interoperability of ISR assets. This approach has been proposed to the US DoD Coalition Warfare Program Office as a programme to develop a technology solution that achieves the goal of ISR interoperability. The approach leverages the efforts of the UK LOSA and the US Terra Harvest (TH) programs. An open architecture approach is used to enable rapid integration and to allow disparate assets to operate autonomously, collaboratively and coherently; assets share situational awareness and cue other assets when a prescribed set of operational conditions is met. The objective of the interoperability programme is to develop a common lexicon and a coherent approach to collaborative operation and information release.

  11. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object oriented component approach to open, interoperable software development and software reuse.

  12. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.
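
    The record does not state the fusion rule used; a common score-level baseline is min-max normalization of each matcher's output followed by a weighted sum, sketched below with illustrative scores and weights.

    ```python
    def minmax_normalize(score, lo, hi):
        """Map a raw matcher score into [0, 1] given its observed score range."""
        return (score - lo) / (hi - lo)

    def fuse(minutiae_score, ridge_score, w_minutiae=0.6):
        """Weighted-sum fusion of two normalized matcher scores."""
        return w_minutiae * minutiae_score + (1 - w_minutiae) * ridge_score

    # Example: scores from two matchers on the same fingerprint pair, each
    # normalized with that matcher's own score range (values are illustrative).
    m = minmax_normalize(142.0, lo=0.0, hi=200.0)   # minutiae-based matcher
    r = minmax_normalize(0.71, lo=0.0, hi=1.0)      # ridge-based matcher
    fused = fuse(m, r)
    print(f"fused score: {fused:.3f}, accept: {fused > 0.5}")
    ```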

  13. 76 FR 6496 - In the Matter of Certain Liquid Crystal Display Devices and Products Interoperable With the Same...

    Science.gov (United States)

    2011-02-04

    ... From the Federal Register Online via the Government Publishing Office INTERNATIONAL TRADE COMMISSION In the Matter of Certain Liquid Crystal Display Devices and Products Interoperable With the Same... for importation, and the sale within the United States after importation of certain liquid crystal...

  14. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Directory of Open Access Journals (Sweden)

    Yuan Zhou

    2014-02-01

    Full Text Available Background: The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. None of this work to date, however, examines the effect of interoperability quantitatively using discrete event simulation techniques. Objective: To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Methods: Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results: High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion: This paper suggests that the impact of using HIT on the work efficiency of clinical and nonclinical staff varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
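
    A minimal discrete-event sketch in the spirit of the study, using the simpy library: documentation time per visit varies with the EHR mode, and the clinic's finishing time is compared across modes. The task durations and the single-physician clinic are illustrative assumptions, not the study's calibrated model.

    ```python
    import random
    import simpy

    # Illustrative mean documentation times (minutes); not the study's values.
    DOC_TIME = {"paper": 9.0, "ehr": 10.0, "interoperable_ehr": 6.0}

    def visit(env, physician, mode, done):
        """One patient visit: queue for the physician, exam, then documentation."""
        with physician.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 15.0))            # exam
            yield env.timeout(random.expovariate(1 / DOC_TIME[mode]))  # charting
            done.append(env.now)

    def run(mode, n_patients=200, seed=1):
        random.seed(seed)
        env = simpy.Environment()
        physician = simpy.Resource(env, capacity=1)
        done = []
        for _ in range(n_patients):
            env.process(visit(env, physician, mode, done))
        env.run()
        return max(done)  # time at which the last visit completes

    for mode in DOC_TIME:
        print(f"{mode:>18}: clinic day ends at {run(mode):7.1f} min")
    ```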

  15. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Science.gov (United States)

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  16. Solving Interoperability in Translational Health. Perspectives of Students from the International Partnership in Health Informatics Education (IPHIE) 2016 Master Class.

    Science.gov (United States)

    Turner, Anne M; Facelli, Julio C; Jaspers, Monique; Wetter, Thomas; Pfeifer, Daniel; Gatewood, Laël Cranmer; Adam, Terry; Li, Yu-Chuan; Lin, Ming-Chin; Evans, R Scott; Beukenhorst, Anna; van Mens, Hugo Johan Theodoore; Tensen, Esmee; Bock, Christian; Fendrich, Laura; Seitz, Peter; Suleder, Julian; Aldelkhyyel, Ranyah; Bridgeman, Kent; Hu, Zhen; Sattler, Aaron; Guo, Shin-Yi; Mohaimenul, Islam Md Mohaimenul; Anggraini Ningrum, Dina Nur; Tung, Hsin-Ru; Bian, Jiantano; Plasek, Joseph M; Rommel, Casey; Burke, Juandalyn; Sohih, Harkirat

    2017-06-20

    In the summer of 2016 an international group of biomedical and health informatics faculty and graduate students gathered for the 16th meeting of the International Partnership in Health Informatics Education (IPHIE) masterclass at the University of Utah campus in Salt Lake City, Utah. This international biomedical and health informatics workshop was created to share knowledge and explore issues in biomedical health informatics (BHI). The goal of this paper is to summarize the discussions of biomedical and health informatics graduate students who were asked to define interoperability, and make critical observations to gather insight on how to improve biomedical education. Students were assigned to one of four groups and asked to define interoperability and explore potential solutions to current problems of interoperability in health care. We summarize here the student reports on the importance and possible solutions to the "interoperability problem" in biomedical informatics. Reports are provided from each of the four groups of highly qualified graduate students from leading BHI programs in the US, Europe and Asia. International workshops such as IPHIE provide a unique opportunity for graduate student learning and knowledge sharing. BHI faculty are encouraged to incorporate into their curriculum opportunities to exercise and strengthen student critical thinking to prepare our students for solving health informatics problems in the future.

  17. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    Science.gov (United States)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations between disparate pieces of information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and interactions (choreographies) of all participants involved, as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces tight coupling at the data level with a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information

  18. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    Directory of Open Access Journals (Sweden)

    R. Bera

    2015-05-01

    Full Text Available To date, the most common way to deal with geographical information and processes still appears to be the consumption of local resources, i.e. locally stored data processed on a local desktop or server. The maturity and subsequent growing use of OGC standards to exchange data on the World Wide Web, enhanced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in environmental sciences) make use of, and manage, spatial data. A clever use of OGC standards can help scientists to better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it using the Spatial Data Infrastructure (SDI) GéoSAS, which is the SDI set up for researchers’ needs in our department. It is based on the existing open source, modular and interoperable spatial data architecture geOrchestra.
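
    As a sketch of the kind of OGC-standard data access such an SDI enables, the following requests features from a WFS 2.0 endpoint over plain HTTP; the endpoint, layer name and GeoJSON output format are placeholders (JSON output is a common server extension rather than part of the core standard).

    ```python
    import requests

    # Placeholder WFS endpoint and layer name; a real deployment publishes
    # its own layers through its GetCapabilities document.
    WFS_URL = "http://example.org/geoserver/wfs"

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "watersheds:catchments",
        "outputFormat": "application/json",
        "count": 5,
    }

    resp = requests.get(WFS_URL, params=params, timeout=30)
    resp.raise_for_status()
    for feature in resp.json()["features"]:
        print(feature["id"], feature["properties"])
    ```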

  19. The Open Anatomy Browser: A Collaborative Web-Based Viewer for Interoperable Anatomy Atlases.

    Science.gov (United States)

    Halle, Michael; Demeusy, Valentin; Kikinis, Ron

    2017-01-01

    The Open Anatomy Browser (OABrowser) is an open source, web-based, zero-installation anatomy atlas viewer based on current web browser technologies and evolving anatomy atlas interoperability standards. OABrowser displays three-dimensional anatomical models, image cross-sections of labeled structures and source radiological imaging, and a text-based hierarchy of structures. The viewer includes novel collaborative tools: users can save bookmarks of atlas views for later access and exchange those bookmarks with other users, and dynamic shared views allow groups of users to participate in a collaborative interactive atlas viewing session. We have published several anatomy atlases (an MRI-derived brain atlas and atlases of other parts of the anatomy) to demonstrate OABrowser's functionality. The atlas source data, processing tools, and the source for OABrowser are freely available through GitHub and are distributed under a liberal open source license.

  20. How to ensure sustainable interoperability in heterogeneous distributed systems through architectural approach.

    Science.gov (United States)

    Pape-Haugaard, Louise; Frank, Lars

    2011-01-01

    A major obstacle in ensuring ubiquitous information is the utilization of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without losing the characteristic features of traditional sustainable databases. The approach is firstly to explain traditional architecture in central and homogeneous distributed database computing, followed by a possible approach of using an architectural framework to obtain sustainability across disparate systems, i.e. heterogeneous databases, concluding with a discussion. It is shown that, through a method of using relaxed ACID properties on top of a service-oriented architecture, it is possible to achieve the data consistency which is essential when ensuring sustainable interoperability.
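
    A toy saga-style sketch of the relaxed-ACID idea: each local update commits immediately in its own system, and registered compensations undo earlier steps if a later one fails, so consistency is restored eventually rather than enforced atomically. The record structures and the failure condition are invented for illustration, not taken from the paper.

    ```python
    import logging

    logging.basicConfig(level=logging.INFO, format="%(message)s")

    def transfer_record(ehr_a, ehr_b, record):
        """Move a record from system A to system B without a global transaction:
        each step commits locally and registers a compensating action."""
        compensations = []
        try:
            ehr_a.remove(record)
            compensations.append(lambda: ehr_a.append(record))  # undo step 1
            if record.get("malformed"):
                raise ValueError("target system rejected the record")
            ehr_b.append(record)
        except Exception as exc:
            logging.info("step failed (%s); compensating", exc)
            for undo in reversed(compensations):
                undo()

    ehr_a = [{"id": 1}, {"id": 2, "malformed": True}]
    ehr_b = []
    transfer_record(ehr_a, ehr_b, ehr_a[1])
    print("A:", ehr_a, "B:", ehr_b)  # the failed transfer has been rolled back
    ```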

  1. Data Access, Interoperability and Sustainability: Key Challenges for the Evolution of Science Capabilities

    Science.gov (United States)

    Walton, A. L.

    2015-12-01

    In 2016, the National Science Foundation (NSF) will support a portfolio of activities and investments focused upon challenges in data access, interoperability, and sustainability. These topics are fundamental to science questions of increasing complexity that require multidisciplinary approaches and expertise. Progress has become tractable because of (and sometimes complicated by) unprecedented growth in data (both simulations and observations) and rapid advances in technology (such as instrumentation in all aspects of the discovery process, together with ubiquitous cyberinfrastructure to connect, compute, visualize, store, and discover). The goal is an evolution of capabilities for the research community based on these investments, scientific priorities, technology advances, and policies. Examples from multiple NSF directorates, including investments by the Advanced Cyberinfrastructure Division, are aimed at these challenges and can provide the geosciences research community with models and opportunities for participation. Implications for the future are highlighted, along with the importance of continued community engagement on key issues.

  2. Interoperability and different ways of knowing: How semantics can aid in cross-cultural understanding

    Science.gov (United States)

    Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.

    2012-12-01

    To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have a huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. There is much agreement on construction and content encoding, but there are important regional

  3. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    … elsewhere. To add a further layer of complexity there are also global initiatives providing marine data infrastructures e.g. IOC-IODE, POGO as well as those with a wider remit which includes environmental data e.g. GEOSS, COPERNICUS etc. Ecosystem level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data and Interoperability Platform (ODIP/ODIP II) project brings together those organisations responsible for maintaining selected regional data infrastructures along with other relevant experts in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalties between the regional data infrastructures in order to establish interoperability between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  4. Interoperating AliEn and ARC for a Distributed Tier1 in the Nordic Countries

    CERN Document Server

    Gros, Philippe; Lindemann, Jonas; Saiz, Pablo; Zarochentsev, Andrey

    2011-01-01

    To meet its large computing needs, the ALICE experiment at CERN has developed its own middleware, called AliEn, which is centralised and relies on pilot jobs. One of its strengths is the automatic installation of the required packages. The Nordic countries have offered a distributed Tier-1 centre for the CERN experiments, where job management should be done with the NorduGrid middleware ARC. We have developed an interoperation module that unifies several computing sites using ARC and makes them look like a single site from the point of view of AliEn. A prototype has been completed and tested out of production. This talk presents implementation details of the system and its performance in tests.

  5. Direct2Experts: a pilot national network to demonstrate interoperability among research-networking platforms

    Science.gov (United States)

    Barnett, William; Conlon, Mike; Eichmann, David; Kibbe, Warren; Falk-Krzesinski, Holly; Halaas, Michael; Johnson, Layne; Meeks, Eric; Mitchell, Donald; Schleyer, Titus; Stallings, Sarah; Warden, Michael; Kahlon, Maninder

    2011-01-01

    Research-networking tools use data-mining and social networking to enable expertise discovery, matchmaking and collaboration, which are important facets of team science and translational research. Several commercial and academic platforms have been built, and many institutions have deployed these products to help their investigators find local collaborators. Recent studies, though, have shown the growing importance of multiuniversity teams in science. Unfortunately, the lack of a standard data-exchange model and resistance of universities to share information about their faculty have presented barriers to forming an institutionally supported national network. This case report describes an initiative, which, in only 6 months, achieved interoperability among seven major research-networking products at 28 universities by taking an approach that focused on addressing institutional concerns and encouraging their participation. With this necessary groundwork in place, the second phase of this effort can begin, which will expand the network's functionality and focus on the end users. PMID:22037890

  6. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
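
    As an illustration of the kind of portable description a CWL executor interprets, the following assembles a minimal CommandLineTool document in Python and serializes it with PyYAML; the wrapped tool (wc -l) and the executor invocation in the closing comment are illustrative assumptions.

    ```python
    import yaml  # PyYAML

    # A minimal Common Workflow Language tool description, assembled as a
    # plain Python structure and serialized to YAML.
    tool = {
        "cwlVersion": "v1.0",
        "class": "CommandLineTool",
        "baseCommand": ["wc", "-l"],
        "inputs": {
            "input_file": {"type": "File", "inputBinding": {"position": 1}},
        },
        "outputs": {
            "line_count": {"type": "stdout"},
        },
        "stdout": "line_count.txt",
    }

    with open("count_lines.cwl", "w") as fh:
        yaml.safe_dump(tool, fh, sort_keys=False)

    # Any engine implementing the specification can run the same document
    # unchanged, e.g. (invocation shape is an assumption):
    #   rabix count_lines.cwl --input_file data.txt
    ```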

  7. Evaluating the impact of a service-oriented framework for healthcare interoperability.

    Science.gov (United States)

    Daskalakis, Stylianos; Mantas, John

    2008-01-01

    This paper describes the evaluation of a service-oriented prototype implementation. The prototype development aims to exploit the use of service-oriented concepts for achieving healthcare interoperability, while also attempting to move towards a virtual patient record paradigm. The proposed evaluation strategy investigates the adaptation of the DeLone and McLean model of information systems success with respect to service-oriented implementations. Specific service-oriented and virtual patient record characteristics were empirically encapsulated in the DeLone and McLean model and respective evaluation measures were produced. The proposed theoretical framework was utilized for conducting an empirical study amongst sixty-two participants in order to observe their perceptions with respect to the hypothetical adoption of the prototype framework. The data gathered were analyzed using partial least squares. The generated results highlighted the importance of information quality, whereas system quality did not prove to be a strong significant predictor in the overall model.

  8. Enabling interoperability, accessibility and reusability of virtual patients across Europe - design and implementation.

    Science.gov (United States)

    Zary, Nabil; Hege, Inga; Heid, Jörn; Woodham, Luke; Donkers, Jeroen; Kononowicz, Andrzej A

    2009-01-01

    Virtual Patients (VPs) have successfully been integrated into medical and healthcare curricula for a number of years. Lack of time and resources is a frequently reported problem encountered when developing VPs for teaching and learning. Consequently there is a need for cross-institutional repositories of VPs. The aims of the study were two-fold: to enable interoperability between virtual patient systems and to investigate if (and how) an application profile is implemented in four different types of VP systems. This European collaborative implementation of a blend of several specifications (Medbiquitous VP XML, Medbiquitous Healthcare LOM, and SCORM) is innovative and the study has shown a variation in how the application profile could be implemented.

  9. eHealth integration and interoperability issues: towards a solution through enterprise architecture.

    Science.gov (United States)

    Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred

    2015-01-01

    Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase. This is creating immense pressure on healthcare ICT and HIS to deliver and show the significance of such investments in technology. This study finds that integration and interoperability contribute largely to the failure of ICT and HIS investments in healthcare, pointing to the need for a healthcare architecture for eHealth. The study proposes an eHealth architectural model that accommodates requirements based on healthcare needs, systems, implementers, and hardware. The model is adaptable and examines the developer's and user's views; such systems hold high hopes for their potential to change traditional organizational design, intelligence, and decision-making.

  10. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Directory of Open Access Journals (Sweden)

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  11. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains as a way to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse on an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The unified feature model presented can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. …

  12. DIMP: an interoperable solution for software integration and product data exchange

    Science.gov (United States)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business, and has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  13. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
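
    A toy structural mapping in the spirit of the archetype transformation described above; the section names on both sides are invented for illustration and carry none of the Semantic Web or Model-driven Engineering machinery the paper actually combines.

    ```python
    # Invented top-level section names standing in for the two standards'
    # archetype structures; not the real OpenEHR or ISO EN 13606 models.
    OPENEHR_TO_13606 = {
        "concept": "archetype_concept",
        "definition": "structure",
        "ontology": "terminology",
    }

    def transform(openehr_archetype: dict) -> dict:
        """Rename top-level sections of an OpenEHR-style archetype to the
        ISO EN 13606-style names, leaving unmapped keys untouched."""
        return {OPENEHR_TO_13606.get(k, k): v for k, v in openehr_archetype.items()}

    src = {"concept": "blood_pressure", "definition": {"systolic": "ELEMENT"}, "ontology": {}}
    print(transform(src))
    ```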

  14. A Thermal Simulation Tool for Building and Its Interoperability through the Building Information Modeling (BIM) Platform

    Directory of Open Access Journals (Sweden)

    Christophe Nicolle

    2013-05-01

    Full Text Available This paper describes potential challenges and opportunities for using thermal simulation tools to optimize building performance. After reviewing current trends in thermal simulation, it outlines major criteria for the evaluation of building thermal simulation tools, based on specifications and interoperability capabilities. Details of the workflow of data exchange for multiple thermal analyses, such as BIM-based applications, are discussed. The analysis focuses on selected thermal simulation tools that provide functionality to exchange data with other tools, in order to obtain a picture of their basic working principles and to identify selection criteria for generic thermal tools in BIM. The significance of, and barriers to, integrating design with BIM and building thermal simulation tools are also discussed.

  15. Interoperability and models for exchange of data between information systems in public administration

    Science.gov (United States)

    Glavev, Victor

    2016-12-01

    The types of software applications used by public administrations can be divided into three main groups: document management systems, record management systems and business process systems. Each of them generates outputs that can be used as input data by the others. This is the main reason that exchange of data between these three groups is required, along with well-defined models to be followed; many other reasons are discussed in the paper. Interoperability is a key aspect when these models are implemented, especially when the software applications used by public authorities come from different manufacturers. The report includes examples of the implementation of models for exchange of data between software systems deployed in one of the biggest administrations in Bulgaria.

  16. 3D facial landmarks: Inter-operator variability of manual annotation

    DEFF Research Database (Denmark)

    Fagertun, Jens; Harder, Stine; Rosengren, Anders

    2014-01-01

    Background Manual annotation of landmarks is a known source of variance, which exists in all fields of medical imaging, influencing the accuracy and interpretation of the results. However, the variability of human facial landmarks is only sparsely addressed in the current literature, as opposed to e.g. the research fields of orthodontics and cephalometrics. We present a full facial 3D annotation procedure and a sparse set of manually annotated landmarks, in an effort to reduce operator time and minimize the variance. Method Facial scans from 36 voluntary unrelated blood donors from the Danish Blood Donor Study … in regards to intra-operator and portraits. Results Using a sparse set of landmarks (n=14) that capture the whole face, the dense point mean variance was reduced from 1.92 to 0.54 mm. Conclusion The inter-operator variability was primarily associated with particular landmarks, where more leniently landmarks had …

  17. ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.

    Science.gov (United States)

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2017-02-15

    ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Availability: freely available extension to ImageJ2 (http://imagej.net/Downloads); installation and use instructions are available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. Contact: eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  18. XML and Graphs for Modeling, Integration and Interoperability:a CMS Perspective

    CERN Document Server

    van Lingen, Frank

    2004-01-01

    This thesis reports on a designer's Ph.D. project called “XML and Graphs for Modeling, Integration and Interoperability: a CMS perspective”. The project has been performed at CERN, the European laboratory for particle physics, in collaboration with the Eindhoven University of Technology and the University of the West of England in Bristol. CMS (Compact Muon Solenoid) is a next-generation high energy physics experiment at CERN, which will start running in 2007. The complexity of such a detector used in the experiment and the autonomous groups that are part of the CMS experiment, result in disparate data sources (different in format, type and structure). Users need to access and exchange data located in multiple heterogeneous sources in a domain-specific manner and may want to access a simple unit of information without having to understand details of the underlying schema. Users want to access the same information from several different heterogeneous sources. It is neither desirable nor fea...

  19. Academic Research Library as Broker in Addressing Interoperability Challenges for the Geosciences

    Science.gov (United States)

    Smith, P., II

    2015-12-01

    Data capture is an important process in the research lifecycle. Complete descriptive and representative information of the data or database is necessary during data collection whether in the field or in the research lab. The National Science Foundation's (NSF) Public Access Plan (2015) mandates the need for federally funded projects to make their research data more openly available. Developing, implementing, and integrating metadata workflows into to the research process of the data lifecycle facilitates improved data access while also addressing interoperability challenges for the geosciences such as data description and representation. Lack of metadata or data curation can contribute to (1) semantic, (2) ontology, and (3) data integration issues within and across disciplinary domains and projects. Some researchers of EarthCube funded projects have identified these issues as gaps. These gaps can contribute to interoperability data access, discovery, and integration issues between domain-specific and general data repositories. Academic Research Libraries have expertise in providing long-term discovery and access through the use of metadata standards and provision of access to research data, datasets, and publications via institutional repositories. Metadata crosswalks, open archival information systems (OAIS), trusted-repositories, data seal of approval, persistent URL, linking data, objects, resources, and publications in institutional repositories and digital content management systems are common components in the library discipline. These components contribute to a library perspective on data access and discovery that can benefit the geosciences. The USGS Community for Data Integration (CDI) has developed the Science Support Framework (SSF) for data management and integration within its community of practice for contribution to improved understanding of the Earth's physical and biological systems. The USGS CDI SSF can be used as a reference model to map to Earth

  20. SMART on FHIR: a standards-based, interoperable apps platform for electronic health records.

    Science.gov (United States)

    Mandel, Joshua C; Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B

    2016-09-01

    In early 2010, Harvard Medical School and Boston Children's Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application-programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Health Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
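
    A minimal sketch of the read side of such an app: once the SMART OAuth2 launch sequence has produced an access token, a Patient resource is fetched from the FHIR REST API. The base URL, token and patient ID below are placeholders.

    ```python
    import requests

    # Placeholders: a real SMART on FHIR app obtains the server base URL and
    # an access token through the OAuth2/SMART launch sequence, not hard-coding.
    FHIR_BASE = "https://example.org/fhir"
    ACCESS_TOKEN = "eyJ...example-token"
    PATIENT_ID = "123"

    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/fhir+json",
    }

    # Read the Patient resource; the same call works unmodified against any
    # EHR exposing a conformant FHIR API, which is the point of the platform.
    resp = requests.get(f"{FHIR_BASE}/Patient/{PATIENT_ID}", headers=headers, timeout=30)
    resp.raise_for_status()
    patient = resp.json()
    name = patient.get("name", [{}])[0]
    print(name.get("family"), name.get("given"))
    ```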

  1. Interoperability of Volcano Observation Thematic Core Services with the EPOS Integrated Core Services

    Science.gov (United States)

    Vogfjord, Kristin; Sigurdsson, Sigurdur F.; Reitano, Danilo

    2017-04-01

    The volcano observations community, represented by Volcano Observatories (VO) and Volcano Research Institutions (VRI) participating in The European Plate Observing System (EPOS), will implement services to enable open access to data, data products, software and services (DDSS) from the community. Technical implementation of these services is established within the Volcano Observations Thematic Core Service (VO-TCS), which will coordinate activities among the contributing VOs and VRIs to ensure their interoperability with the EPOS Integrated Core services (ICS). The goal is to implement a service-oriented architecture (SOA) to guarantee interoperability among the different components of the VO-TCS and the EPOS-ICS architecture. This entails linking and harmonizing the technical implementation of the VO-TCS with the EPOS-ICS, defining standards for TCS-ICS interaction and implementing a prototype for a RESTful service (REpresentational State Transfer). The VO-TCS services will also coordinate with services and platforms already developed and implemented within the two Volcano Supersite projects, FUTUREVOLC and MED-SUV and will utilize some of their already established services to enable initial access to the community's products. To prepare for initial implementation in the fall of 2017, a survey among the VO-TCS participants was carried out to evaluate the maturity level of their different products (DDSSs). The specific goal was to obtain a report for each participating institution describing the real cross-reference between each DDSS status and the TCS requirements, as well as to determine the availability of data and metadata for each DDSS and their level of maturity. Data and metadata similarities between the participants highlighted by the survey results are used to reorganize and simplify the list of products to be made available in the VO-TCS. The presentation will give an overview of the planned services in the Volcano Observations TCS and outline the roadmap
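
    A minimal sketch of what a RESTful TCS prototype endpoint could look like, written with Flask; the routes, payload fields and in-memory catalogue are hypothetical, not the project's actual interface.

    ```python
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Hypothetical in-memory catalogue of DDSS entries exposed by a TCS node.
    DDSS = [
        {"id": "seismicity-catalogue", "type": "dataset", "maturity": "mature"},
        {"id": "gas-flux-timeseries", "type": "dataset", "maturity": "prototype"},
    ]

    @app.route("/api/v1/ddss", methods=["GET"])
    def list_ddss():
        """List the data, data products, software and services on offer."""
        return jsonify(DDSS)

    @app.route("/api/v1/ddss/<ddss_id>", methods=["GET"])
    def get_ddss(ddss_id):
        """Return the metadata record for one DDSS entry."""
        for entry in DDSS:
            if entry["id"] == ddss_id:
                return jsonify(entry)
        return jsonify({"error": "not found"}), 404

    if __name__ == "__main__":
        app.run(port=5000)
    ```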

  2. Designing a Distributed Space Systems Simulation in Accordance with the Simulation Interoperability Standards Organization (SISO)

    Science.gov (United States)

    Cowen, Benjamin

    2011-01-01

    Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers in order to understand the details and complications of a desired mission. A standard development simulation package known as Trick is used to develop source code to model a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution and to develop source code for communication between federates, as well as to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Participants communicated throughout the semester to work out how to create this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant), as well as transporting materials to desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will engineer their own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper focuses on the Lunar Rover federate.

  3. The development of a nursing subset of patient problems to support interoperability.

    Science.gov (United States)

    Kieft, R A M M; Vreeke, E M; de Groot, E M; Volkert, P A; Francke, A L; Delnoij, D M J

    2017-12-04

    Since the emergence of electronic health records, nursing information is increasingly being recorded and stored digitally. Several studies have shown that a wide range of nursing information is not interoperable and cannot be re-used in different health contexts. Difficulties arise when nurses share information with others involved in the delivery of nursing care. The aim of this study is to develop a nursing subset of patient problems that are prevalent in nursing practice, based on the SNOMED CT terminology to assist in the exchange and comparability of nursing information. Explorative qualitative focus groups were used to collect data. Mixed focus groups were defined. Additionally, a nursing researcher and a nursing expert with knowledge of terminologies and a terminologist participated in each focus group. The participants, who work in a range of practical contexts, discussed and reviewed patient problems from various perspectives. Sixty-seven participants divided over seven focus groups selected and defined 119 patient problems. Each patient problem could be documented and coded with a current status or an at-risk status. Sixty-six percent of the patient problems included are covered by the definitions established by the International Classification of Nursing Practice, the reference terminology for nursing practice. For the remainder, definitions from either an official national guideline or a classification were used. Each of the 119 patient problems has a unique SNOMED CT identifier. To support the interoperability of nursing information, a national nursing subset of patient problems based on a terminology (SNOMED CT) has been developed. Using unambiguously defined patient problems is beneficial for clinical nursing practice, because nurses can then compare and exchange information from different settings. A key strength of this study is that nurses were extensively involved in the development process. Further research is required to link or associate
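
    A sketch of how such a subset could be represented in software, with each patient problem carrying a terminology identifier and a current/at-risk status; the concept identifiers below are placeholders, not the subset's actual SNOMED CT codes.

    ```python
    from dataclasses import dataclass
    from enum import Enum

    class Status(Enum):
        CURRENT = "current"
        AT_RISK = "at-risk"

    @dataclass(frozen=True)
    class PatientProblem:
        label: str
        snomed_ct_id: str  # placeholder identifiers, not the subset's real codes
        status: Status

    # Two illustrative entries in the style of the 119-problem subset.
    problems = [
        PatientProblem("Impaired skin integrity", "0000001", Status.CURRENT),
        PatientProblem("Risk of falling", "0000002", Status.AT_RISK),
    ]

    for p in problems:
        print(f"{p.snomed_ct_id}: {p.label} [{p.status.value}]")
    ```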

  4. NASA's Earth Observing Data and Information System - Supporting Interoperability through a Scalable Architecture (Invited)

    Science.gov (United States)

    Mitchell, A. E.; Lowe, D. R.; Murphy, K. J.; Ramapriyan, H. K.

    2013-12-01

    Initiated in 1990, NASA's Earth Observing System Data and Information System (EOSDIS) is currently a petabyte-scale archive of data designed to receive, process, distribute and archive several terabytes of science data per day from NASA's Earth science missions. Comprising 12 discipline-specific data centers collocated with centers of science discipline expertise, EOSDIS manages over 6800 data products from many science disciplines and sources. NASA supports global climate change research by providing scalable open application layers to the EOSDIS distributed information framework. This allows many other value-added services to access NASA's vast Earth science collection and allows EOSDIS to interoperate with data archives from other domestic and international organizations. EOSDIS is committed to NASA's data policy of full and open sharing of Earth science data. As metadata is used in all aspects of NASA's Earth science data lifecycle, EOSDIS provides a spatial and temporal metadata registry and order broker called the EOS Clearing House (ECHO), which allows efficient search and access of cross-domain data and services through the Reverb Client and Application Programmer Interfaces (APIs). Another core metadata component of EOSDIS is NASA's Global Change Master Directory (GCMD), which represents more than 25,000 Earth science data set and service descriptions from all over the world, covering subject areas within the Earth and environmental sciences. With inputs from the ECHO, GCMD and Soil Moisture Active Passive (SMAP) mission metadata models, EOSDIS is developing a NASA ISO 19115 Best Practices Convention. Adoption of an international metadata standard enables a far greater level of interoperability among national and international data products. NASA recently concluded a 'Metadata Harmony Study' of the EOSDIS metadata capabilities/processes of ECHO and the GCMD, to evaluate opportunities for improved data access and use, reduce

  5. Community-Driven Initiatives to Achieve Interoperability for Ecological and Environmental Data

    Science.gov (United States)

    Madin, J.; Bowers, S.; Jones, M.; Schildhauer, M.

    2007-12-01

    Advances in ecology and environmental science increasingly depend on information from multiple disciplines to tackle broader and more complex questions about the natural world. Such advances, however, are hindered by data heterogeneity, which impedes the ability of researchers to discover, interpret, and integrate relevant data that have been collected by others. Here, we outline two community-building initiatives for improving data interoperability in the ecological and environmental sciences, one that is well-established (the Ecological Metadata Language [EML]), and another that is actively underway (a unified model for observations and measurements). EML is a metadata specification developed for the ecology discipline, based on prior work done by the Ecological Society of America and associated efforts to ensure a modular and extensible framework for documenting ecological data. EML "modules" are designed to describe one logical part of the total metadata that should be included with any ecological dataset. EML was developed through a series of working meetings and ongoing discussion forums and email lists, with participation from a broad range of ecological and environmental scientists as well as computer scientists and software developers. Where possible, EML adopted syntax from metadata standards in other disciplines (e.g., Dublin Core and the Content Standard for Digital Geospatial Metadata). Although EML has not yet been ratified through a standards body, it has become the de facto metadata standard for a large range of ecological data management projects, including the Long Term Ecological Research Network, the National Center for Ecological Analysis and Synthesis, and the Ecological Society of America. The second community-building initiative is based on work through the Science Environment for Ecological Knowledge (SEEK) as well as a recent workshop on multi-disciplinary data management. This initiative aims at improving
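
    To make the modular structure concrete, here is a schematic sketch of an EML-style dataset description built with Python's standard library. The element names follow EML's general shape (dataset, title, creator), but the document is illustrative and the package identifier is a placeholder, not a validated EML record.

        # Schematic sketch of an EML-style dataset description, built with the
        # standard library. Element names follow EML's general shape, but this
        # is illustrative, not a validated EML document.
        import xml.etree.ElementTree as ET

        eml = ET.Element("eml:eml", {
            "xmlns:eml": "eml://ecoinformatics.org/eml-2.1.1",  # EML 2.1.1 namespace
            "packageId": "doi:10.xxxx/example",                 # placeholder identifier
            "system": "knb",
        })
        dataset = ET.SubElement(eml, "dataset")
        ET.SubElement(dataset, "title").text = "Example stream chemistry observations"
        creator = ET.SubElement(dataset, "creator")
        name = ET.SubElement(creator, "individualName")
        ET.SubElement(name, "surName").text = "Researcher"

        print(ET.tostring(eml, encoding="unicode"))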

  6. Building an Interoperability Test System for Electric Vehicle Chargers Based on ISO/IEC 15118 and IEC 61850 Standards

    Directory of Open Access Journals (Sweden)

    Minho Shin

    2016-05-01

    The electric vehicle market is rapidly growing due to its environmental friendliness and governmental support. As electric vehicles are powered by electricity, the interoperability between the vehicles and the chargers made by multiple vendors is crucial for the success of the technology. Relevant standards are being published, but the methods for conformance testing need to be developed. In this paper, we present our conformance test system for the electric vehicle charger in accordance with the standards ISO/IEC 15118, IEC 61851 and IEC 61850-90-8. Our test system leverages the TTCN-3 framework for its flexibility and productivity. We evaluate the test system by lab tests with two reference chargers that we built. We also present the test results from two international testival events for ISO/IEC 15118 interoperability. We confirmed that our test system is robust, efficient and practical.
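
    The authors' test system is written in TTCN-3; purely as an analogy, the sketch below shows in Python what a single conformance check might look like. The message and field names are hypothetical and are not taken from the ISO/IEC 15118 specification.

        # Rough analogy of a single conformance check; the actual test system
        # described above is written in TTCN-3. Message and field names here
        # are hypothetical, not taken from ISO/IEC 15118.
        def check_session_setup_response(response: dict) -> list:
            """Return a list of conformance violations for a session-setup reply."""
            violations = []
            if response.get("responseCode") != "OK":
                violations.append("responseCode must be OK for a valid request")
            if "evseID" not in response:
                violations.append("evseID is a required field")
            if "timestamp" not in response:
                violations.append("timestamp is a required field")
            return violations

        # Simulated charger reply (hypothetical):
        reply = {"responseCode": "OK", "evseID": "EVSE-001"}
        for v in check_session_setup_response(reply):
            print("FAIL:", v)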

  7. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i a workflow to annotate 100,000 sequences from an invertebrate species; ii an integrated system for analysis of the transcription factor binding sites (TFBSs enriched based on differential gene expression data obtained from a microarray experiment; iii a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i the absence of several useful data or analysis functions in the Web service "space"; ii the lack of documentation of methods; iii lack of

  8. Bringing it All Together: NODC's Geoportal Server as an Integration Tool for Interoperable Data Services

    Science.gov (United States)

    Casey, K. S.; Li, Y.

    2011-12-01

    The US National Oceanographic Data Center (NODC) has implemented numerous interoperable data technologies in recent years to enhance the discovery, understanding, and use of the vast quantities of data in the NODC archives. These services include OPeNDAP's Hyrax server, Unidata's THREDDS Data Server (TDS), NOAA's Live Access Server (LAS), and most recently the ESRI ArcGIS Server. Combined, these technologies enable NODC to provide access to its data holdings and products through most of the commonly used standardized web services like the Data Access Protocol (DAP) and the Open Geospatial Consortium suite of services such as the Web Map Service (WMS) and Web Coverage Service (WCS). Despite the strong demand for and use of these services, the acronym-rich environment of services can also result in confusion for producers of data to the NODC archives, for consumers of data from the NODC archives, and for the data stewards at the archives as well. The situation is further complicated by the fact that NODC also maintains some ad hoc services like WODselect, and that not all services can be applied to all of the tens of thousands of collections in the NODC archive; where once every data set was available only through FTP and HTTP servers, now many are also available from the LAS, TDS, Hyrax, and ArcGIS Server. To bring order and clarity to this potentially confusing collection of services, NODC deployed the Geoportal Server into its Archive Management System as an integrating technology that brings together its various data access, visualization, and discovery services as well as its overall metadata management workflows. While providing an enhanced web-based interface for more integrated human-to-machine discovery and access, the deployment also enables NODC for the first time to support a robust set of machine-to-machine discovery services such as the Catalog Service for the Web (CS/W), OpenSearch, and Search and Retrieval via URL (SRU). This approach allows NODC
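
    A flavor of the machine-to-machine discovery that a Geoportal Server deployment enables: an OGC CS/W GetCapabilities request is a plain HTTP call with standard key-value parameters. The endpoint below is a placeholder, not NODC's actual service address.

        # Hedged sketch of a machine-to-machine discovery call against a CS/W
        # endpoint such as the one Geoportal Server exposes. The host name is
        # a placeholder and will not resolve.
        import urllib.parse
        import urllib.request

        base = "https://example.gov/geoportal/csw"  # placeholder endpoint
        params = {
            "service": "CSW",
            "version": "2.0.2",
            "request": "GetCapabilities",  # standard OGC CSW operation
        }
        url = base + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:  # returns the capabilities XML
            print(resp.read(200))                  # first 200 bytes of the response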

  9. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results by making it difficult to determine the specifics of the intended meanings of terms such as "deforestation" or "carbon flux" as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might just include CO2 emissions, but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO-foundry ontologies such as ENVO and BCO; the ISO/OGC O&M; and the NSF Earthcube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own
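
    A minimal sketch of the approach, assuming a hypothetical annotation property and a placeholder term identifier: RDF (here via the rdflib Python library) can bind a dataset's ambiguous label, such as "percent cover of forest", to an explicit ontology concept.

        # Hedged sketch: pinning an ambiguous dataset variable to an explicit
        # ontology concept with RDF. The namespaces, the annotation property,
        # and the term identifier are placeholders, not real ENVO terms.
        from rdflib import Graph, Namespace, Literal
        from rdflib.namespace import RDFS

        EX = Namespace("http://example.org/dataset/")        # placeholder namespace
        OBO = Namespace("http://purl.obolibrary.org/obo/")   # OBO-style namespace

        g = Graph()
        var = EX.percent_forest_cover
        g.add((var, RDFS.label, Literal("percent cover of forest")))
        # Placeholder term identifier; a real annotation would use the actual
        # ENVO class IRI for the intended "forest" concept.
        g.add((var, EX.denotesConcept, OBO.ENVO_00000111))
        print(g.serialize(format="turtle"))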

  10. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    Science.gov (United States)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms (ships, surface floats, profiling floats, tide gauges, etc.) into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data are served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that they can be accessed in web browsers and popular desktop analysis tools. We will also discuss our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it is important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. We'll also delve into how we
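
    The ERDDAP access pattern mentioned above is URL-based: a dataset, an output format, and variable/time constraints are all expressed in one request. The server address and dataset ID below are placeholders, not the OSMC's actual values.

        # Sketch of ERDDAP's URL-based "tabledap" request pattern, which is
        # what makes machine-to-machine access straightforward. Server and
        # dataset ID are placeholders.
        import urllib.request

        server = "https://example.gov/erddap"     # placeholder ERDDAP server
        dataset_id = "osmc_realtime"              # placeholder dataset ID
        query = "time,latitude,longitude,sst&time>=2013-01-01"  # variables + constraint
        url = f"{server}/tabledap/{dataset_id}.csv?{query}"

        with urllib.request.urlopen(url) as resp:  # CSV comes back over plain HTTP
            print(resp.read(300))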

  11. The GIIDA (Management of the CNR Environmental Data for Interoperability) project

    Science.gov (United States)

    Nativi, S.

    2009-04-01

    This work presents the GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali del CNR) inter-departmental project of the Italian National Research Council (CNR). The project is an initiative of the Earth and Environment Department (Dipartimento Terra e Ambiente) of the CNR. GIIDA's mission is "to implement the Spatial Information Infrastructure (SII) of CNR for Environmental and Earth Observation data". The project aims to design and develop a multidisciplinary cyber-infrastructure for the management, processing and evaluation of Earth and environmental data. This infrastructure will contribute to the Italian presence in international projects and initiatives, such as: INSPIRE, GMES, GEOSS and SEIS. The main GIIDA goals are: • Networking: To create a network of CNR Institutes for implementing a common information space and sharing spatial resources. • Observation: Re-engineering the environmental observation system of CNR. • Modeling: Re-engineering the environmental modeling system of CNR. • Processing: Re-engineering the environmental processing system of CNR. • Mediation: To define mediation methods and instruments for implementing the international interoperability standards. The project started in July 2008, releasing a specification document of the GIIDA architecture for interoperability and security. Based on these documents, a Call for Proposals was issued in September 2008. GIIDA received 23 proposed pilots from 16 different Institutes belonging to five CNR Departments and from 15 non-CNR Institutions (e.g. three Italian regional administrations, three national research centers, four universities, and some SMEs). These pilots were divided into thematic areas. In fact, GIIDA considers seven main thematic areas/domains: • Biodiversity; • Climate Changes; • Air Quality; • Soil and Water Quality; • Risks; • Infrastructures for Research and Public Administrations; • Sea and Marine resources. Each of these thematic areas is covered by a

  12. NetCDF-CF-OPeNDAP: Standards for ocean data interoperability and object lessons for community data standards processes

    Science.gov (United States)

    Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef

    2010-01-01

    It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. These standards taken together have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products and climatological analyses, and they are gaining rapid acceptance for ocean observations. We will provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast this path with the one by which the World Meteorological Organization (WMO) has advanced the Global Telecommunications System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom-up" standards process, whereas GTS is "top-down". Both are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally we draw general conclusions regarding the factors that affect the success of a standards development effort: the likelihood that an IT standard will meet its design goals and will achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by the scientists, program managers and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community to
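
    A minimal sketch of the interoperability payoff this trio delivers, assuming a placeholder OPeNDAP URL: a remote CF-described dataset opens like a local file, and only the requested subset travels over the network.

        # Minimal sketch of the payoff described above: with netCDF/CF/OPeNDAP,
        # a remote dataset opens like a local file. The URL is a placeholder.
        from netCDF4 import Dataset  # pip install netCDF4

        url = "https://example.org/opendap/sst_analysis"  # placeholder OPeNDAP endpoint
        ds = Dataset(url)                  # open the remote dataset via OPeNDAP
        sst = ds.variables["sst"]          # CF conventions name and describe variables
        print(sst.units, sst.shape)        # units assumed present per CF metadata
        subset = sst[0, :10, :10]          # only this slab is fetched over the wire
        ds.close()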

  13. Building an Interoperability Test System for Electric Vehicle Chargers Based on ISO/IEC 15118 and IEC 61850 Standards

    OpenAIRE

    Minho Shin; Hwimin Kim; Hyoseop Kim; Hyuksoo Jang

    2016-01-01

    The electric vehicle market is rapidly growing due to its environmental friendliness and governmental support. As electric vehicles are powered by electricity, the interoperability between the vehicles and the chargers made by multiple vendors is crucial for the success of the technology. Relevant standards are being published, but the methods for conformance testing need to be developed. In this paper, we present our conformance test system for the electric vehicle charger in accordance with...

  14. Electronic Health Records: VA and DOD Need to Support Cost and Schedule Claims, Develop Interoperability Plans, and Improve Collaboration

    Science.gov (United States)

    2014-02-01

    conform to interoperability standards, they can be created, managed, and consulted by authorized clinicians and staff across more than one health care...capabilities to be delivered were those supporting laboratory, anatomic pathology, pharmacy, and immunizations. In addition, the initiative was to deliver...they included language related to the office having budgetary control over the iEHR program. For example, this charter gave the IPO Director the

  15. The NASA Scientific and Technical Information (STI) Program's Implementation of Open Archives Initiation (OAI) for Data Interoperability and Data Exchange

    Science.gov (United States)

    Rocker, JoAnne; Roncaglia, George J.; Heimerl, Lynn N.; Nelson, Michael L.

    2002-01-01

    Interoperability and data exchange are critical for the survival of government information management programs. E-government initiatives are transforming the way the government interacts with the public, and more information is to be made available through web-enabled technologies. Programs such as NASA's Scientific and Technical Information (STI) Program Office are tasked with finding more effective ways to disseminate information to the public. The NASA STI Program is an agency-wide program charged with gathering, organizing, storing, and disseminating NASA-produced information for research and public use. The program is investigating the use of a new protocol called the Open Archives Initiative (OAI) as a means to improve data interoperability and data collection. OAI promotes the use of the OAI harvesting protocol as a simple way to share data among repositories. In two separate initiatives, the STI Program is implementing OAI. In collaboration with the Air Force, Department of Energy, and Old Dominion University, the NASA STI Program has funded research on implementing OAI to exchange data between the three organizations. The second initiative is the deployment of OAI for the NASA technical report server (TRS) environment. The NASA TRS environment is comprised of distributed technical report servers with a centralized search interface. This paper focuses on the implementation of OAI to promote interoperability among diverse data repositories.
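
    OAI-PMH harvesting, as described above, amounts to plain HTTP requests built from a handful of standard verbs. The sketch below issues a ListRecords request for Dublin Core metadata; the repository base URL is a placeholder, not an actual NASA TRS address.

        # Sketch of the OAI harvesting protocol in action: metadata is pulled
        # with plain HTTP requests using standard verbs. The base URL is a
        # placeholder repository address.
        import urllib.parse
        import urllib.request

        base = "https://example.gov/oai"  # placeholder OAI-PMH repository
        params = {
            "verb": "ListRecords",        # standard OAI-PMH verb
            "metadataPrefix": "oai_dc",   # unqualified Dublin Core, required by OAI-PMH
        }
        url = base + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            print(resp.read(300))         # XML envelope with records and resumptionToken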

  16. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    Science.gov (United States)

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard, the ISO/IEEE 11073 Personal Health Device (X73-PHD) standards, addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standards, however, have not addressed security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography, with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and the use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirements. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time needed by the direct implementation. Moreover, analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software package that implements the X73-PHD standard. Lastly, a security analysis was performed: the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against them.
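
    As an illustration of the signature scheme described (not of the paper's actual X73-PHD integration or ESM firmware), the following sketch uses the Python cryptography package to run one RSA sign/verify cycle with a 2048-bit key and a hypothetical challenge message.

        # Illustrative sketch of an RSA digital-signature cycle using the
        # Python "cryptography" package; it shows sign/verify only, not the
        # paper's X73-PHD integration. The challenge message is hypothetical.
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        device_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        message = b"agent-id:thermometer-01|nonce:42"  # hypothetical challenge

        signature = device_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

        # The manager verifies with the device's public key; an invalid
        # signature raises cryptography.exceptions.InvalidSignature.
        device_key.public_key().verify(signature, message,
                                       padding.PKCS1v15(), hashes.SHA256())
        print("device authenticated")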

  17. Archetype-based electronic health records: a literature review and evaluation of their applicability to health data interoperability and access.

    Science.gov (United States)

    Wollersheim, Dennis; Sari, Anny; Rahayu, Wenny

    Health Information Managers (HIMs) are responsible for overseeing health information. The change management necessary during the transition to electronic health records (EHR) is substantial and ongoing. Archetype-based EHRs are a core health information system component which solves many of the problems that arise during this period of change. Archetypes are models of clinical content, and they have many beneficial properties. They are interoperable, both between settings and through time. They are more amenable to change than conventional paradigms, and their design is congruent with clinical practice. This paper is an overview of the current archetype literature relevant to Health Information Managers. The literature was sourced from the English-language sections of ScienceDirect, IEEE Xplore, PubMed, Google Scholar, the ACM Digital Library and other databases, covering the usage of archetypes for electronic health record storage and looking at the current areas of archetype research, appropriate usage, and future research. We also used reference lists from the cited papers, papers referenced by the openEHR website, and the recommendations of experts in the area. Criteria for inclusion were that (a) a study covered archetype research and (b) it was a study of archetype use, archetype system design, or archetype effectiveness. The 47 papers included show wide and increasing worldwide archetype usage, in a variety of medical domains. Most of the papers noted that archetypes are an appropriate solution for future-proof and interoperable medical data storage. We conclude that archetypes are a suitable solution for the complex problem of electronic health record storage and interoperability.

  18. Meeting People’s Needs in a Fully Interoperable Domotic Environment

    Directory of Open Access Journals (Sweden)

    Vittorio Miori

    2012-05-01

    The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users’ quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today’s incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.

  19. Interoperability between biomedical ontologies through relation expansion, upper-level ontologies and automatic reasoning.

    Directory of Open Access Journals (Sweden)

    Robert Hoehndorf

    Researchers design ontologies as a means to accurately annotate and integrate experimental data across heterogeneous and disparate data and knowledge bases. Formal ontologies make the semantics of terms and relations explicit such that automated reasoning can be used to verify the consistency of knowledge. However, many biomedical ontologies do not sufficiently formalize the semantics of their relations and are therefore limited with respect to automated reasoning for large scale data integration and knowledge discovery. We describe a method to improve automated reasoning over biomedical ontologies and identify several thousand contradictory class definitions. Our approach aligns terms in biomedical ontologies with foundational classes in a top-level ontology and formalizes composite relations as class expressions. We describe the semi-automated repair of contradictions and demonstrate expressive queries over interoperable ontologies. Our work forms an important cornerstone for data integration, automatic inference and knowledge discovery based on formal representations of knowledge. Our results and analysis software are available at http://bioonto.de/pmwiki.php/Main/ReasonableOntologies.

  20. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    Science.gov (United States)

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-01-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds. PMID:27367695
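
    A minimal sketch of the underlying idea, with placeholder vocabulary and resource IRIs rather than the project's actual information models: once sensors from different testbeds are described in a shared RDF vocabulary, a single SPARQL query spans the federation.

        # Hedged sketch: sensors from two testbeds described in one RDF model
        # and queried uniformly with SPARQL (via rdflib). All IRIs below are
        # placeholders, not the project's actual models.
        from rdflib import Graph, Namespace, Literal

        EX = Namespace("http://example.org/iot/")
        g = Graph()

        # Two sensors from two hypothetical federated testbeds:
        g.add((EX.sensorA, EX.observes, Literal("temperature")))
        g.add((EX.sensorA, EX.partOfTestbed, EX.testbed1))
        g.add((EX.sensorB, EX.observes, Literal("temperature")))
        g.add((EX.sensorB, EX.partOfTestbed, EX.testbed2))

        # One query spans both testbeds because they share the same vocabulary:
        q = """SELECT ?s ?t WHERE {
                 ?s <http://example.org/iot/observes> "temperature" ;
                    <http://example.org/iot/partOfTestbed> ?t . }"""
        for sensor, testbed in g.query(q):
            print(sensor, "from", testbed)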

  1. Agile Management and Interoperability Testing of SDN/NFV‐Enriched 5G Core Networks

    Directory of Open Access Journals (Sweden)

    Taesang Choi

    2018-02-01

    In the fifth generation (5G) era, the radio internet protocol capacity is expected to reach 20 Gb/s per sector, and ultralarge content traffic will travel across a faster wireless/wireline access network and packet core network. Moreover, the massive and mission‐critical Internet of Things is the main differentiator of 5G services. These types of real‐time and large‐bandwidth‐consuming services require a radio latency of less than 1 ms and an end‐to‐end latency of less than a few milliseconds. By distributing 5G core nodes closer to cell sites, the backhaul traffic volume and latency can be significantly reduced by having mobile devices download content immediately from a closer content server. In this paper, we propose a novel solution based on software‐defined network and network function virtualization technologies in order to achieve agile management of 5G core network functionalities, with a proof‐of‐concept implementation targeted for the PyeongChang Winter Olympics, and describe the results of interoperability testing experiences between two core networks.

  2. Enhancing Interoperability and Capabilities of Earth Science Data using the Observations Data Model 2 (ODM2

    Directory of Open Access Journals (Sweden)

    Leslie Hsu

    2017-02-01

    Earth Science researchers require access to integrated, cross-disciplinary data in order to answer critical research questions. Partially due to these science drivers, it is common for disciplinary data systems to expand from their original scope in order to accommodate collaborative research. The result is multiple disparate databases with overlapping but incompatible data. In order to enable more complete data integration and analysis, the Observations Data Model Version 2 (ODM2) was developed as a general information model, with one of its major goals being to integrate data collected by in situ sensors with data from ex situ analyses of field specimens. Four use cases with different science drivers and disciplines have adopted ODM2 because of benefits to their users. The disciplines behind the four cases are diverse: hydrology, rock geochemistry, soil geochemistry, and biogeochemistry. For each case, we outline the benefits, challenges, and rationale for adopting ODM2. In each case, the decision to implement ODM2 was made to increase interoperability and expand data and metadata capabilities. One of the common benefits was the ability to use the flexible handling and comprehensive description of specimens and data collection sites in ODM2’s sampling feature concept. We also summarize best practices for implementing ODM2 based on the experience of these initial adopters. The descriptions here should help other potential adopters of ODM2 implement their own instances or modify ODM2 to suit their needs.
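
    A drastically simplified sketch of the pattern that makes this work: in ODM2, results attach to sampling features, so an in situ monitoring site and an ex situ specimen are described uniformly. The table and column names below are simplified placeholders, not the real ODM2 schema.

        # Drastically simplified sketch of ODM2's central pattern: results
        # attach to sampling features, describing in situ sites and ex situ
        # specimens uniformly. Names are simplified, not the real schema.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE sampling_features (
            id INTEGER PRIMARY KEY,
            feature_type TEXT,   -- e.g. 'site' (in situ) or 'specimen' (ex situ)
            code TEXT
        );
        CREATE TABLE results (
            id INTEGER PRIMARY KEY,
            sampling_feature_id INTEGER REFERENCES sampling_features(id),
            variable TEXT,
            value REAL,
            unit TEXT
        );
        """)
        db.execute("INSERT INTO sampling_features VALUES (1, 'site', 'RiverMouth')")
        db.execute("INSERT INTO sampling_features VALUES (2, 'specimen', 'Core-17A')")
        db.execute("INSERT INTO results VALUES (1, 1, 'temperature', 14.2, 'degC')")
        db.execute("INSERT INTO results VALUES (2, 2, 'SiO2', 61.5, 'percent')")

        rows = db.execute("""SELECT sf.feature_type, sf.code, r.variable, r.value, r.unit
                             FROM results r JOIN sampling_features sf
                             ON r.sampling_feature_id = sf.id""")
        for row in rows:
            print(row)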

  3. Interoperability in digital electrocardiography: harmonization of ISO/IEEE x73-PHD and SCP-ECG.

    Science.gov (United States)

    Trigo, Jesús D; Chiarugi, Franco; Alesanco, Alvaro; Martínez-Espronceda, Miguel; Serrano, Luis; Chronaki, Catherine E; Escayola, Javier; Martínez, Ignacio; García, José

    2010-11-01

    The ISO/IEEE 11073 (x73) family of standards is a reference frame for medical device interoperability. A draft for an ECG device specialization (ISO/IEEE 11073-10406-d02) has already been presented to the Personal Health Device (PHD) Working Group, and the Standard Communications Protocol for Computer-Assisted ElectroCardioGraphy (SCP-ECG) Standard for short-term diagnostic ECGs (EN1064:2005+A1:2007) has recently been approved as part of the x73 family (ISO 11073-91064:2009). These factors suggest the coordinated use of these two standards in foreseeable telecardiology environments, and hence the need to harmonize them. Such harmonization is the subject of this paper. Thus, a mapping of the mandatory attributes defined in the second draft of the ISO/IEEE 11073-10406-d02 and the minimum SCP-ECG fields is presented, and various other capabilities of the SCP-ECG Standard (such as the messaging part) are also analyzed from an x73-PHD point of view. As a result, this paper addresses and analyzes the implications of some inconsistencies in the coordinated use of these two standards. Finally, a proof-of-concept implementation of the draft x73-PHD ECG device specialization is presented, along with the conversion from x73-PHD to SCP-ECG. This paper, therefore, provides recommendations for future implementations of telecardiology systems that are compliant with both x73-PHD and SCP-ECG.

  4. MO-AB-204-00: Interoperability in Radiation Oncology: IHE-RO Committee Update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    You’ve experienced the frustration: vendor A’s device claims to work with vendor B’s device, but the practice doesn’t match the promise. Getting devices working together is the hidden art that Radiology and Radiation Oncology staff have to master. To assist with that difficult process, the Integrating the Healthcare Enterprise (IHE) effort was established in 1998, under the coordination of the Radiological Society of North America. IHE is a consortium of healthcare professionals and industry partners focused on improving the way computer systems interconnect and exchange information. This is done by coordinating the use of published standards like DICOM and HL7. Several clinical and operational IHE domains exist in the healthcare arena, including Radiology and Radiation Oncology. The ASTRO-sponsored IHE Radiation Oncology (IHE-RO) domain focuses on radiation oncology-specific information exchange. This session will explore the IHE Radiology and IHE-RO processes for: soliciting new profiles; improving the way computer systems interconnect and exchange information in the healthcare enterprise; supporting interconnectivity descriptions and proof of adherence by vendors; testing and assuring vendor solutions to connectivity problems; and including IHE profiles in RFPs for future software and hardware purchases. Learning Objectives: Understand IHE's role in improving interoperability in health care. Understand the process of profile development and implementation. Understand how vendors prove adherence to IHE-RO profiles. S. Hadley, ASTRO Supported Activity.

  5. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    Directory of Open Access Journals (Sweden)

    A. Anil Sinaci

    2015-01-01

    Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access to and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases.

  6. The Value of Data and Metadata Standardization for Interoperability in Giovanni

    Science.gov (United States)

    Smit, C.; Hegde, M.; Strub, R. F.; Bryant, K.; Li, A.; Petrenko, M.

    2017-12-01

    Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use. Finding and working with dimensions can be convoluted with the CF Conventions. Furthermore, the CF Conventions are silent on machine-friendly descriptive metadata such as the parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be easily found. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about parameters both during parameter selection and after visualization. This poster gives examples of how our metadata and data standards, both external and internal, have both simplified our code base and improved our users' experiences.

  7. W2E - Wellness Warehouse Engine for Semantic Interoperability of Consumer Health Data.

    Science.gov (United States)

    Honko, Harri; Andalibi, Vafa; Aaltonen, Timo; Parak, Jakub; Saaranen, Mika; Viik, Jari; Korhonen, Ilkka

    2016-11-01

    Novel health monitoring devices and applications allow consumers easy and ubiquitous ways to monitor their health status. However, technologies from different providers lack both technical and semantic interoperability, and hence the resulting health data are often deeply tied to a specific service, which limits their reusability and utilization in different services. We have designed a Wellness Warehouse Engine (W2E) that bridges this gap and enables seamless exchange of data between different services. W2E provides interfaces to various data sources and makes data available to other services via a unified representational state transfer (REST) application programming interface. Importantly, it includes Unifier, an engine that transforms input data into generic units reusable by other services, and Analyzer, an engine that allows advanced analysis of input data, such as combining different data sources into new output parameters. In this paper, we describe the architecture of W2E and demonstrate its applicability by using it to unify data from four consumer activity trackers, with a test base of 20 subjects each carrying out three different tracking sessions. Finally, we discuss the challenges of building a scalable Unifier engine for the ever-growing number of new devices.
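
    A conceptual sketch of what a Unifier-style engine does, with hypothetical vendor record formats: each adapter maps a provider-specific payload onto one generic representation and unit system.

        # Conceptual sketch of a Unifier-style engine: vendor-specific records
        # are mapped onto one generic representation and unit system. Field
        # names and vendor formats here are hypothetical.
        def unify(vendor: str, record: dict) -> dict:
            """Normalize a vendor record to a generic (metric) representation."""
            if vendor == "tracker_a":    # hypothetical vendor reporting kilometers
                return {"steps": record["steps"],
                        "distance_m": record["km"] * 1000.0}
            if vendor == "tracker_b":    # hypothetical vendor reporting miles
                return {"steps": record["stepCount"],
                        "distance_m": record["miles"] * 1609.344}
            raise ValueError(f"no adapter for vendor {vendor!r}")

        print(unify("tracker_a", {"steps": 8042, "km": 6.1}))
        print(unify("tracker_b", {"stepCount": 8042, "miles": 3.79}))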

  8. MediCoordination: a practical approach to interoperability in the Swiss health system.

    Science.gov (United States)

    Müller, Henning; Schumacher, Michael; Godel, David; Omar, Abu Khaled; Mooser, Francois; Ding, Sandrine

    2009-01-01

    Interoperability and data exchange between partners in the health sector is seen as one of the important domains that can improve care processes and, in the long run, also decrease the costs of the health care system. Data exchange can ensure that the data on a patient are as complete as possible, avoiding potential mistreatment, and it can avoid duplicate examinations if the required data are already available. On the other hand, health data is a sensitive matter for many people, and strong protections need to be implemented to guard patient data against misuse, along with tools that let patients manage their own data. Many countries have eHealth initiatives in preparation or already implemented. However, health data exchange on a large scale still has a fairly long way to go, as the political processes for global solutions are often complicated. The MediCoordination project takes a pragmatic approach, trying to integrate several partners in health care on a regional scale. In parallel with the Swiss eHealth strategy currently being elaborated by the Swiss Confederation, MediCoordination particularly targets medium-sized hospitals and external partners to implement concrete, added-value scenarios of information exchange between hospitals and external medical actors.

  9. An Interoperability Consideration in Selecting Domain Parameters for Elliptic Curve Cryptography

    Science.gov (United States)

    Ivancic, Will (Technical Monitor); Eddy, Wesley M.

    2005-01-01

    Elliptic curve cryptography (ECC) will be an important technology for electronic privacy and authentication in the near future. There are many published specifications for elliptic curve cryptosystems, most of which contain detailed descriptions of the process for the selection of domain parameters. Selecting strong domain parameters ensures that the cryptosystem is robust to attacks. Due to a limitation in several published algorithms for doubling points on elliptic curves, some ECC implementations may produce incorrect, inconsistent, and incompatible results if domain parameters are not carefully chosen under a criterion that we describe. Few documents specify the addition or doubling of points in such a manner as to avoid this problematic situation. The safety criterion we present is not listed in any ECC specification we are aware of, although several other guidelines for domain selection are discussed in the literature. We provide a simple example of how a set of domain parameters not meeting this criterion can produce catastrophic results, and outline a simple means of testing curve parameters for interoperable safety over doubling.
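
    The hazard is easy to see in the doubling formula itself: the tangent slope divides by 2y, so a point with y = 0 doubles to the point at infinity, a case a naive implementation can silently get wrong. A minimal sketch on a toy (non-cryptographic) curve:

        # Minimal sketch of affine point doubling on a toy curve
        # y^2 = x^3 + A*x + B over F_P, showing the special case the text
        # alludes to: a point with y = 0 doubles to the point at infinity,
        # which naive formulas (dividing by 2y) mishandle. Toy parameters only.
        P, A, B = 23, 1, 0          # curve y^2 = x^3 + x over F_23 (toy example)
        INFINITY = None             # representation of the point at infinity

        def double(pt):
            if pt is INFINITY:
                return INFINITY
            x, y = pt
            if y == 0:              # vertical tangent: 2*pt is the point at infinity
                return INFINITY
            lam = (3 * x * x + A) * pow(2 * y, -1, P) % P  # tangent slope
            x3 = (lam * lam - 2 * x) % P
            y3 = (lam * (x - x3) - y) % P
            return (x3, y3)

        p1 = (1, 5)                 # an ordinary point on the toy curve
        p2 = double(p1)             # -> (0, 0), a point of order 2 (y == 0)
        print(p2, double(p2))       # doubling (0, 0) correctly yields infinity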

  10. Software design and implementation concepts for an interoperable medical communication framework.

    Science.gov (United States)

    Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank

    2018-02-23

    The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their independence of vendors. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.

  11. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities

    Directory of Open Access Journals (Sweden)

    Jorge Lanza

    2016-06-01

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  12. Beyond accuracy: creating interoperable and scalable text-mining web services.

    Science.gov (United States)

    Wei, Chih-Hsuan; Leaman, Robert; Lu, Zhiyong

    2016-06-15

    The biomedical literature is a knowledge-rich resource and an important foundation for future research. With over 24 million articles in PubMed and an increasing growth rate, research in automated text processing is becoming increasingly important. We report here our recently developed web-based text mining services for biomedical concept recognition and normalization. Unlike most text-mining software tools, our web services integrate several state-of-the-art entity tagging systems (DNorm, GNormPlus, SR4GN, tmChem and tmVar) and offer a batch-processing mode able to process arbitrary text input (e.g. scholarly publications, patents and medical records) in multiple formats (e.g. BioC). We support multiple standards to make our service interoperable and to allow simpler integration with other text-processing pipelines. To maximize scalability, we have preprocessed all PubMed articles, and we use a computer cluster for processing large requests of arbitrary text. Our text-mining web service is freely available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/#curl. Contact: Zhiyong.Lu@nih.gov. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.

  13. A Proof-of-Concept for Semantically Interoperable Federation of IoT Experimentation Facilities.

    Science.gov (United States)

    Lanza, Jorge; Sanchez, Luis; Gomez, David; Elsaleh, Tarek; Steinke, Ronald; Cirillo, Flavio

    2016-06-29

    The Internet-of-Things (IoT) is unanimously identified as one of the main pillars of future smart scenarios. The potential of IoT technologies and deployments has been already demonstrated in a number of different application areas, including transport, energy, safety and healthcare. However, despite the growing number of IoT deployments, the majority of IoT applications tend to be self-contained, thereby forming application silos. A lightweight data centric integration and combination of these silos presents several challenges that still need to be addressed. Indeed, the ability to combine and synthesize data streams and services from diverse IoT platforms and testbeds, holds the promise to increase the potentiality of smart applications in terms of size, scope and targeted business context. In this article, a proof-of-concept implementation that federates two different IoT experimentation facilities by means of semantic-based technologies will be described. The specification and design of the implemented system and information models will be described together with the practical details of the developments carried out and its integration with the existing IoT platforms supporting the aforementioned testbeds. Overall, the system described in this paper demonstrates that it is possible to open new horizons in the development of IoT applications and experiments at a global scale, that transcend the (silo) boundaries of individual deployments, based on the semantic interconnection and interoperability of diverse IoT platforms and testbeds.

  14. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
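
    Of the four protocols, the OGC SensorThings API is the most web-native: resources are plain JSON documents under well-known paths. A sketch of the request pattern follows, with a placeholder device address.

        # Hedged sketch of the RESTful pattern used by the OGC SensorThings
        # API, one of the four standards evaluated above. The host below is a
        # placeholder device/server address.
        import json
        import urllib.request

        base = "http://example.org:8080/v1.0"  # placeholder SensorThings endpoint
        with urllib.request.urlopen(f"{base}/Things") as resp:
            things = json.load(resp)

        # SensorThings wraps collections in a "value" array:
        for thing in things.get("value", []):
            print(thing.get("name"), "->", thing.get("description"))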

  15. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface (API) for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  16. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.

  17. Development of a gateway for interoperability in community-based care: An empirical study.

    Science.gov (United States)

    Ota, Sakiko; Kudo, Ken-Ichi; Taguchi, Kenta; Ihori, Mikio; Yoshie, Satoru; Yamamoto, Takuma; Sudoh, Osamu; Tsuji, Tetsuo; Iijima, Katsuya

    2018-01-01

    Information and communications technology has attracted attention as a useful way of sharing care records in community-based care. Such information sharing systems, however, have imposed the burden of inputting the same records into different information systems due to a lack of interoperability between the systems. The purpose of this study was to develop a gateway that links information systems and to investigate the functionality and usability of the gateway through an empirical study. We developed a gateway with healthcare and welfare professionals in Kashiwa city, Japan. The gateway system consisted of two sub-systems: a data exchange sub-system and a common sub-system. For security, we used Transport Layer Security (TLS) 1.2 and a public key infrastructure. For document formats, we utilized Health Level Seven International (HL7) standards, Extensible Markup Language (XML), and Portable Document Format (PDF). In addition, we performed an empirical study with 11 scenarios involving four simulated patients, together with a questionnaire survey of the professionals. Professionals from eight occupations participated in the empirical study and verified that the gateway could link information systems from six vendors. In the questionnaire survey, 32 of 40 professionals reported that the gateway would eliminate the burden of inputting the same records into different information systems.

  18. Managing the Interoperability and Privacy of e-Health Systems as an Interdisciplinary Challenge

    Directory of Open Access Journals (Sweden)

    Alexandru Soceanu

    2016-10-01

    Full Text Available The growing number of patients with chronic diseases, the ageing population worldwide, the rapid increase in hospital costs and in the cost of care personnel, as well as the medical objective to "increase patient quality of life and survival", confront Europe with a huge challenge. One of the solutions for meeting these challenges in the future is the deployment of complex eHealth systems that support all aspects of healthcare on the path between the patient's home and the healthcare provider. In the last decade the European Commission (EC), in cooperation with healthcare associations and standardization institutes, announced large frameworks for supporting research and development of various components of future eHealth systems. This may be considered an immediate interdisciplinary opportunity for European researchers and developers to jointly create the backbone of future healthcare systems. After a short introduction to eHealth architecture, interoperability, security, and privacy, the talk presents the interdisciplinary solutions that address this huge overall healthcare challenge. Two case studies will be addressed: (a) an interdisciplinary partnership for jointly conducting European research on the remote control and management of future wearable dialysis devices, and (b) ERASMUS-supported international education programs for creating future interdisciplinary expert networks working on developing and implementing a better healthcare system.

  19. Embracing the Importance of FAIR Research Products - Findable, Accessible, Interoperable, and Reusable

    Science.gov (United States)

    Stall, S.

    2017-12-01

    Integrity and transparency in research are solidified by a complete set of research products that are findable, accessible, interoperable, and reusable; in other words, products that follow the FAIR Guidelines developed by FORCE11.org. Your datasets, images, video, software, scripts, models, physical samples, and other tools and technology are an integral part of the narrative you tell about your research. These research products are increasingly being captured through workflow tools and preserved and connected through persistent identifiers across multiple repositories that keep them safe. Together with your publications, they help secure the supporting evidence and integrity of the scientific record. This is the direction in which Earth and space science, as well as other disciplines, is moving. Within our community, some science domains are further along, and others are taking more measured steps. AGU as a publisher is working to support the full scientific record with peer-reviewed publications. Working with our community and all the Earth and space science journals, AGU is developing new policies that encourage researchers to plan for proper data preservation, to provide data citations along with their research submissions, and to adopt best practices throughout the research workflow and data life cycle. Providing incentives, community standards, and easy-to-use tools are some important factors for helping researchers embrace the FAIR Guidelines and support transparency and integrity.

  20. IEEE 1547 and 2030 Standards for Distributed Energy Resources Interconnection and Interoperability with the Electricity Grid

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.

    2014-12-01

    Public-private partnerships have been a mainstay of the U.S. Department of Energy and National Renewable Energy Laboratory (DOE/NREL) approach to research and development. These partnerships also include technology development that enables grid modernization and distributed energy resources (DER) advancement, especially the integration of renewable energy systems with the grid. Through DOE/NREL and industry support of Institute of Electrical and Electronics Engineers (IEEE) standards development, the IEEE 1547 series of standards has helped shape the way utilities and other businesses have worked together to realize increasing amounts of DER interconnected with the distribution grid. More recently, the IEEE 2030 series of standards is helping to further realize greater implementation of communications and information technologies that provide interoperability solutions for enhanced integration of DER and loads with the grid. In these standards development partnerships, industry has contributed approximately $5 for every $1 of federal funding. In this report, a status update is presented for the American National Standards IEEE 1547 and IEEE 2030 series of standards. A short synopsis of the history of the 1547 standards is presented first, followed by a discussion of the current status and future direction of the ongoing standards development activities.

  1. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    Science.gov (United States)

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question of the usability of current methods when different sensors are employed for enrollment and verification; this is the fingerprint sensor interoperability problem. To provide insight into this problem and assess the ability of state-of-the-art matching methods to tackle it, we first analyze the characteristics of fingerprints captured with different sensors, which make cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  2. The Building Blocks of Interoperability. A Multisite Analysis of Patient Demographic Attributes Available for Matching.

    Science.gov (United States)

    Culbertson, Adam; Goel, Satyender; Madden, Margaret B; Safaeinili, Niloufar; Jackson, Kathryn L; Carton, Thomas; Waitman, Russ; Liu, Mei; Krishnamurthy, Ashok; Hall, Lauren; Cappella, Nickie; Visweswaran, Shyam; Becich, Michael J; Applegate, Reuben; Bernstam, Elmer; Rothman, Russell; Matheny, Michael; Lipori, Gloria; Bian, Jiang; Hogan, William; Bell, Douglas; Martin, Andrew; Grannis, Shaun; Klann, Jeff; Sutphen, Rebecca; O'Hara, Amy B; Kho, Abel

    2017-04-05

    Patient matching is a key barrier to achieving interoperability. Patient demographic elements must be consistently collected over time and across regions to be valuable for patient matching. We sought to determine which patient demographic attributes are collected at multiple institutions in the United States and how their availability changes over time and across clinical sites. We compiled a list of 36 demographic elements that stakeholders previously identified as essential patient demographic attributes that should be collected for the purpose of linking patient records. We studied a convenience sample of 9 health care systems from geographically distinct sites around the country. We identified changes in the availability of individual patient demographic attributes over time and across clinical sites. Several attributes were consistently available over the study period (2005-2014), including last name (99.96%), first name (99.95%), date of birth (98.82%), gender/sex (99.73%), postal code (94.71%), and full street address (94.65%). Other attributes changed significantly from 2005 to 2014: Social security number (SSN) availability declined from 83.3% to 50.44%, while phone number availability increased from 20.61% to 52.33%; phone numbers are becoming more available while SSN use is declining. Understanding the relative availability of patient attributes can inform strategies for optimal matching in healthcare.
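
    The availability figures above are, in essence, per-attribute non-null rates grouped by year. A hedged sketch of that computation follows; the column names and the two toy records are hypothetical, not the study's data.

```python
# Hedged sketch: per-attribute availability (% non-null) by year.
import pandas as pd

records = pd.DataFrame(
    [
        {"year": 2005, "last_name": "Doe", "ssn": "123-45-6789", "phone": None},
        {"year": 2014, "last_name": "Roe", "ssn": None, "phone": "555-0100"},
    ]
)

availability = (
    records.drop(columns="year")
    .notna()                        # True where the attribute was recorded
    .groupby(records["year"])
    .mean()                         # fraction of records with the attribute present
    .mul(100)
    .round(2)
)
print(availability)                 # % availability of each attribute per year
```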

  3. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    Science.gov (United States)

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform developed to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governance aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l'Ambiente dell'Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs this system. The presentation will introduce and discuss the technological barriers to interoperability as well as the social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  4. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): on-the-fly type mapping and out-of-band communication. Type conversion takes place for both data and metadata because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful capability, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access concerns integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C, and Java libraries that provide functions for newly written software; the netCDF library, which enables existing applications to read from servers using an older interface; and simple
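
    The constraint-expression technique described above can be sketched with the pydap client: slicing a remote variable is translated into an OPeNDAP constraint expression, so only the requested hyperslab crosses the network. The server URL and variable name below are hypothetical.

```python
# Hedged sketch: server-side subsetting via an OPeNDAP constraint expression.
from pydap.client import open_url

# Lazy handle on the remote dataset; no data is transferred yet.
dataset = open_url("http://server.example.org/opendap/sst.nc")

# Slicing the remote variable is turned into a constraint expression
# (roughly ?sst[0:1:9][0:1:9]), so only this hyperslab is downloaded.
sst = dataset["sst"][0:10, 0:10]
print(sst.shape)
```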

  5. Leveraging Emerging Standards to Advance Data Interoperability in the Marine Geosciences

    Science.gov (United States)

    Arko, R. A.; Fishman, A. V.

    2005-12-01

    Data interoperability in the marine geosciences has long been hampered by the heterogeneity of our data sets (i.e. the large number and variety of expeditions, platforms, instruments, data types, etc); the corresponding lack of metadata standardization; and a tendency to focus on graphical user interfaces (because geoscience data is highly visual in nature) rather than programmatic interfaces. The Marine Geoscience Data Management System (mgDMS; www.marine-geo.org) is an umbrella project based at Lamont-Doherty Earth Observatory that is building data repositories and services for the NSF-funded Ridge2000, MARGINS, and U.S. Antarctic Programs. mgDMS is partnered with several closely-related NSF projects including the Ocean Floor Petrology Database (PetDB), Marine Seismic Data Center (SDC), Sediment Geochemistry Database (SedDB), and others -- all of which include international collaborators and data sets -- and thus provides an excellent testbed to develop interoperability. Toward that end, we are implementing metadata standards and programmatic interfaces to facilitate the discovery and exchange of well-documented data sets. ISO 19115 (published in May 2003 and adopted by ANSI in December 2003) is emerging as an international standard for geoscience metadata, and has been adopted by national standards bodies and agencies in the U.S. (FGDC), E.U., Japan, and others. ISO 19115 defines a comprehensive set of elements for both "discovery" (search) and "markup" (use) metadata, and is easily extensible. We have developed a metadata profile for mgDMS which implements the mandatory elements of 19115, and extends it to accommodate the unique aspects of marine geoscience expedition-based data sets. We have implemented the profile as a lightweight REST-type Web service based on a W3C XML schema and associated XSL stylesheet. Closely related to the development of metadata standards is the development of controlled vocabularies to describe platforms, instruments, etc. The

  6. Engaging a community towards marine cyberinfrastructure: Lessons Learned from The Marine Metadata Interoperability initiative

    Science.gov (United States)

    Galbraith, N. R.; Graybeal, J.; Bermudez, L. E.; Wright, D.

    2005-12-01

    The Marine Metadata Interoperability (MMI) initiative promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. The project, operating since late 2004, presents several cultural and organizational challenges because of the diversity of participants: scientists, technical experts, and data managers from around the world, all working in organizations with different corporate cultures, funding structures, and systems of decision-making. MMI provides educational resources at several levels. For instance, short introductions to metadata concepts are available, as well as guides and "cookbooks" for the quick and efficient preparation of marine metadata. For those who are building major marine data systems, including ocean-observing capabilities, there are training materials, marine metadata content examples, and resources for mapping elements between different metadata standards. The MMI also provides examples of good metadata practices in existing data systems, including the EU's Marine XML project, and functioning ocean/coastal clearinghouses and atlases developed by MMI team members. Communication tools that help build community: 1) Website, used to introduce the initiative to new visitors, and to provide in-depth guidance and resources to members and visitors. The site is built using Plone, an open source web content management system. Plone allows the site to serve as a wiki, to which every user can contribute material. This keeps the membership engaged and spreads the responsibility for the tasks of updating and expanding the site. 2) Email lists, to engage the broad ocean sciences community. The discussion forums "news," "ask," and "site-help" are available for receiving regular updates on MMI activities, seeking advice or support on projects and standards, or for assistance with using the MMI site. Internal email lists are provided for the Technical Team, the Steering Committee and

  7. Best Practices for International Collaboration and Applications of Interoperability within a NASA Data Center

    Science.gov (United States)

    Moroni, D. F.; Armstrong, E. M.; Tauer, E.; Hausman, J.; Huang, T.; Thompson, C. K.; Chung, N.

    2013-12-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) is one of 12 data centers sponsored by NASA's Earth Science Data and Information System (ESDIS) project. The PO.DAAC is tasked with the archival and distribution of data from NASA Earth science missions specific to physical oceanography, many of which have interdisciplinary applications for weather forecasting/monitoring, ocean biology, ocean modeling, and climate studies. PO.DAAC has a 20-year history of cross-project and international collaborations with partners in Europe, Japan, Australia, and the UK. Domestically, the PO.DAAC has successfully established lasting partnerships with non-NASA institutions and projects including the National Oceanic and Atmospheric Administration (NOAA), the United States Navy, Remote Sensing Systems, and Unidata. A key component of these partnerships is PO.DAAC's direct involvement with international working groups and science teams, such as the Group for High Resolution Sea Surface Temperature (GHRSST), the International Ocean Vector Winds Science Team (IOVWST), the Ocean Surface Topography Science Team (OSTST), and the Committee on Earth Observation Satellites (CEOS). To help bolster new and existing collaborations, the PO.DAAC has established a standardized approach to its internal Data Management and Archiving System (DMAS), utilizing a Data Dictionary to provide the baseline standard for the entry and capture of dataset and granule metadata. Furthermore, the PO.DAAC has established an end-to-end Dataset Lifecycle Policy, built upon both internal and external recommendations of best practices toward data stewardship. Together, DMAS, the Data Dictionary, and the Dataset Lifecycle Policy provide the infrastructure that enables standardized data and metadata to be fully ingested and harvested to facilitate interoperability and compatibility across data access protocols, tools, and services. The Dataset Lifecycle Policy provides the checks and balances to help ensure all incoming HDF and net

  8. Enabling Interoperability and Servicing Multiple User Segments Through Web Services, Standards, and Data Tools

    Science.gov (United States)

    Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet

    2010-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis science groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to develop all of the tools and interfaces needed to support even most of the potential uses of the data directly. As is typical of information technology supporting a research enterprise, user needs will continue to evolve rapidly over time, and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs through the targeted implementation of web services and tools which can be consumed by other applications, so that a modeler can retrieve data in netCDF format with the Climate and Forecast (CF) convention while a field ecologist can retrieve subsets of that same data in a comma separated value format, suitable for use in Excel or R. Tools such as our MODIS Subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field with a very large number of relevant data repositories. ORNL DAAC
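
    The dual netCDF/CSV delivery described above can be sketched in a few lines: the same CF-style variable is written once as netCDF for modelers and once as a flattened CSV for spreadsheet or R users. The file and variable names below are invented; this is not the DAAC's actual service code.

```python
# Hedged sketch: one dataset, two delivery formats for two user communities.
import xarray as xr

ds = xr.open_dataset("lba_fluxes.nc")            # hypothetical CF-compliant file
subset = ds["latent_heat"].sel(time=slice("2003-01-01", "2003-12-31"))

subset.to_netcdf("latent_heat_2003.nc")               # modelers: netCDF/CF
subset.to_dataframe().to_csv("latent_heat_2003.csv")  # ecologists: CSV for Excel or R
```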

  9. Beyond Open Data: the importance of data standards and interoperability - Experiences from ECMWF's Open Data Week

    Science.gov (United States)

    Wagemann, Julia; Siemen, Stephan

    2017-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) has been providing an increasing amount of data to the public. The most widely used datasets include the global climate reanalyses (e.g. ERA-Interim) and atmospheric composition data, which are available to the public free of charge. The centre further operates, on behalf of the European Commission, two Copernicus Services, the Copernicus Atmosphere Monitoring Service (CAMS) and the Climate Change Service (C3S), which make up-to-date environmental information freely available to scientists, policy makers, and businesses. However, to fully benefit from open data, large environmental datasets also have to be easily accessible in a standardised, machine-readable format. Traditional data centres, such as ECMWF, currently face challenges in providing interoperable, standardised access to increasingly large and complex datasets for scientists and industry. ECMWF therefore put open data in the spotlight during a week of events in March 2017, exploring the potential of freely available weather- and climate-related data and reviewing technological solutions for serving these data. Key events included a Workshop on Meteorological Operational Systems (MOS) and a two-day hackathon. The MOS workshop aimed at reviewing technologies and practices to ensure efficient (open) data processing and provision. The hackathon focused on exploring creative uses of open environmental data and on seeing how open data can benefit various industries. The presentation reviews the outcomes and conclusions of the Open Data Week at ECMWF, with a specific focus on the importance of data standards and web services in making open environmental data a success. Overall, the presentation examines the opportunities and challenges of open environmental data from a data provider's perspective.

  10. OBIS-USA: Enhancing Ocean Science Outcomes through Data Interoperability and Usability

    Science.gov (United States)

    Goldstein, P.; Fornwall, M.

    2014-12-01

    Commercial and industrial information systems have long been built upon, and relied upon, standard data formats and transactions. Business processes, analytics, applications, and social networks emerge on top of these standards to create value. Examples of the value delivered include operational productivity, analytics that enable growth and profit, and enhanced human communication and creativity for innovation. In science informatics, some research and operational activities operate with only scattered adoption of standards and few of the emergent benefits of interoperability. In-situ biological data management in the marine domain is an exemplar. From the origination of biological occurrence records in surveys, observer programs, monitoring, and experimentation, through distribution techniques, to applications, decisions, and management response, marine biological data can be difficult, limited, and costly to integrate because of non-standard and undocumented conditions in the data. While this presentation identifies deficits in marine biological data practices, it also identifies this as a field of opportunity. Standards for biological data and metadata do exist, with growing global adoption and extensibility features. Scientific, economic, and social-value motivations provide incentives to maximize marine science investments. Diverse science communities of national and international scale are beginning to see the benefits of collaborative technologies. OBIS-USA (http://USGS.gov/obis-usa) is a program of the United States Geological Survey. This presentation shows how OBIS-USA directly addresses the opportunity to enhance ocean science outcomes through data infrastructure, including: (1) achieving rapid, economical, and high-quality data capture and data flow, (2) offering technology for data storage and methods for data discovery and quality/suitability evaluation, (3) making data understandable and consistent for application purposes, (4) distributing and integrating data in
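
    One concrete example of the existing biological data standards the abstract points to is Darwin Core, the vocabulary used for occurrence records in OBIS. The sketch below writes a single occurrence record, with invented values, using standard Darwin Core term names.

```python
# Hedged illustration: one marine occurrence record expressed with Darwin Core terms.
import csv

occurrence = {
    "occurrenceID": "urn:example:obis:occ:0001",   # invented identifier
    "scientificName": "Gadus morhua",
    "eventDate": "2013-07-14",
    "decimalLatitude": 42.35,
    "decimalLongitude": -70.88,
    "basisOfRecord": "HumanObservation",
}

with open("occurrence.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=occurrence.keys())
    writer.writeheader()
    writer.writerow(occurrence)
```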

  11. XACML profile and implementation for authorization interoperability between OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, G.; Alderman, I.; Altunay, M.; Ananthakrishnan, R.; Bester, J.; Chadwick, K.; Ciaschini, V.; Demchenko, Y.; Ferraro, A.; Forti, A.; Groep, D.; /Fermilab /Wisconsin U., Madison /Argonne /INFN, CNAF /Amsterdam U. /NIKHEF, Amsterdam /Brookhaven /SWITCH, Zurich /Bergen Coll. Higher Educ.

    2009-05-01

    The Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) have a common security model, based on Public Key Infrastructure. Grid resources grant access to users because of their membership in a Virtual Organization (VO), rather than on personal identity. Users push VO membership information to resources in the form of identity attributes, thus declaring that resources will be consumed on behalf of a specific group inside the organizational structure of the VO. Resources contact an access policies repository, centralized at each site, to grant the appropriate privileges for that VO group. Before the work in this paper, despite the commonality of the model, OSG and EGEE used different protocols for the communication between resources and the policy repositories. Hence, middleware developed for one Grid could not naturally be deployed on the other Grid, since the authorization module of the middleware would have to be enhanced to support the other Grid's communication protocol. In addition, maintenance and support for different authorization call-out protocols represents a duplication of effort for our relatively small community. To address these issues, OSG and EGEE initiated a joint project on authorization interoperability. The project defined a common communication protocol and attribute identity profile for authorization call-out and provided implementation and integration with major Grid middleware. The activity had resonance with middleware development communities, such as the Globus Toolkit and Condor, who decided to join the collaboration and contribute requirements and software. In this paper, we discuss the main elements of the profile, its implementation, and deployment in EGEE and OSG. We focus in particular on the operations of the authorization infrastructures of both Grids.
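
    The call-out pattern the profile standardizes can be sketched, in structure only, as a request carrying the user's identity and VOMS-style VO attributes, and a response carrying a decision plus account-mapping obligations. The sketch below is illustrative Python, not the XACML wire format, and every value in it is invented.

```python
# Hedged sketch of the authorization call-out pattern: a resource asks a
# site-central policy decision point (PDP) whether a VO member may act.
authz_request = {
    "subject": {
        "subject-id": "/DC=org/DC=example/CN=Jane Grid",
        "vo": "cms",
        "fqan": "/cms/production/Role=pilot",   # VOMS-style group/role attribute
    },
    "resource": {"resource-id": "ce.site.example.org"},
    "action": {"action-id": "submit-job"},
}

def decide(request: dict) -> dict:
    # Stand-in for the site's PDP: permit members of the VO, with obligations
    # that tell the resource which local account to map the user onto.
    if request["subject"]["fqan"].startswith("/cms/"):
        return {"decision": "Permit",
                "obligations": {"map-to-uid": 40123, "map-to-gid": 4000}}
    return {"decision": "Deny", "obligations": {}}

print(decide(authz_request))
```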

  12. Distributed data discovery, access and visualization services to Improve Data Interoperability across different data holdings

    Science.gov (United States)

    Palanisamy, G.; Krassovski, M.; Devarakonda, R.; Santhana Vannan, S.

    2012-12-01

    The current climate debate is highlighting the importance of free, open, and authoritative sources of high quality climate data that are available for peer review and for collaborative purposes. It is increasingly important to allow various organizations around the world to share climate data in an open manner, and to enable them to perform dynamic processing of climate data. This advanced access to data can be enabled via Web-based services, using common "community agreed" standards, without requiring organizations to change the internal structures they use to describe the data. The modern scientific community has become diverse and increasingly complex in nature. To meet the demands of such a diverse user community, the modern data supplier has to provide data and other related information through searchable, data- and process-oriented tools. This can be accomplished by setting up an on-line, Web-based system with a relational database as a back end. The following common features of web data access/search systems will be outlined in the proposed presentation: - flexible data discovery - data in commonly used formats (e.g., CSV, NetCDF) - metadata prepared in standard formats (FGDC, ISO 19115, EML, DIF, etc.) - data subsetting capabilities and the ability to narrow down to individual data elements - standards-based data access protocols and mechanisms (SOAP, REST, OPeNDAP, OGC, etc.) - integration of services across different data systems (discovery to access, visualization, and subsetting) This presentation will also include specific examples of the integration of various data systems developed by Oak Ridge National Laboratory's Climate Change Science Institute and their ability to communicate with each other to enable better data interoperability and data integration.

  13. Design and study of geosciences data share platform: platform framework, data interoperability, share approach

    Science.gov (United States)

    Lu, H.; Yi, D.

    2010-12-01

    Deep exploration is one of the important approaches to geoscience research. We started such work in the 1980s and have since accumulated a large amount of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and they analyze and interpret on the basis of the integrated data. Because of the different exploration approaches, the data are heterogeneous, and data acquisition has therefore always been an important and confusing issue for researchers. The problem of data sharing and interaction had to be solved during the development of the SinoProbe research project. Through a study of well-known domestic and overseas exploration projects and geoscience data platforms, this work explores solutions for data sharing and interaction. Based on a service-oriented architecture (SOA), we present a deep exploration data sharing framework comprising three levels: the data level addresses data storage and the integration of heterogeneous data; the middle level provides data services for geophysics, geochemistry, etc. by means of Web services, and supports application composition using GIS middleware and the Eclipse RCP; the interaction level provides professional and non-professional users with access to data of different accuracies. The framework adopts the GeoSciML data interaction approach. GeoSciML is a geoscience information markup language, built as an application of the OpenGIS Consortium's (OGC) Geography Markup Language (GML). It transforms heterogeneous data into one Earth frame and implements interoperation. In this article we discuss how the heterogeneous data are integrated and shared in the SinoProbe project.

  14. Investigating interoperability of the LSST data management software stack with Astropy

    Science.gov (United States)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements both to issue alerts on transient sources within 60 seconds of observing and to create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
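
    The kind of interoperability at stake can be illustrated with a small catalog expressed in Astropy's core objects, the common currency a pipeline and its users would exchange. The values below are invented.

```python
# Hedged sketch: a tiny source catalog expressed with Astropy tables,
# units, and coordinates, then written to a standard FITS file.
import numpy as np
from astropy import units as u
from astropy.coordinates import SkyCoord
from astropy.table import Table

# Quantities with explicit units (values are made up).
ra = np.array([150.1, 150.2]) * u.deg
dec = np.array([2.1, 2.2]) * u.deg
flux = np.array([1.2e-3, 3.4e-3]) * u.Jy

catalog = Table([ra, dec, flux], names=("ra", "dec", "flux"))
coords = SkyCoord(ra=ra, dec=dec)
print(coords.to_string("hmsdms"))               # sexagesimal positions
catalog.write("catalog.fits", overwrite=True)   # standard FITS interchange
```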

  15. The National Opportunity for Interoperability and its Benefits for a Reliable, Robust, and Future Grid Realized Through Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Office of Energy Efficiency and Renewable Energy

    2016-02-01

    Today, increasing numbers of intermittent generation sources (e.g., wind and photovoltaic) and new mobile intermittent loads (e.g., electric vehicles) can significantly affect traditional utility business practices and operations. At the same time, a growing number of technologies and devices, from appliances to lighting systems, are being deployed at consumer premises that have more sophisticated controls and information that remain underused for anything beyond basic building equipment operations. The intersection of these two drivers is an untapped opportunity and underused resource that, if appropriately configured and realized in open standards, can provide significant energy efficiency and commensurate savings on utility bills, enhanced and lower cost reliability to utilities, and national economic benefits in the creation of new markets, sectors, and businesses being fueled by the seamless coordination of energy and information through device and technology interoperability. Or, as the Quadrennial Energy Review puts it, “A plethora of both consumer-level and grid-level devices are either in the market, under development, or at the conceptual stage. When tied together through the information technology that is increasingly being deployed on electric utilities’ distribution grids, they can be an important enabling part of the emerging grid of the future. However, what is missing is the ability for all of these devices to coordinate and communicate their operations with the grid, and among themselves, in a common language — an open standard.” In this paper, we define interoperability as the ability to exchange actionable information between two or more systems within a home or building, or across and within organizational boundaries. Interoperability relies on the shared meaning of the exchanged information, with agreed-upon expectations and consequences, for the response to the information exchange.

  16. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    Science.gov (United States)

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow of a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulties in interoperability between information systems, and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, which has been able, over the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region and providing full access to clinical and health-related documents independently of the healthcare organization that generated the document itself. This goal, in a region with almost 10 million citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, the provision of a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of these regional interoperability specifications enabled communication among heterogeneous systems located in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles, which refer to HL7 standards, are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE Patient Administration Management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180
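
    At the message level, the hospital-to-region exchanges described above rest on pipe-delimited HL7 v2 segments. The sketch below builds and splits a heavily simplified, invented ADT admission message using only the standard library; real traffic would go through a proper HL7 engine.

```python
# Hedged sketch: a minimal, invented HL7 v2 ADT^A01 message parsed with str.split.
message = "\r".join([
    "MSH|^~\\&|HIS|HOSP_A|REGIONAL_HUB|LOMBARDY|20120801120000||ADT^A01|MSG0001|P|2.5",
    "PID|1||123456^^^HOSP_A||ROSSI^MARIO||19700101|M",
    "PV1|1|I|WARD1^101^A",
])

for segment in message.split("\r"):          # HL7 v2 segments are CR-separated
    fields = segment.split("|")              # fields are pipe-separated
    print(fields[0], "->", fields[1:4])
```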

  17. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    Science.gov (United States)

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians to problematic situations, and decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of development, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is represented by a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes for modelling the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the manner of a virtual health record (VHR), over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support

  18. Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

    Science.gov (United States)

    Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

    2009-12-01

    Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can query more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. *But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information?* We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures, and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms called local ontology. Human or machine
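
    The CSW plumbing mentioned above can be sketched with OWSLib: a client sends a GetRecords query against an atlas catalogue endpoint and iterates over the returned metadata records. The endpoint URL and search term below are placeholders, not the actual MIDA or OCA services.

```python
# Hedged sketch: querying a coastal web atlas catalogue via OGC CSW with OWSLib.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("http://atlas.example.org/csw")  # hypothetical endpoint

# Full-text style filter over the catalogue's AnyText queryable.
query = PropertyIsLike("csw:AnyText", "%estuary%")
csw.getrecords2(constraints=[query], maxrecords=10)

for ident, record in csw.records.items():
    print(ident, record.title)
```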

  19. A Fast Healthcare Interoperability Resources (FHIR) layer implemented over i2b2.

    Science.gov (United States)

    Boussadi, Abdelali; Zapletal, Eric

    2017-08-14

    Standards and technical specifications have been developed to define how the information contained in Electronic Health Records (EHRs) should be structured, semantically described, and communicated. Current trends rely on differentiating the representation of data instances from the definition of clinical information models. The dual-model approach, which combines a reference model (RM) and a clinical information model (CIM), puts this software design pattern into practice. The most recent initiative, proposed by HL7, is called Fast Healthcare Interoperability Resources (FHIR). The aim of our study was to investigate the feasibility of applying the FHIR standard to modeling and exposing EHR data of the Georges Pompidou European Hospital (HEGP) Informatics for Integrating Biology and the Bedside (i2b2) clinical data warehouse (CDW). We implemented a FHIR server over i2b2 to expose EHR data in relation to five FHIR resources: DiagnosticReport, MedicationOrder, Patient, Encounter, and Medication. The architecture of the server combines a Data Access Object design pattern and FHIR resource providers, implemented using the Java HAPI FHIR API. Two types of queries were tested: query type #1 requests the server to display DiagnosticReport resources for which the diagnosis code is equal to a given ICD-10 code. A total of 80 DiagnosticReport resources, corresponding to 36 patients, were displayed. Query type #2 requests the server to display MedicationOrder resources for which the FHIR Medication identification code is equal to a given code expressed in a French coding system. A total of 503 MedicationOrder resources, corresponding to 290 patients, were displayed. Results were validated by manually comparing the results of each request to the results displayed by an ad-hoc SQL query. We showed the feasibility of implementing a Java layer over the i2b2 database model to expose data of the CDW as a set of FHIR resources. An important part of this work was the structural and semantic mapping between the
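
    Query type #1 above corresponds to a plain FHIR REST search: DiagnosticReport filtered by a coded diagnosis. The sketch below issues such a search using the token system|code syntax; the base URL and ICD-10 code are placeholders, not the HEGP deployment.

```python
# Hedged sketch: FHIR REST search for DiagnosticReport by ICD-10 code.
import requests

BASE = "http://fhir.example.org/baseDstu2"   # hypothetical FHIR endpoint

bundle = requests.get(
    f"{BASE}/DiagnosticReport",
    params={"code": "http://hl7.org/fhir/sid/icd-10|I50.1",  # system|code token
            "_count": 50},
    headers={"Accept": "application/json+fhir"},
    timeout=30,
).json()

# A FHIR search returns a Bundle; each entry wraps one matching resource.
for entry in bundle.get("entry", []):
    report = entry["resource"]
    print(report["id"], report.get("subject", {}).get("reference"))
```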

  20. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    Science.gov (United States)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from this added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meters, accelerometers, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings, as well as on the exclusive use of Free and Open Source Software (FOSS), including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be reused for other plate observing systems (e.g., the European Plate

  1. GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures (Invited)

    Science.gov (United States)

    Nativi, S.; Santoro, M.

    2009-12-01

    Currently, one of the major challenges for the scientific community is the study of climate change effects on life on Earth. To achieve this, it is crucial to understand how climate change will impact biodiversity, and, in this context, several application scenarios require modeling the impact of climate change on the distribution of individual species. In the context of GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot, Phase 2), the Climate Change & Biodiversity thematic Working Group developed three significant user scenarios. Two of them make use of a GEOSS-based framework to study the impact of climate change factors on regional species distribution. The presentation introduces and discusses this framework, which provides an interoperability infrastructure to loosely couple standard services and components to discover and access climate and biodiversity data, and to run forecast and processing models. The framework comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find, and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and controls the workflow manager to perform the required operations for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e. a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components (exposing them through a unique CSW standard interface); federated components publish climate, environmental, and biodiversity datasets; e) Ecological Niche Model Server: this component is able to run one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access

  2. Exploring interoperability: The advancements and challenges of improving data discovery, access, and visualization of scientific data through the NOAA Earth Information System (NEIS). (Invited)

    Science.gov (United States)

    Stewart, J.; Lynge, J.; Hackathorn, E.; MacDermaid, C.; Pierce, R.; Smith, J.

    2013-12-01

    Interoperability is a complex subject and often leads to different definitions in different environments. An interoperable framework of web services can improve the user experience by providing an interface for interacting with data regardless of its format or physical location. This in itself improves accessibility to data, fosters data exploration and use, and provides a framework for new tools and applications. With an interoperable system you have: -- Data ready for action: the services model facilitates agile response to events; services can be combined or reused quickly, and upgraded or modified independently. -- Any data available through an interoperable framework can be operated on or combined with other data, integrating standardized formats and access. -- New and existing systems have access to a wide variety of data, and any new data added is easily incorporated with minimal changes required. The possibilities are limitless. The NOAA Earth Information System (NEIS) at the Earth System Research Laboratory (ESRL) is continuing research into an interoperable framework of layered services designed to facilitate the discovery, access, integration, visualization, and understanding of all NOAA (past, present, and future) data. An underlying philosophy of NEIS is to take advantage of existing off-the-shelf technologies and standards to minimize the development of custom code, allowing everyone to take advantage of the framework to meet the goals above. This framework, while built by NOAA, is not limited to NOAA data or applications. Any other data available through similar services, or applications that understand these standards, can work interchangeably. Two major challenges under active research at ESRL are data discoverability and fast access to big data. This presentation will provide an update on the development of NEIS, including these challenges, the findings, and recommendations on what is needed for an interoperable system, as well as ongoing research activities

  3. Interoperation of an UHF RFID Reader and a TCP/IP Device via Wired and Wireless Links

    Directory of Open Access Journals (Sweden)

    Ik Soo Jin

    2011-11-01

    Full Text Available A main application in radio frequency identification (RFID) sensor networks is the function that processes real-time tag information after gathering the required data from multiple RFID tags. The component technologies, comprising an RFID reader (called the interrogator) with a tag chip, processors, a coupling antenna, and a power management system, have advanced significantly over the last decade. This paper presents a system implementation for interoperation between a UHF RFID reader and a TCP/IP device that is used as a gateway. The proposed system consists of a UHF RFID tag, a UHF RFID reader, an RF end-device, an RF coordinator, and a TCP/IP interface. The UHF RFID reader, operating at 915 MHz, is compatible with EPC Class-0/Gen1, Class-1/Gen1 and 2, and ISO 18000-6B. In particular, the UHF RFID reader can be combined with the RF end-device/coordinator for a ZigBee (IEEE 802.15.4) interface, which is a low-power wireless standard. The TCP/IP device communicates with the RFID reader via wired links. On the other hand, it is connected to the ZigBee end-device via wireless links. The web-based test results show that the developed system can remotely recognize information from multiple tags through the interoperation between the RFID reader and the TCP/IP device.
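
    The gateway role described above amounts to framing tag reads from the reader side and forwarding them over a TCP connection. The standard-library sketch below shows that forwarding step only; the address and the one-line tag format are invented for illustration.

```python
# Hedged sketch: forwarding RFID tag reads to a TCP/IP host, one line per read.
import socket

TAG_SERVER = ("192.0.2.10", 5000)            # hypothetical TCP/IP device

def forward_tag_reads(tag_reads):
    """Send each EPC tag read to the TCP host as a newline-terminated record."""
    with socket.create_connection(TAG_SERVER, timeout=5) as conn:
        for epc in tag_reads:
            conn.sendall(f"TAG,{epc}\n".encode("ascii"))

# Reads as they might come from an EPC Class-1 Gen-2 reader (values made up).
forward_tag_reads(["300833B2DDD9014000000001", "300833B2DDD9014000000002"])
```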

  4. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    Science.gov (United States)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and real-time simulation plays a key role in the Space domain, being exploited for mission and system analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) can interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.

  5. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2010-08-01

    Full Text Available Abstract Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that avoid transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers in emerging areas where a standard exchange data format is not well established, to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. In consequence, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.

  6. EarthCube Cyberinfrastructure: The Importance of and Need for International Strategic Partnerships to Enhance Interconnectivity and Interoperability

    Science.gov (United States)

    Ramamurthy, M. K.; Lehnert, K.; Zanzerkia, E. E.

    2017-12-01

    The United States National Science Foundation's EarthCube program is a community-driven activity aimed at transforming the conduct of geoscience research and education by creating a well-connected cyberinfrastructure for sharing and integrating data and knowledge across all geoscience disciplines in an open, transparent, and inclusive manner, thereby accelerating our ability to understand and predict the Earth system. After five years of community engagement, governance, and development activities, EarthCube is now transitioning into an implementation phase. In the first phase of implementing the EarthCube architecture, the project leadership has identified as top priorities three architectural components, focused on technologies, interfaces, and interoperability elements: a) Resource Discovery; b) Resource Registry; and c) Resource Distribution and Access. Simultaneously, EarthCube is exploring international partnerships to leverage synergies with other e-infrastructure programs and projects in Europe, Australia, and other regions, and to pursue mutually beneficial collaborations that increase the interoperability of systems, advancing EarthCube's goals efficiently and effectively. In this session, we will present the progress of EarthCube on a number of fronts and engage geoscientists and data scientists in the next steps toward the development of EarthCube for advancing research and discovery in the geosciences. The talk will underscore the importance of strategic partnerships with similar eScience projects and programs across the globe.
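
    As a rough illustration of how the three prioritized components fit together, here is a minimal Python sketch of a resource registry supporting keyword discovery, with a distribution/access URL on each record; all names are assumptions, not EarthCube's actual design.

```python
# Illustrative sketch (not EarthCube's implementation) of the three
# components named above: a Resource Registry of typed records, Resource
# Discovery as a query over them, and an access URL per record for
# Resource Distribution and Access. All field names are invented.

from dataclasses import dataclass, field

@dataclass
class Resource:
    identifier: str
    title: str
    keywords: list = field(default_factory=list)
    access_url: str = ""  # distribution/access endpoint

class Registry:
    def __init__(self):
        self._records = {}

    def register(self, resource: Resource):
        self._records[resource.identifier] = resource

    def discover(self, keyword: str):
        return [r for r in self._records.values() if keyword in r.keywords]

registry = Registry()
registry.register(Resource("doi:10.0000/example", "Global seismic catalog",
                           ["seismology", "catalog"], "https://example.org/data"))
print([r.title for r in registry.discover("seismology")])
```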

  7. Prototype Interoperability Document between NASA-JSC and DLR-GSOC Describing the CCSDS SM and C Mission Operations Prototype

    Science.gov (United States)

    Lucord, Steve A.; Gully, Sylvain

    2009-01-01

    The purpose of the Prototype Interoperability Document is to record the design and interfaces of the service providers and consumers in a Mission Operations prototype between JSC-OTF and DLR-GSOC. The primary goal is to test the interoperability sections of the CCSDS Spacecraft Monitor & Control (SM&C) Mission Operations (MO) specifications between the two control centers. An additional goal is to provide feedback to the SM&C working group through the Review Item Disposition (RID) process. This prototype is considered a proof of concept and should increase the knowledge base of the CCSDS SM&C Mission Operations standards; no operational capabilities will be provided. The CCSDS Mission Operations (MO) initiative was previously called Spacecraft Monitor and Control (SM&C); the specifications were renamed to better reflect their scope and overall objectives. The working group retains the name Spacecraft Monitor and Control and sits under the Mission Operations and Information Management Services (MOIMS) Area of CCSDS. This document refers to the specifications as SM&C Mission Operations, Mission Operations, or simply MO.
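
    The provider/consumer interaction pattern that such a prototype exercises can be sketched in a few lines of Python; this mimics only the shape of an MO monitoring exchange between two control centres, not the actual MAL-based protocol the CCSDS specifications define.

```python
# Toy sketch of the provider/consumer split at the heart of the CCSDS MO
# framework: one control centre exposes a parameter-monitoring service and
# a peer centre consumes it. The real SM&C MO specifications define these
# interactions over the MAL (Message Abstraction Layer), not Python calls;
# parameter names and values below are invented.

class ParameterProvider:            # e.g. one control centre exposing telemetry
    def __init__(self):
        self._parameters = {"BATTERY_VOLTAGE": 28.1, "MODE": "NOMINAL"}
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def update(self, name, value):
        self._parameters[name] = value
        for notify in self._subscribers:
            notify(name, value)

class ParameterConsumer:            # e.g. the peer centre consuming the service
    def on_update(self, name, value):
        print(f"received {name} = {value}")

provider = ParameterProvider()
consumer = ParameterConsumer()
provider.subscribe(consumer.on_update)
provider.update("BATTERY_VOLTAGE", 27.9)   # the consumer is notified
```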

  8. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    Science.gov (United States)

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that avoid transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.
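
    The kind of uniform, REST-style access the hackathon promoted can be illustrated with DBCLS's TogoWS service; the endpoint pattern below is an assumption based on its documented REST style and should be checked against the current TogoWS documentation before use.

```python
# A minimal sketch of standardized, REST-style bioinformatics web service
# access. TogoWS is a real DBCLS service, but the exact endpoint pattern
# below is an assumption and may have changed.

import urllib.request

def fetch_entry(database: str, entry_id: str, fmt: str = "json") -> str:
    # Assumed TogoWS REST pattern: /entry/<database>/<id>.<format>
    url = f"http://togows.org/entry/{database}/{entry_id}.{fmt}"
    with urllib.request.urlopen(url, timeout=30) as response:
        return response.read().decode("utf-8")

# One uniform call shape regardless of which upstream database is queried,
# which is exactly the interoperability property the report argues for.
print(fetch_entry("uniprot", "P12345", "fasta")[:200])
```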

  9. An observational study of the relationship between meaningful use-based electronic health information exchange, interoperability, and medication reconciliation capabilities.

    Science.gov (United States)

    Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I

    2017-10-01

    Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, 15 of the 27 variables achieved loading values greater than 0.548 at significant P values and were retained as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, such that decreases in a hospital's adoption of one would lead to decreases in its adoption of the others. These results show that the capability for high-quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.
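
    The confirmatory filtering step can be approximated as follows; the authors used PLS-SEM, whereas this self-contained sketch substitutes first-principal-component loadings on synthetic data, so it demonstrates only the retain-if-loading-exceeds-0.548 logic, not their exact estimator.

```python
# Simplified stand-in for the confirmatory step described above: estimate
# how strongly each survey variable loads on its construct and keep those
# above the paper's 0.548 cutoff. The data here are synthetic.

import numpy as np

rng = np.random.default_rng(0)
n_hospitals, n_vars = 1330, 9          # one illustrative construct block
latent = rng.normal(size=(n_hospitals, 1))
X = latent @ rng.uniform(0.3, 0.9, size=(1, n_vars)) + rng.normal(
    scale=0.7, size=(n_hospitals, n_vars))

Xc = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[0]                    # first principal component scores
loadings = np.array([np.corrcoef(Xc[:, j], scores)[0, 1]
                     for j in range(n_vars)])

kept = np.where(np.abs(loadings) > 0.548)[0]
print("retained variables:", kept, "loadings:", loadings[kept].round(2))
```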

  10. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    Science.gov (United States)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    The mission of the EGI-Engage project [1] is to accelerate the implementation of the Open Science Commons vision, in which researchers from all disciplines have easy and open access to the innovative digital services, data, knowledge, and expertise they need for collaborative and excellent research. The Open Science Commons is grounded on three pillars: the e-Infrastructure Commons, an ecosystem of services that constitute the foundation layer of distributed infrastructures; the Open Data Commons, where observations, results, and applications are increasingly available for scientific research and for anyone to use and reuse; and the Knowledge Commons, in which communities have shared ownership of knowledge, participate in the co-development of software, and are technically supported to exploit state-of-the-art digital services. To develop the Knowledge Commons, EGI-Engage is supporting the work of a set of community-specific Competence Centres, with participants from user communities (scientific institutes), National Grid Initiatives (NGIs), and technology and service providers. Competence Centres collect and analyse requirements, integrate community-specific applications into state-of-the-art services, foster interoperability across e-Infrastructures, and evolve services through a user-centric development model. One of these Competence Centres focuses on the European Plate Observing System (EPOS) [2] as representative of the solid Earth science communities. EPOS is a pan-European long-term plan to integrate data, software, and services from distributed (and already existing) Research Infrastructures all over Europe in the domain of solid Earth science. EPOS will enable innovative multidisciplinary research for a better understanding of the Earth's physical and chemical processes that control earthquakes, volcanic eruptions, ground instability, and tsunamis, as well as the processes driving tectonics and the Earth's surface dynamics. EPOS will improve our ability to better

  11. Capitalizing on global demands for open data access and interoperability - the USGIN story

    Science.gov (United States)

    Richard, Stephen; Allison, Lee

    2016-04-01

    system is opening new exploration opportunities and shortening project development by making data easily discoverable, accessible, and interoperable at no cost to users. USGIN Foundation, Inc. was established in 2014 as a not-for-profit company to deploy the USGIN data integration framework for other natural resource (energy, water, and minerals), natural hazard, and geoscience investigation applications, nationally and worldwide. The USGIN vision is that as each data node adds to its data repositories, the system-wide USGIN functions become increasingly valuable to every node. The long-term goal is for the data network to reach a 'tipping point' at which it becomes the data equivalent of the World Wide Web, where everyone maintains the function because their clientele expect it and it fills critical needs.

  12. CINERGI: Community Inventory of EarthCube Resources for Geoscience Interoperability

    Science.gov (United States)

    Zaslavsky, Ilya; Bermudez, Luis; Grethe, Jeffrey; Gupta, Amarnath; Hsu, Leslie; Lehnert, Kerstin; Malik, Tanu; Richard, Stephen; Valentine, David; Whitenack, Thomas

    2014-05-01

    Organizing geoscience data resources to support cross-disciplinary data discovery, interpretation, analysis and integration is challenging because of different information models, semantic frameworks, metadata profiles, catalogs, and services used in different geoscience domains, not to mention different research paradigms and methodologies. The central goal of CINERGI, a new project supported by the US National Science Foundation through its EarthCube Building Blocks program, is to create a methodology and assemble a large inventory of high-quality information resources capable of supporting the data discovery needs of researchers in a wide range of geoscience domains. The key characteristics of the inventory are: 1) collaboration with and integration of metadata resources from a number of large data facilities; 2) reliance on international metadata and catalog service standards; 3) assessment of resource "interoperability-readiness"; 4) ability to cross-link and navigate data resources, projects, models, researcher directories, publications, usage information, etc.; 5) efficient inclusion of "long-tail" data, which do not appear in existing domain repositories; 6) data registration at feature level where appropriate, in addition to common dataset-level registration; and 7) integration with parallel EarthCube efforts, in particular those focused on EarthCube governance, information brokering, service-oriented architecture design, and management of semantic information. We discuss challenges associated with accomplishing CINERGI goals, including defining the inventory scope; managing different granularity levels of resource registration; interaction with search systems of domain repositories; explicating domain semantics; metadata brokering, harvesting and pruning; managing provenance of the harvested metadata; and cross-linking resources based on the linked open data (LOD) approaches. At the higher level of the inventory, we register domain-wide resources such as domain
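
    Standards-based catalog harvesting of the kind the inventory relies on might look like the following OWSLib sketch; OWSLib is a real library, but the CSW endpoint below is a placeholder and the calls should be verified against the installed OWSLib version.

```python
# Sketch of standards-based catalog harvesting for an inventory like
# CINERGI, using OWSLib's OGC CSW client. The endpoint URL is a
# placeholder; treat this as an outline rather than a tested recipe.

from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/csw")  # placeholder endpoint

# Query the catalog for records mentioning "hydrology".
query = PropertyIsLike("csw:AnyText", "%hydrology%")
csw.getrecords2(constraints=[query], maxrecords=10)

for identifier, record in csw.records.items():
    # Each harvested record can then be assessed for
    # "interoperability-readiness" and registered in the inventory.
    print(identifier, "|", record.title)
```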

  13. Design and Realization of Integrated Management System for Data Interoperability between Point-of-Care Testing Equipment and Hospital Information System.

    Science.gov (United States)

    Park, Ki Sang; Heo, Hyuk; Choi, Young Keun

    2013-09-01

    The purpose of this study was to design an integrated data management system, based on the POCT1-A2, LIS2-A, LIS2-A2, and HL7 standards, to ensure data interoperability between mobile equipment, such as point-of-care testing devices, and the existing hospital information system, and to evaluate its efficiency. We designed and realized a data management system that addresses the problems arising when point-of-care testing equipment is introduced into an existing hospital information system, after classifying those problems into connectivity, integration, and interoperability issues. We also checked whether the data management system serves adequately as a bridge between the point-of-care testing equipment and the hospital information system through connection persistence and reliability testing, as well as data integration and interoperability testing. In comparison with the existing system, the data management system facilitated integration by improving the result receiving time, improving the collection rate, and enabling the integration of disparate types of data into a single system; generating messages in standardized formats solved the problems related to connectivity, integration, and interoperability. It is expected that the proposed data management system, designed to improve the integration of point-of-care testing equipment with existing systems, will establish a solid foundation on which hospitals can provide better medical service by improving the quality of patient care.
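
    For a flavor of the standardized messages such a bridge emits, here is an illustrative HL7 v2 result message (ORU^R01) built in Python; the identifiers and field contents are invented, and a production system would follow the cited POCT1-A2/LIS2-A/HL7 profiles exactly.

```python
# Illustrative construction of an HL7 v2 ORU^R01 result message of the kind
# a POCT-to-HIS bridge would emit. All IDs and values are invented.

from datetime import datetime

def build_oru_r01(patient_id: str, test_code: str, value: str, units: str) -> str:
    ts = datetime(2013, 9, 1, 10, 30).strftime("%Y%m%d%H%M")
    segments = [
        # MSH: sending app/facility, receiving app/facility, timestamp, type
        f"MSH|^~\\&|POCT_GW|WARD3|HIS|HOSPITAL|{ts}||ORU^R01|MSG0001|P|2.5",
        f"PID|1||{patient_id}^^^HOSP^MR",
        "OBR|1|||POCT^Point of care panel",
        # OBX: numeric observation with units and final (F) status
        f"OBX|1|NM|{test_code}||{value}|{units}|||||F",
    ]
    return "\r".join(segments)  # HL7 v2 uses carriage-return segment breaks

message = build_oru_r01("123456", "GLU^Glucose", "5.4", "mmol/L")
print(message.replace("\r", "\n"))
```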

  14. Practical experience in deploying and controlling the data sharing interoperability layer at the U.K. Land Open Systems Architecture (LOSA) field trials in October 2012

    Science.gov (United States)

    Bergamaschi, Flavio; Conway-Jones, Dave; Pearson, Gavin

    2013-05-01

    In October 2012 the UK MoD sponsored a multi-vendor field integration trial in support of its Land Open Systems Architecture (LOSA), an open, service-based architecture for systems integration and interoperability that builds on the progress made with the Generic Vehicle Architecture (GVA, DefStan 23-09), Generic Base Architecture (GBA, DefStan 23-13), and Generic Soldier Architecture (DefStan 23-12) programs. The aim of the trial was to experiment with common data and power interoperability across, and in support of, the Soldier, Vehicle, and Base domains. This paper presents an overview of the field trial and discusses how the ITA Information Fabric, a technology originating in the US and UK International Technology Alliance program, was extended to support the control of the data interoperability layer across various data bearers. This included: (a) interoperability and information sharing across multiple stove-piped and legacy solutions; (b) command and control and bandwidth optimization of streamed data (e.g., video) over a peer-to-peer ad-hoc network across multiple domains; (c) integration of disparate sensor systems; and (d) integration with DDS-based C2 systems.
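
    The control point the trial needed, a broker that both bridges publishers and subscribers across domains and throttles high-bandwidth streams over constrained bearers, can be modeled with a toy broker like the one below; this is an invented illustration, not the ITA Information Fabric's API.

```python
# Toy model (not the ITA Information Fabric itself) of a pub/sub broker
# that bridges domains and lets a commander throttle streamed data, such
# as video, without touching the sensor itself.

class FabricBroker:
    def __init__(self):
        self.subscribers = {}   # topic -> list of delivery callbacks
        self.max_rate = {}      # topic -> max frames/sec allowed

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def set_rate_limit(self, topic, frames_per_second):
        # Command-and-control hook: e.g. reduce video rate over a
        # congested radio bearer.
        self.max_rate[topic] = frames_per_second

    def publish(self, topic, frame_index, payload):
        limit = self.max_rate.get(topic)
        if limit is not None and frame_index % (30 // limit) != 0:
            return  # drop frames beyond the allowed rate (30 fps source)
        for deliver in self.subscribers.get(topic, []):
            deliver(payload)

broker = FabricBroker()
broker.subscribe("base/camera1/video", lambda p: print("delivered", p))
broker.set_rate_limit("base/camera1/video", 5)   # pass 5 of every 30 frames
for i in range(6):
    broker.publish("base/camera1/video", i, f"frame-{i}")
```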

  15. Intra-operator and inter-operator reliability of manual and semiautomated measurement of fetal nuchal translucency : a cross sectional study

    NARCIS (Netherlands)

    Bakker, M.; Mulder, P.; Birnie, E.; Bilardo, C. M.

    2013-01-01

    Objective: The goal of this study was to examine the intra-operator and inter-operator differences of the manual and semiautomated nuchal translucency (NT) measurements and to evaluate whether these differences alter women's risk status. Methods: A cross-sectional study was performed. Two operators obtained
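
    Reliability in studies of this kind is commonly quantified with an intraclass correlation coefficient; the sketch below uses the real pingouin library on fabricated placeholder measurements (verify the API against the installed pingouin version).

```python
# Sketch of how intra-/inter-operator reliability of repeated NT
# measurements is typically quantified: an intraclass correlation
# coefficient (ICC). The measurements below are fabricated placeholders,
# not the study's data.

import pandas as pd
import pingouin as pg

data = pd.DataFrame({
    "fetus":    [1, 1, 2, 2, 3, 3, 4, 4],
    "operator": ["A", "B"] * 4,
    "nt_mm":    [2.1, 2.0, 1.8, 1.9, 2.5, 2.4, 1.6, 1.7],
})

icc = pg.intraclass_corr(data=data, targets="fetus",
                         raters="operator", ratings="nt_mm")
print(icc[["Type", "ICC", "CI95%"]])
```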

  16. Improving component interoperability and reusability with the Java Connection Framework (JCF): overview and application to the AgES-W environmental model

    Science.gov (United States)

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse. For example, an EMF needs to support dev...

  17. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    Science.gov (United States)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to describe the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open, inter-operable systems software development and software reuse. It addresses what is meant by the term object component software; gives an overview of the component-based development approach, how it relates to infrastructure support of software architectures, and how it promotes reuse; enumerates the benefits of this approach; and gives examples of application prototypes demonstrating its usage and advantages. The object-oriented component technology approach to system development and software reuse is applicable to several areas within JPL, and possibly across other NASA Centers.

  18. A vital signs telemonitoring system - interoperability supported by a personal health record system and a cloud service.

    Science.gov (United States)

    Gutiérrez, Miguel F; Cajiao, Alejandro; Hidalgo, José A; Cerón, Jesús D; López, Diego M; Quintero, Víctor M; Rendón, Alvaro

    2014-01-01

    This article presents the development process of an acquisition and data storage system managing clinical variables through a cloud storage service and a Personal Health Record (PHR) system. First, the paper explains the design of a Wireless Body Area Network (WBAN) that captures data from two sensors measuring arterial pressure and heart rate. Second, it illustrates how the data collected by the WBAN are transmitted to a cloud storage service; this service stores the data persistently in an online database. Finally, the paper describes how the data stored in the cloud service are sent to the Indivo PHR system, where they are registered and charted for later review by health professionals. The research demonstrated the feasibility of implementing WBAN networks for the acquisition of clinical data, and particularly of using Web technologies and standards to provide interoperability with PHR systems at the technical and syntactic levels.
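
    The two-hop data path the article describes, gateway to cloud store to PHR, reduces to a pair of HTTP POSTs; both endpoint URLs below are invented placeholders, and Indivo's actual REST API involves authentication not shown here.

```python
# Sketch of the data path described above: sensor readings from a WBAN
# gateway are posted to a cloud store, then forwarded to a PHR. Both
# endpoints are placeholders; a real Indivo integration is more involved.

import json
import urllib.request

def post_json(url: str, payload: dict) -> int:
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status

reading = {
    "patient_id": "demo-001",
    "systolic_mmHg": 118,
    "diastolic_mmHg": 76,
    "heart_rate_bpm": 72,
    "captured_at": "2014-01-15T09:30:00Z",
}

# 1) WBAN gateway -> cloud storage service (placeholder endpoint)
post_json("https://cloud.example.org/api/vitals", reading)
# 2) cloud service -> PHR system for clinician review (placeholder endpoint)
post_json("https://phr.example.org/records/demo-001/vitals", reading)
```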

  19. An assessment of NASA master directory/catalog interoperability for interdisciplinary study of the global water cycle

    Science.gov (United States)

    Peuquet, Donna J.

    1991-01-01

    The most important issue facing science is understanding global change: its causes, the processes involved, and their consequences. The key to success in this massive Earth science research effort is efficient identification of, and access to, the best data available across the atmospheric, oceanographic, and land sciences. Current mechanisms used by Earth scientists for accessing these data fall far short of meeting this need; as a result, scientists must frequently rely on a priori knowledge and informal person-to-person networks to find relevant data. The Master Directory/Catalog Interoperability program (MD/CI) undertaken by NASA is an important step in overcoming these problems. The stated goal of the MD project is to enable researchers to efficiently identify, locate, and obtain access to space and Earth science data.

  20. Design and management of public health outreach using interoperable mobile multimedia: an analysis of a national winter weather preparedness campaign

    Directory of Open Access Journals (Sweden)

    Cesar Bandera

    2016-05-01

    Background: The Office of Public Health Preparedness and Response (OPHPR) in the Centers for Disease Control and Prevention conducts outreach for public preparedness for natural and manmade incidents. In 2011, OPHPR conducted a nationwide mobile public health (m-Health) campaign that pushed brief videos on preparing for severe winter weather onto cell phones, with the objective of evaluating the interoperability of multimedia m-Health outreach with diverse cell phones (including handsets without Internet capability), carriers, and user preferences. Methods: Existing OPHPR outreach material on winter weather preparedness was converted into mobile-ready multimedia using mobile marketing best practices to improve audiovisual quality and relevance. Middleware complying with opt-in requirements was developed to push nine biweekly multimedia broadcasts onto subscribers' cell phones, and OPHPR promoted the campaign on its web site and to subscribers on its govdelivery.com notification platform. Multimedia, text, and voice messaging activity to and from the middleware was logged and analyzed. Results: Adapting existing media, including web pages, PDF documents, and public service announcements, into mobile video was straightforward using open source and commercial software. The middleware successfully delivered all outreach videos to all participants (a total of 504 videos) regardless of the participant's device. 54% of videos were viewed on cell phones, 32% on computers, and 14% were retrieved by search engine web crawlers. 21% of participating cell phones did not have Internet access, yet still received and displayed all videos. The time from media push to media viewing on cell phones was half that of push to viewing on computers. Conclusions: Video delivered through multimedia messaging can be as interoperable as text messages while providing much richer information. This may be the only multimedia mechanism available to outreach campaigns
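
    The middleware's core loop, an opt-in subscriber registry plus logged multimedia pushes, can be sketched as follows; the gateway call is a stub, since real deployments go through commercial carrier MMS gateway APIs.

```python
# Sketch of the opt-in push middleware the campaign describes: keep a
# subscriber list, push each biweekly video to every handset, and log the
# delivery so interoperability (did non-smartphones get it?) can be
# analyzed afterwards. The MMS gateway call is a stub.

import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

class OutreachMiddleware:
    def __init__(self):
        self.subscribers = {}  # phone number -> handset profile

    def opt_in(self, msisdn: str, supports_internet: bool):
        self.subscribers[msisdn] = {"internet": supports_internet}

    def send_mms(self, msisdn: str, video_url: str) -> bool:
        # Stub for a carrier MMS gateway call; MMS delivery does not
        # require handset Internet access, which is the point made above.
        return True

    def push_broadcast(self, video_url: str):
        for msisdn, profile in self.subscribers.items():
            delivered = self.send_mms(msisdn, video_url)
            logging.info("push %s internet=%s delivered=%s",
                         msisdn, profile["internet"], delivered)

middleware = OutreachMiddleware()
middleware.opt_in("+15550100", supports_internet=False)
middleware.opt_in("+15550101", supports_internet=True)
middleware.push_broadcast("winter_prep_episode1.3gp")
```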