WorldWideScience

Sample records for series interoperability matrix

  1. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

be limited. Fourth, data protection “by design” would be distinguished from data protection “by default”. Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers’ and processors’ duties, on supervisory authorities and on sanctions would be introduced. … Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on the interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions of direct relevance for the project and Work Package 5 will be analysed here…

  2. IEEE Smart Grid Series of Standards IEEE 2030 (Interoperability) and IEEE 1547 (Interconnection) Status: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.; DeBlasio, R.

    2012-04-01

The IEEE American National Standards smart grid publications and standards development projects IEEE 2030™, which addresses smart grid interoperability, and IEEE 1547™, which addresses distributed resources interconnection with the grid, have made substantial progress since 2009. The IEEE 2030™ and 1547 standards series focus on systems-level aspects and cover many of the technical integration issues involved in a mature smart grid. The status and highlights of these two IEEE series of standards, which are sponsored by IEEE Standards Coordinating Committee 21 (SCC21), are provided in this paper.

  3. Covariance matrix estimation for stationary time series

    OpenAIRE

    Xiao, Han; Wu, Wei Biao

    2011-01-01

We obtain a sharp convergence rate for banded covariance matrix estimates of stationary processes. A precise order of magnitude is derived for the spectral radius of sample covariance matrices. We also consider a thresholded covariance matrix estimator that can better characterize sparsity if the true covariance matrix is sparse. As our main tool, we implement the idea of Toeplitz [Math. Ann. 70 (1911) 351–376] and relate eigenvalues of covariance matrices to the spectral densities or Fourier transforms...
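As a rough illustration of the banding idea in this abstract, the sketch below builds a Toeplitz covariance estimate from sample autocovariances and zeroes all entries beyond a chosen bandwidth. The function names, bandwidth choice, and AR(1) test series are invented for illustration; the paper's rate results are not reproduced here.

```python
import numpy as np

def banded_cov(x, bandwidth):
    """Banded estimate of the autocovariance matrix of a stationary series.

    Sample autocovariances gamma_hat(k) are kept only for lags |k| <= bandwidth;
    all longer-lag entries of the Toeplitz matrix are set to zero.
    """
    n = len(x)
    xc = x - x.mean()
    # sample autocovariances gamma_hat(0), ..., gamma_hat(n-1)
    gamma = np.array([xc[:n - k] @ xc[k:] / n for k in range(n)])
    cov = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            lag = abs(i - j)
            if lag <= bandwidth:
                cov[i, j] = gamma[lag]
    return cov

rng = np.random.default_rng(0)
# AR(1) test series: short-range dependence, so banding is a natural fit
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()

sigma_hat = banded_cov(x, bandwidth=10)
print(sigma_hat.shape)  # (500, 500)
```

Banding trades a small bias at short lags for much lower variance than the full sample covariance matrix when dependence decays quickly.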

  4. Construction of the exact Fisher information matrix of Gaussian time series models by means of matrix differential rules

    NARCIS (Netherlands)

    Klein, A.A.B.; Melard, G.; Zahaf, T.

    2000-01-01

    The Fisher information matrix is of fundamental importance for the analysis of parameter estimation of time series models. In this paper the exact information matrix of a multivariate Gaussian time series model expressed in state space form is derived. A computationally efficient procedure is used
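The paper's state-space derivation via matrix differential rules is not reproduced here, but the general Gaussian trace formula behind such exact information matrices, I(theta) = 0.5 * tr(S^-1 dS S^-1 dS) for a zero-mean Gaussian vector with covariance S(theta), can be sketched numerically. The AR(1) model, the finite-difference derivative, and all names below are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def ar1_cov(phi, n, sigma2=1.0):
    """Exact covariance matrix of a stationary AR(1) process with parameter phi."""
    idx = np.arange(n)
    return sigma2 / (1.0 - phi**2) * phi ** np.abs(idx[:, None] - idx[None, :])

def fisher_info(phi, n, eps=1e-6):
    """Fisher information of phi via I = 0.5 * tr(S^-1 dS S^-1 dS),
    with dS/dphi approximated by a central finite difference."""
    S = ar1_cov(phi, n)
    dS = (ar1_cov(phi + eps, n) - ar1_cov(phi - eps, n)) / (2 * eps)
    M = np.linalg.inv(S) @ dS
    return 0.5 * np.trace(M @ M)

info = fisher_info(0.5, n=200)
print(info)  # close to the asymptotic value n / (1 - phi^2) ~ 266.7
```

For long series the exact value approaches the well-known asymptotic information n / (1 - phi^2), which gives a quick sanity check on the computation.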

  5. Grid interoperability: the interoperations cookbook

    Energy Technology Data Exchange (ETDEWEB)

    Field, L; Schulz, M [CERN (Switzerland)], E-mail: Laurence.Field@cern.ch, E-mail: Markus.Schulz@cern.ch

    2008-07-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.

  6. Grid interoperability: the interoperations cookbook

    International Nuclear Information System (INIS)

    Field, L; Schulz, M

    2008-01-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards

  7. Random matrix theory for heavy-tailed time series

    DEFF Research Database (Denmark)

    Heiny, Johannes

    2017-01-01

    This paper is a review of recent results for large random matrices with heavy-tailed entries. First, we outline the development of and some classical results in random matrix theory. We focus on large sample covariance matrices, their limiting spectral distributions, the asymptotic behavior...

  8. Asymptotic theory for the sample covariance matrix of a heavy-tailed multivariate time series

    DEFF Research Database (Denmark)

    Davis, Richard A.; Mikosch, Thomas Valentin; Pfaffel, Olivier

    2016-01-01

In this paper we give an asymptotic theory for the eigenvalues of the sample covariance matrix of a multivariate time series. The time series constitutes a linear process across time and between components. The input noise of the linear process has regularly varying tails with index α∈(0,4); in particular, the time series has infinite fourth moment. We derive the limiting behavior for the largest eigenvalues of the sample covariance matrix and show point process convergence of the normalized eigenvalues. The limiting process has an explicit form involving points of a Poisson process and eigenvalues of a non-negative definite matrix. Based on this convergence we derive limit theory for a host of other continuous functionals of the eigenvalues, including the joint convergence of the largest eigenvalues, the joint convergence of the largest eigenvalue and the trace of the sample covariance matrix…
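A quick numerical illustration of the setting, not of the paper's limit theorems: entries with Student-t(3) tails have regular variation index α = 3 ∈ (0,4), so the fourth moment is infinite and the largest eigenvalues of the sample covariance matrix are driven by a few extreme entries. The dimensions and seed below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
p, n = 50, 1000

# iid t(3) entries: regularly varying tails with index alpha = 3 in (0, 4),
# hence infinite fourth moment
X = rng.standard_t(df=3, size=(p, n))

S = X @ X.T / n                          # p x p sample covariance matrix
eig = np.sort(np.linalg.eigvalsh(S))[::-1]  # eigenvalues, largest first

# the trace equals the sum of eigenvalues; the top eigenvalue typically
# carries a disproportionate share of it in the heavy-tailed regime
print(eig[0], np.trace(S))
```

Re-running with different seeds shows how volatile the top eigenvalue is compared with the light-tailed (finite fourth moment) case, which is the phenomenon the point-process limit describes.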

  9. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  10. Clinical Usage of an Extracellular, Collagen-rich Matrix: A Case Series.

    Science.gov (United States)

    AbouIssa, Abdelfatah; Mari, Walid; Simman, Richard

    2015-11-01

OASIS Ultra (Smith and Nephew, St. Petersburg, FL) is an extracellular, collagen-rich matrix derived from the submucosa of porcine intestine. It is composed of collagen type I, glycosaminoglycans, and proteoglycans. This extracellular matrix (ECM) differs from the single-layer version in thickness and offers ease of handling and application. It also stimulates cell migration, gives structural support, provides a moist environment, decreases inflammation, and induces cell proliferation and cellular attachment. In this case series, the authors present their experience with this product in various clinical scenarios. The authors used the product in a variety of wounds with different etiologies to test the clinical outcome of the ECM. This was an observational case series with prospective review of 6 patients with different types of wounds who received treatment with the ECM. The product was applied to the following types of wounds: chronic venous ulcer, nonhealing Achilles tendon vasculitic wound, Marjolin's ulcer, posttraumatic wound, stage IV sacral-coccygeal pressure wound, and complicated transmetatarsal amputation of a gangrenous left forefoot diabetic wound. All of these wounds healed within the expected time periods and without complications. In general, healing was achieved in 4-16 weeks using 1-12 applications of the ECM. Wounds with different etiologies were successfully treated with an extracellular, collagen-rich matrix. By replacing the lost ECM to guide cellular growth and migration, this product ultimately hastened the healing process.

  11. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

First in a series of studies focusing on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  12. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Basso,T.; DeBlasio, R.

    2010-04-01

The IEEE American National Standards project P2030™ addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  13. Toward an Interoperability Architecture

    National Research Council Canada - National Science Library

    Buddenberg, Rex

    2001-01-01

    .... The continued burgeoning of the Internet constitutes an existence proof. But a common networking base is insufficient to reach a goal of cross-system interoperability - the large information system...

  14. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a problem of lack of interoperability in electronic business, which have not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  15. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For ecosystems of connected-buildings products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  16. Optimum design of matrix fault current limiters using the series resistance connected with shunt coil

    Science.gov (United States)

    Chung, D. C.; Choi, H. S.; Lee, N. Y.; Nam, G. Y.; Cho, Y. S.; Sung, T. H.; Han, Y. H.; Kim, B. S.; Lim, S. H.

    2007-10-01

In this paper we describe an improved design for matrix fault current limiters (MFCL) using thin-film superconducting elements. The high current density and high index value of the superconducting thin film allow an MFCL of minimized size with high switching speed. We could also minimize the bulky shunt coil by connecting a series resistance with the shunt coil; with this method we could effectively block leakage current in the shunt coils under no-fault conditions and simply control the total impedance of the current-limiting part. After designing an appropriate 1 × 2 basic MFCL module with an applied voltage of 160 V, we enlarged it to a 2 × 2 MFCL module and a 3 × 2 MFCL module, with applied voltages of 320 V and 480 V, respectively. Experimental results for our MFCL are reported in terms of various fault currents, variation of the series resistance, and so on. We believe these methods will be useful in the optimum design of an m × n MFCL.
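The effect of the series resistance connected with the shunt coil can be sketched as a simple complex-impedance calculation: the quenched (resistive) superconducting element sits in parallel with the shunt-coil branch. All component values, the 60 Hz frequency, and the function name below are illustrative assumptions, not figures from the paper.

```python
import cmath
import math

def limiter_impedance(r_sc, r_series, l_shunt, f=60.0):
    """Total impedance of one current-limiting matrix element:
    the quenched superconductor (resistive, r_sc) in parallel with
    the shunt-coil branch (series resistance plus coil inductance).
    Component values are illustrative, not from the paper."""
    z_coil_branch = r_series + 1j * 2 * math.pi * f * l_shunt
    return (r_sc * z_coil_branch) / (r_sc + z_coil_branch)

z = limiter_impedance(r_sc=2.0, r_series=0.5, l_shunt=1e-3, f=60.0)
print(abs(z), cmath.phase(z))
```

Raising the series resistance raises the shunt-branch impedance, which is why it suppresses leakage current through the coil under no-fault conditions while still letting the branch share current during a quench.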

  17. Optimum design of matrix fault current limiters using the series resistance connected with shunt coil

    International Nuclear Information System (INIS)

    Chung, D.C.; Choi, H.S.; Lee, N.Y.; Nam, G.Y.; Cho, Y.S.; Sung, T.H.; Han, Y.H.; Kim, B.S.; Lim, S.H.

    2007-01-01

In this paper we describe an improved design for matrix fault current limiters (MFCL) using thin-film superconducting elements. The high current density and high index value of the superconducting thin film allow an MFCL of minimized size with high switching speed. We could also minimize the bulky shunt coil by connecting a series resistance with the shunt coil; with this method we could effectively block leakage current in the shunt coils under no-fault conditions and simply control the total impedance of the current-limiting part. After designing an appropriate 1 x 2 basic MFCL module with an applied voltage of 160 V, we enlarged it to a 2 x 2 MFCL module and a 3 x 2 MFCL module, with applied voltages of 320 V and 480 V, respectively. Experimental results for our MFCL are reported in terms of various fault currents, variation of the series resistance, and so on. We believe these methods will be useful in the optimum design of an m x n MFCL.

  18. Towards technical interoperability in telemedicine.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  19. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
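As a toy sketch of the kind of semantic validation described, the example below checks that annotated XML elements carry concept codes permitted by an ontology mapping. The element names, concept codes, and ontology table are invented for illustration and are not taken from the authors' system.

```python
import xml.etree.ElementTree as ET

# toy ontology: common data element -> allowed concept codes (invented)
ONTOLOGY = {"bodySite": {"C12434", "C12971"}}

DOC = """<annotation>
  <bodySite concept="C12434">liver</bodySite>
</annotation>"""

def semantically_valid(xml_text, ontology):
    """Return True if every annotated element carries a concept code
    that the ontology permits for that common data element."""
    root = ET.fromstring(xml_text)
    for elem in root:
        allowed = ontology.get(elem.tag)
        if allowed is not None and elem.get("concept") not in allowed:
            return False
    return True

print(semantically_valid(DOC, ONTOLOGY))  # True
```

Schema validation alone would accept any well-formed concept attribute; the ontology lookup is what adds the semantic layer.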

  20. Semantically Interoperable XML Data

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

  1. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike meet these challenges is interoperability. While interoperability has been present in much of the discussion of technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely used Internet Protocol suite for defining Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. These single-vendor solutions may inadvertently lock

  2. FLTSATCOM interoperability applications

    Science.gov (United States)

    Woolford, Lynn

A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to be interoperable with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  3. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

Full Text Available This paper presents relevant interoperability approaches and solutions applied to global/international networked (collaborative) enterprises or organisations and conceptualises an enhanced enterprise interoperability framework. The paper covers...

  4. A Theory of Interoperability Failures

    National Research Council Canada - National Science Library

    McBeth, Michael S

    2003-01-01

    This paper develops a theory of interoperability failures. Interoperability in this paper refers to the exchange of information and the use of information, once exchanged, between two or more systems...

  5. A matrix formulation of Frobenius power series solutions using products of 4X4 matrices

    Directory of Open Access Journals (Sweden)

    Jeremy Mandelkern

    2015-08-01

Full Text Available In Coddington and Levinson [7, p. 119, Thm. 4.1] and Balser [4, p. 18-19, Thm. 5], matrix formulations of Frobenius theory, near a regular singular point, are given using 2X2 matrix recurrence relations yielding fundamental matrices consisting of two linearly independent solutions together with their quasi-derivatives. In this article we apply a reformulation of these matrix methods to the Bessel equation of nonintegral order. The reformulated approach of this article differs from [7] and [4] by its implementation of a new ``vectorization'' procedure that yields recurrence relations of an altogether different form: namely, it replaces the implicit 2X2 matrix recurrence relations of both [7] and [4] by explicit 4X4 matrix recurrence relations that are implemented by means of 4X4 matrix products only. This new idea of using a vectorization procedure may further enable the development of symbolic manipulator programs for matrix forms of the Frobenius theory.
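The scalar two-term recurrence that such matrix formulations encode can be sketched directly for Bessel's equation: writing the Frobenius solution as y = x^nu * sum c_k x^k with c_0 = 1 gives c_k = -c_{k-2} / (k (k + 2 nu)). The sketch below (names illustrative, and not the article's 4X4 vectorized procedure) checks the nu = 1/2 case, where the Frobenius solution reduces to sin(x)/sqrt(x).

```python
import math

def frobenius_bessel(nu, x, terms=30):
    """Frobenius solution y = x^nu * sum_k c_k x^k of Bessel's equation,
    normalized so that c_0 = 1, built from the two-term recurrence
    c_k = -c_{k-2} / (k * (k + 2*nu)) (odd coefficients vanish)."""
    c, s, xk = 1.0, 1.0, 1.0
    for m in range(1, terms):
        k = 2 * m
        c = -c / (k * (k + 2 * nu))  # recurrence for the even coefficients
        xk *= x * x
        s += c * xk
    return x**nu * s

# for nu = 1/2 the Frobenius solution reduces to sin(x)/sqrt(x)
x = 1.3
print(frobenius_bessel(0.5, x), math.sin(x) / math.sqrt(x))
```

The matrix formulations in the abstract bundle exactly this kind of coefficient recurrence, together with the quasi-derivatives, into a single product of matrices.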

  6. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever-increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the order of the day, presupposing the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that SOA in particular produces a surge of interoperability that could rightly be referred to as an IT evolution.

  7. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  8. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet. 

  9. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

Building an internal gas market implies establishing harmonized rules for cross-border trading between operators. To that effect, the European association EASEE-gas is drawing up standards and procedures, commonly referred to as 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers five presentations on this topic given at the gas conference.

  10. River Basin Standards Interoperability Pilot

    Science.gov (United States)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data is translated into the OGC WaterML 2.0 time series data format and ingested into a SOS 2.0. The SOS data is visualized in a SOS client that is able to handle time series. The meteorological forecast data, the ingested WaterML 2.0 time series, and terrain data are input to a flood modelling algorithm (with the supervision of an operator manipulating the WPS user interface). The WPS produces flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored from an emergency control response service.
    Acronyms:
    AS: Alert Service
    ES: Event Service
    ICT: Information and Communication Technology
    NS: Notification Service
    OGC: Open Geospatial Consortium
    RIBASE: River Basin Standards Interoperability Pilot
    SOS: Sensor Observation Service
    WaterML: Water Markup Language
    WCS: Web Coverage Service
    WMS: Web Map Service
    WPS: Web Processing Service

  11. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  12. Leg ulcer treatment outcomes with new ovine collagen extracellular matrix dressing: a retrospective case series.

    Science.gov (United States)

    Bohn, Gregory A; Gass, Kimberly

    2014-10-01

    The purpose of this study was to describe the rate of closure observed in venous leg ulcers during treatment with ovine collagen extracellular matrix dressings and compression. Fourteen patients with 23 wounds were retrospectively evaluated with respect to healing rates, time to closure, and weekly facility charge fees.

  13. Characteristics of the co-fluctuation matrix transmission network based on financial multi-time series

    OpenAIRE

    Huajiao Li; Haizhong An; Xiangyun Gao; Wei Fang

    2015-01-01

    The co-fluctuation of two time series has often been studied by analysing the correlation coefficient over a selected period. However, in both domestic and global financial markets, there are more than two active time series that fluctuate constantly as a result of various factors, including geographic locations, information communications and so on. In addition to correlation relationships over longer periods, daily co-fluctuation relationships and their transmission features are also import...

  14. Flexible Language Interoperability

    DEFF Research Database (Denmark)

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems based…

  15. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  16. Minimal invasive surgery for unicameral bone cyst using demineralized bone matrix: a case series

    Directory of Open Access Journals (Sweden)

    Cho Hwan

    2012-07-01

Full Text Available Abstract Background Various treatments for unicameral bone cyst have been proposed. Recent concern focuses on the effectiveness of closed methods. This study evaluated the effectiveness of demineralized bone matrix as a graft material after intramedullary decompression for the treatment of unicameral bone cysts. Methods Between October 2008 and June 2010, twenty-five patients with a unicameral bone cyst were treated with intramedullary decompression followed by grafting of demineralized bone matrix. There were 21 male and 4 female patients with a mean age of 11.1 years (range, 3–19 years). The proximal metaphysis of the humerus was affected in 12 patients, the proximal femur in five, the calcaneum in three, the distal femur in two, the tibia in two, and the radius in one. There were 17 active cysts and 8 latent cysts. Radiologic change was evaluated according to a modified Neer classification. Time to healing was defined as the period required to achieve cortical thickening on the anteroposterior and lateral plain radiographs, as well as consolidation of the cyst. The patients were followed up for a mean period of 23.9 months (range, 15–36 months). Results Nineteen of 25 cysts had completely consolidated after a single procedure. The mean time to healing was 6.6 months (range, 3–12 months). Four had incomplete healing radiographically but no clinical symptoms, with enough cortical thickness to prevent fracture. None of these four cysts needed a second intervention until the last follow-up. Two of 25 patients required a second intervention because of cyst recurrence. Both showed radiographic healing of the cyst after a mean of 10 additional months of follow-up. Conclusions A minimally invasive technique including the injection of DBM could serve as an excellent treatment method for unicameral bone cysts.

  17. Minimal invasive surgery for unicameral bone cyst using demineralized bone matrix: a case series.

    Science.gov (United States)

    Cho, Hwan Seong; Seo, Sung Hwa; Park, So Hyun; Park, Jong Hoon; Shin, Duk Seop; Park, Il Hyung

    2012-07-29

Various treatments for unicameral bone cyst have been proposed. Recent concern focuses on the effectiveness of closed methods. This study evaluated the effectiveness of demineralized bone matrix as a graft material after intramedullary decompression for the treatment of unicameral bone cysts. Between October 2008 and June 2010, twenty-five patients with a unicameral bone cyst were treated with intramedullary decompression followed by grafting of demineralized bone matrix. There were 21 male and 4 female patients with a mean age of 11.1 years (range, 3-19 years). The proximal metaphysis of the humerus was affected in 12 patients, the proximal femur in five, the calcaneum in three, the distal femur in two, the tibia in two, and the radius in one. There were 17 active cysts and 8 latent cysts. Radiologic change was evaluated according to a modified Neer classification. Time to healing was defined as the period required to achieve cortical thickening on the anteroposterior and lateral plain radiographs, as well as consolidation of the cyst. The patients were followed up for a mean period of 23.9 months (range, 15-36 months). Nineteen of 25 cysts had completely consolidated after a single procedure. The mean time to healing was 6.6 months (range, 3-12 months). Four had incomplete healing radiographically but no clinical symptoms, with enough cortical thickness to prevent fracture. None of these four cysts needed a second intervention until the last follow-up. Two of 25 patients required a second intervention because of cyst recurrence. Both showed radiographic healing of the cyst after a mean of 10 additional months of follow-up. A minimally invasive technique including the injection of DBM could serve as an excellent treatment method for unicameral bone cysts.

  18. Smart Grid Interoperability Maturity Model

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  19. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

GRID technology, with initiatives like the GGF, has the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, to policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, to build a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 interviews of experts, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act with the aim of achieving an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that important existing results of the standardization efforts in this area will not be taken up simply because they are not always known.

  20. Linked data for transaction based enterprise interoperability

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.

    2015-01-01

Interoperability is of major importance in B2B environments. Starting with EDI in the ‘80s, interoperability currently relies heavily on XML-based standards. Although these have had great impact, issues remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  1. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems were first identified within interoperability frameworks. Second, initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators towards a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  2. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    Science.gov (United States)

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  3. Biodiversity information platforms: From standards to interoperability

    Directory of Open Access Journals (Sweden)

    Walter Berendsohn

    2011-11-01

Full Text Available One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures, ranging from small desktop software solutions to large-scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  4. Standards to open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability as the way to accomplish data exchange and service collaboration requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.

  5. A Guide to Understanding Emerging Interoperability Technologies

    National Research Council Canada - National Science Library

    Bollinger, Terry

    2000-01-01

    .... Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem...

  6. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the interoperability challenge, there is a lack of empirical work about the overall situation of actual challenges. We conduct...

  7. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  8. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  9. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  10. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. Its approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identifying interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  11. Comment on ‘Series expansions from the corner transfer matrix renormalization group method: the hard-squares model’

    International Nuclear Information System (INIS)

    Jensen, Iwan

    2012-01-01

Earlier this year Chan extended the low-density series for the hard-squares partition function κ(z) to 92 terms. Here we analyse this extended series, focusing on the behaviour at the dominant singularity z_d, which lies on the negative fugacity axis. We find that the series has a confluent singularity of order at least 2 at z_d, with exponents θ = 0.83333(2) and θ′ = 1.6676(3). We thus confirm that the exponent θ has the exact value 5/6, as observed by Dhar. (comment)
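The kind of series analysis described in this abstract — estimating a singularity location and exponent from a sequence of expansion coefficients — can be sketched with a toy ratio-method example. The function (1 − z)^(−g), the exponent g = 1.5, and the extrapolation scheme below are illustrative assumptions for a minimal sketch; they are not the hard-squares series or the actual method of the paper.

```python
# Toy ratio-method analysis on a series with a known singularity:
# for f(z) = (1 - z)**(-g), coefficient ratios a_n/a_{n-1} = 1 + (g-1)/n
# encode both the singularity location z_c = 1 and the exponent g.

def coeffs(n_max, g=1.5):
    # Series coefficients of (1 - z)**(-g) via the ratio recursion.
    a = [1.0]
    for n in range(1, n_max + 1):
        a.append(a[-1] * (n + g - 1) / n)
    return a

def ratio_extrapolants(a):
    # r_n = a_n / a_{n-1} -> 1/z_c as n grows; the combination
    # n*r_n - (n-1)*r_{n-1} cancels the leading 1/n correction.
    r = [a[n] / a[n - 1] for n in range(1, len(a))]
    return [n * r[n - 1] - (n - 1) * r[n - 2] for n in range(2, len(r) + 1)]

a = coeffs(50)
est = ratio_extrapolants(a)           # converges to 1/z_c = 1
g_est = 50 * (a[50] / a[49] - 1) + 1  # exponent estimate, tends to g = 1.5
```

For this constructed series the extrapolants are exact; for a real lattice series such as hard squares, confluent corrections make the convergence slower and motivate the more refined differential-approximant techniques used in the literature.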

  12. Matrix differential calculus applied to multiple stationary time series and an extended Whittle formula for information matrices

    NARCIS (Netherlands)

    Klein, A.; Spreij, P.

    2009-01-01

    The purpose of this paper is to set forth easily implementable expressions for the Fisher information matrix (FIM) of a Gaussian stationary vector autoregressive and moving average process with exogenous or input variables, a vector ARMAX or VARMAX process. The entries of the FIM are represented as
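Whittle-type integral formulas for the Fisher information matrix, as referenced in this abstract's title, can be checked numerically in the simplest scalar case. The AR(1) sketch below is an illustration of the general idea only, not the paper's VARMAX expressions; the grid size and the scalar model are assumptions.

```python
import math

def whittle_fisher_ar1(phi, n_grid=4096):
    # Whittle's integral for the asymptotic per-observation Fisher
    # information of a stationary Gaussian process:
    #   I(phi) = (1 / 4*pi) * integral over [-pi, pi] of
    #            (d log f(w) / d phi)**2 dw,
    # where for an AR(1) process the spectral density f(w) is
    # proportional to 1 / (1 - 2*phi*cos(w) + phi**2).
    step = 2 * math.pi / n_grid
    total = 0.0
    for k in range(n_grid):
        w = -math.pi + (k + 0.5) * step  # midpoint rule on the period
        d = 1.0 - 2.0 * phi * math.cos(w) + phi * phi
        dlogf = 2.0 * (math.cos(w) - phi) / d
        total += dlogf * dlogf
    return total * step / (4 * math.pi)

approx = whittle_fisher_ar1(0.5)
exact = 1 / (1 - 0.5 ** 2)  # known closed form for AR(1): 1 / (1 - phi^2)
```

Because the integrand is smooth and periodic, the midpoint rule converges very fast here; the numerical value agrees with the closed form to high precision.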

  13. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  14. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman service promoting a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are numerous. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. To this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and therefore sometimes falls short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward-looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamental "open" principles and aligned with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new; the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted.

  15. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  16. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.
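The binary projection described in this abstract — keeping only the sign of each increment and relating sign coherence across series to the magnitude of increments — can be sketched on synthetic data. The one-factor "market" model, the loading 0.7, and the series dimensions below are illustrative assumptions, not the paper's data or estimation method.

```python
import random

random.seed(0)

# Hypothetical one-factor model: N series share a common "market" factor,
# so days when most series move in the same direction also tend to show
# larger absolute increments (the relation the abstract describes).
N, T = 30, 2000
market = [random.gauss(0, 1) for _ in range(T)]
X = [[0.7 * market[t] + random.gauss(0, 1) for t in range(T)] for _ in range(N)]

# Binary projection: keep only the sign of each increment.
S = [[1 if X[i][t] > 0 else -1 for t in range(T)] for i in range(N)]

# Per-day sign coherence (absolute aggregate binary return) and magnitude.
agg = [abs(sum(S[i][t] for i in range(N))) / N for t in range(T)]
mag = [sum(abs(X[i][t]) for i in range(N)) / N for t in range(T)]

def corr(u, v):
    # Pearson correlation, dependency-free.
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

# Positive for this factor model: large increments co-occur with aligned signs.
coherence_vs_magnitude = corr(agg, mag)
```

In this toy setting the positive correlation between sign coherence and magnitude is built in by the common factor; the paper's contribution is to quantify and model such binary/non-binary relations in real data via maximum-entropy ensembles.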

  17. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    International Nuclear Information System (INIS)

    Almog, Assaf; Garlaschelli, Diego

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information. (paper)

  18. Forcing Interoperability: An Intentionally Fractured Approach

    Science.gov (United States)

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to give users the ability to perform first-level analysis directly on our site. To accomplish this, users should be free to modify the behavior of these tools as well as incorporate their own tools and analysis to meet their needs. Rather than building one monolithic system, we have chosen to build three semi-independent systems. One team is building a data discovery and web-based distribution system, the second is building an advanced analysis and workflow system, and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams to pursue their own objectives, schedules and user interfaces. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability, because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to treating interoperability as an afterthought, which is a tendency in monolithic systems.
All three teams will take advantage of preexisting

  19. Impact of coalition interoperability on PKI

    Science.gov (United States)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  20. Preliminary Results of a Consecutive Series of Large & Massive Rotator Cuff Tears Treated with Arthroscopic Rotator Cuff Repairs Augmented with Extracellular Matrix

    Directory of Open Access Journals (Sweden)

    Paolo Consigliere

    2017-01-01

Full Text Available Background: The recurrence rate of rotator cuff tears is still high despite improvements in surgical techniques, the materials used and better knowledge of the healing process of the rotator cuff tendons. Large to massive rotator cuff tears are particularly associated with a high failure rate, especially in the elderly. Augmentation of rotator cuff repairs with extracellular matrix or synthetic patches has gained popularity in recent years with the aim of reducing failure. The aim of this study was to investigate the outcome of rotator cuff repairs augmented with denatured extracellular matrix in a series of patients who underwent arthroscopic rotator cuff repair for large to massive tears. Methods: Ten consecutive patients, undergoing arthroscopic rotator cuff repair with extracellular matrix augmentation for large and massive tears, were prospectively enrolled into this single-surgeon study. All repairs were performed arthroscopically with a double-row technique augmented with extracellular matrix. The Oxford Shoulder Score, Constant Score and pain visual analogue scale (VAS) were used to monitor shoulder function and outcome pre-operatively and at three-, six- and 12-month follow-up. Minimum follow-up was three months; mean follow-up was 7 months. Results: The mean Constant score improved from 53 (SD=4) pre-operatively to 75 (SD=11) at final follow-up. The mean Oxford score also increased from 30 (SD=8) pre-operatively to 47 (SD=10) at final follow-up. The VAS improved from seven out of 10 (SD=2) pre-operatively to 0.6 (SD=0.8) at final follow-up. Additionally, there was significant improvement in the Constant score at the three-month mark. Conclusion: Arthroscopic repair and augmentation of large and massive rotator cuff tears with an extracellular matrix patch has good early outcomes.

  1. Risk Management Considerations for Interoperable Acquisition

    National Research Council Canada - National Science Library

    Meyers, B. C

    2006-01-01

    .... The state of risk management practice -- the specification of standards and the methodologies to implement them -- is addressed and examined with respect to the needs of system-of-systems interoperability...

  2. Interoperability for Enterprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to cooperate efficiently with other firms is becoming essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business models to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also for the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  3. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  4. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  5. Epimenides: Interoperability Reasoning for Digital Preservation

    NARCIS (Netherlands)

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

    This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can also model converters and emulators, and the adopted modelling approach enables the automatic reasoning

  6. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relating to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by a discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  7. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  8. Investigation of Automated Terminal Interoperability Test

    OpenAIRE

    Brammer, Niklas

    2008-01-01

    In order to develop and secure the functionality of its cellular communications systems, Ericsson deals with numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Therefore Ericsson co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability developmental testing (IoDT) laboratories in Linköping to test their developmental products and prototypes in o...

  9. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which use slightly different middleware. Grid interoperation tries to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently, within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group has been trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.

  10. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Full Text Available Cloud computing has been one of the latest technologies which assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. The clouds operated by different service providers working together in collaboration can open up many more possibilities for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud computing and the concepts and benefits of MDA and SOA are discussed in the paper. At the same time, this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications which will require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  11. Maturity Model for Advancing Smart Grid Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability concerns the properties of devices and systems that allow them to connect and work together properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrades. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement on how things join at their interfaces. The quality of the agreements and the alignment of the parties involved present challenges that are best met with process-improvement techniques. The GridWise® Architecture Council (GWAC), sponsored by the United States Department of Energy, is supporting an effort to use concepts from the capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  12. Towards Interoperable Preservation Repositories: TIPR

    Directory of Open Access Journals (Sweden)

    Priscilla Caplan

    2010-07-01

    Full Text Available Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.
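
    The hub-and-spoke idea behind the RXP can be illustrated with a toy sketch: each repository translates only to and from one common exchange form instead of maintaining pairwise translators. The field names below (handle, bitstreams, pid, datastreams) are invented for the sketch and do not reflect the real RXP schema:

```python
# Toy illustration of the RXP "lingua franca" idea: instead of n*(n-1)
# pairwise translators, each repository maps to and from one common form.
# All field names here are invented for the sketch, not the real RXP schema.

def dspace_to_rxp(pkg):
    """Export: a local dissemination package -> common exchange package."""
    return {"identifier": pkg["handle"], "files": pkg["bitstreams"]}

def rxp_to_fedora(rxp):
    """Import: common exchange package -> another repository's submission package."""
    return {"pid": rxp["identifier"], "datastreams": rxp["files"]}

dissemination = {"handle": "1234/5678", "bitstreams": ["thesis.pdf"]}
submission = rxp_to_fedora(dspace_to_rxp(dissemination))
print(submission)  # the receiving repository's native ingest form
```

    Adding an n-th repository type then requires only one new pair of import/export mappings, not a translator for every other system.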

  13. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  14. Interoperability of Heliophysics Virtual Observatories

    Science.gov (United States)

    Thieman, J.; Roberts, A.; King, T.; King, J.; Harvey, C.

    2008-01-01

    If you'd like to find interrelated heliophysics (also known as space and solar physics) data for a research project that spans, for example, magnetic field data and charged particle data from multiple satellites located near a given place and at approximately the same time, how easy is this to do? There are probably hundreds of data sets scattered in archives around the world that might be relevant. Is there an optimal way to search these archives and find what you want? There are a number of virtual observatories (VOs) now in existence that maintain knowledge of the data available in subdisciplines of heliophysics. The data may be widely scattered among various data centers, but the VOs have knowledge of what is available and how to get to it. The problem is that research projects might require data from a number of subdisciplines. Is there a way to search multiple VOs at once and obtain what is needed quickly? To do this requires a common way of describing the data such that a search using a common term will find all data that relate to the common term. This common language is contained within a data model developed for all of heliophysics and known as the SPASE (Space Physics Archive Search and Extract) Data Model. NASA has funded the main part of the development of SPASE but other groups have put resources into it as well. How well is this working? We will review the use of SPASE and how well the goal of locating and retrieving data within the heliophysics community is being achieved. Can the VOs truly be made interoperable despite being developed by so many diverse groups?
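
    The cross-VO search the abstract describes rests on one mechanism: every virtual observatory describes its holdings with the same vocabulary, so a single query term matches records wherever they live. The sketch below is a deliberately simplified stand-in for the SPASE Data Model, not its actual schema; the record fields and VO names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class SpaseRecord:
    # Simplified stand-in for a SPASE resource description (not the real schema)
    observatory: str        # spacecraft or ground station that took the data
    measurement_type: str   # shared vocabulary term, e.g. "MagneticField"
    start: str              # ISO 8601 start date, kept as text for brevity
    vo: str                 # which virtual observatory holds the record

# Toy catalogues aggregated from hypothetical virtual observatories
CATALOGUE = [
    SpaseRecord("SatA", "MagneticField", "2001-03-01", "VMO"),
    SpaseRecord("SatA", "EnergeticParticles", "2001-03-01", "VEPO"),
    SpaseRecord("SatB", "MagneticField", "2005-07-12", "VSPO"),
]

def search(measurement_type):
    """One common vocabulary term finds matching data across all VOs."""
    return [r for r in CATALOGUE if r.measurement_type == measurement_type]

hits = search("MagneticField")
print([(r.observatory, r.vo) for r in hits])  # matches come from different VOs
```

    The point of the sketch is the single shared term: without a common data model, each VO would need its own query translation, which is exactly the interoperability problem SPASE addresses.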

  15. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to support enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform, and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, such as business, process, human resources, technology, knowledge and semantics.

  16. Enamel matrix protein derivative plus synthetic bone substitute for the treatment of mandibular Class II furcation defects: a case series.

    Science.gov (United States)

    Queiroz, Lucas Araujo; Santamaria, Mauro; Casati, Marcio; Silverio, Karina; Nociti-Junior, Francisco; Sallum, Enilson

    2015-03-01

    The aim of this study is to report on the treatment of mandibular Class II furcation defects with enamel matrix protein derivative (EMD) combined with a βTCP/HA (β-tricalcium phosphate/hydroxyapatite) alloplastic material. Thirteen patients were selected. All patients were nonsmokers, systemically healthy, and diagnosed with chronic periodontitis; had not taken medications known to interfere with periodontal tissue health and healing; and presented one Class II mandibular furcation defect with horizontal probing equal to or greater than 4 mm at the buccal site. The clinical parameters evaluated were probing depth (PD), relative gingival margin position (RGMP), relative vertical clinical attachment level (RVCAL), and relative horizontal clinical attachment level (RHCAL). A paired Student t test was used to detect differences between the baseline and 6-month measurements, with a significance level of .05. After 6 months, the treatment produced a statistically significant reduction in PD and a significant gain in RVCAL and RHCAL, but no observable change in RGMP. RVCAL ranged from 13.77 (± 1.31) at baseline to 12.15 (± 1.29) after 6 months, with a mean change of -1.62 ± 1.00 mm (P < .05). RHCAL ranged from 5.54 (± 0.75) to 2.92 (± 0.92), with a mean change of -2.62 ± 0.63 mm (P < .05). After 6 months, 76.92% of the patients improved their diagnosis to Class I furcation defects, while 23.08% remained Class II. The present study has shown that positive clinical results may be expected from the combined treatment of Class II furcation defects with EMD and βTCP/HA, especially considering the gain in horizontal attachment level. Despite this result, controlled clinical studies are needed to confirm our outcomes.
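
    The analysis named in the abstract is a paired Student t test on per-patient baseline vs. 6-month differences. A minimal stdlib-only sketch of that computation follows; the measurements are made-up illustrative numbers, not the study's data, and the 2.179 critical value is the standard two-tailed t value for α = .05 with 12 degrees of freedom:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical per-patient probing depths (mm) at baseline and 6 months;
# illustrative numbers only, not the study's data (n = 13 patients).
baseline  = [6.0, 5.5, 7.0, 6.5, 5.0, 6.0, 7.5, 6.0, 5.5, 6.5, 7.0, 6.0, 5.5]
six_month = [4.0, 4.5, 5.0, 4.0, 3.5, 4.5, 5.0, 4.5, 4.0, 4.5, 5.5, 4.0, 4.0]

# Paired design: each patient serves as their own control, so the test
# operates on the per-patient differences rather than the two raw samples.
diffs = [b - s for b, s in zip(baseline, six_month)]
n = len(diffs)
mean_change = mean(diffs)
t_stat = mean_change / (stdev(diffs) / sqrt(n))

# Two-tailed critical value for alpha = .05 with n-1 = 12 degrees of freedom
T_CRIT = 2.179
print(f"mean change = {mean_change:.2f} mm, t = {t_stat:.2f}")
print("significant" if abs(t_stat) > T_CRIT else "not significant")
```

    In practice one would report an exact p-value (e.g. via scipy.stats.ttest_rel) rather than compare against a tabulated critical value; the hand computation is shown only to make the test's structure explicit.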

  17. Reconstruction of full-thickness defects with bovine-derived collagen/elastin matrix: a series of challenging cases and the first reported post-burn facial reconstruction.

    Science.gov (United States)

    Haik, Josef; Weissman, Oren; Hundeshagen, Gabriel; Farber, Nimrod; Harats, Moti; Rozenblatt, Shira M; Kamolz, Lars Peter; Winkler, Eyal; Zilinsky, Isaac

    2012-07-01

    Reconstruction of full-thickness defects may benefit from the integration of dermal substitutes, which serve as a foundation for split-thickness skin grafts, thus enhancing short- and long-term results. We present a series of 7 patients who were treated between 2010 and 2012 for complicated full-thickness defects with the second-generation collagen/elastin matrix Matriderm® covered by a split-thickness skin graft. The defects resulted from malignancy resection, trauma, and post-burn scar reconstruction. Overall graft take was excellent and no complications were noted regarding the dermal substitute. Graft quality was close to that of normal skin in terms of elasticity, pliability, texture, and color. Good contour and cushioning of defects in weight-bearing areas was also achieved. Matriderm was found to be a useful adjunct to full-thickness defect reconstruction, especially in difficult areas where the desired result is a scar of the highest quality possible.

  18. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions. An integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  19. Effects of hydrostatic pressure on the excitation-emission matrix (EEM) of a series of pure PAHs

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Zhi-juan [Ocean College, Zhejiang University of Technology, Hangzhou 310014 (China); Wang, Jie [College of Chemical Engineering, Zhejiang University of Technology, Hangzhou 310014 (China); Ye, Shu-ming [College of Biomedical Engineering and Instrument Science, Zhejiang University, Hangzhou 310027 (China); Jiang, Chun-yue, E-mail: zjjcy@zjut.edu.cn [College of Chemical Engineering, Zhejiang University of Technology, Hangzhou 310014 (China); Chen, Hang [College of Biomedical Engineering and Instrument Science, Zhejiang University, Hangzhou 310027 (China)

    2017-06-15

    The effects of hydrostatic pressure on the EEM of a series of pure PAHs (naphthalene, fluorene, phenanthrene, acenaphthene, fluoranthene, and anthracene) at three different concentrations (10⁻⁶ mol L⁻¹, 10⁻⁵ mol L⁻¹, and 10⁻⁴ mol L⁻¹) were investigated in a pressure range from 0.1 MPa to 60 MPa at room temperature. According to the EEM, 2 (naphthalene) to 12 (anthracene) fluorescence peaks were observed, and the variation of the EEM under high pressure was revealed by analyzing the fluorescence peak positions and intensities as hydrostatic pressure increased. It was found that fluorescence peak shifts were not detected during compression; however, both enhancement of intensity (e.g. naphthalene, 10⁻⁶ mol L⁻¹, peak at 225/330 nm, relative fluorescence intensity increased by 0.594) and reduction of intensity (e.g. fluorene, 10⁻⁶ mol L⁻¹, peak at 275/309 nm, relative fluorescence intensity decreased by 0.0966) were observed. Moreover, the pressure effects were magnified when the concentration was increased (e.g. the relative fluorescence intensity of anthracene (peak at 380/425 nm) increased by 0.0165 (10⁻⁶ mol L⁻¹) and decreased by 0.479 (10⁻⁶ mol L⁻¹) when the pressure was elevated from 0.1 MPa to 60 MPa).

  20. Effects of hydrostatic pressure on the excitation-emission matrix (EEM) of a series of pure PAHs

    International Nuclear Information System (INIS)

    Sun, Zhi-juan; Wang, Jie; Ye, Shu-ming; Jiang, Chun-yue; Chen, Hang

    2017-01-01

    The effects of hydrostatic pressure on the EEM of a series of pure PAHs (naphthalene, fluorene, phenanthrene, acenaphthene, fluoranthene, and anthracene) at three different concentrations (10⁻⁶ mol L⁻¹, 10⁻⁵ mol L⁻¹, and 10⁻⁴ mol L⁻¹) were investigated in a pressure range from 0.1 MPa to 60 MPa at room temperature. According to the EEM, 2 (naphthalene) to 12 (anthracene) fluorescence peaks were observed, and the variation of the EEM under high pressure was revealed by analyzing the fluorescence peak positions and intensities as hydrostatic pressure increased. It was found that fluorescence peak shifts were not detected during compression; however, both enhancement of intensity (e.g. naphthalene, 10⁻⁶ mol L⁻¹, peak at 225/330 nm, relative fluorescence intensity increased by 0.594) and reduction of intensity (e.g. fluorene, 10⁻⁶ mol L⁻¹, peak at 275/309 nm, relative fluorescence intensity decreased by 0.0966) were observed. Moreover, the pressure effects were magnified when the concentration was increased (e.g. the relative fluorescence intensity of anthracene (peak at 380/425 nm) increased by 0.0165 (10⁻⁶ mol L⁻¹) and decreased by 0.479 (10⁻⁶ mol L⁻¹) when the pressure was elevated from 0.1 MPa to 60 MPa).
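
    The "relative fluorescence intensity increased/decreased by" figures quoted in the two records above are simple fractional changes. Assuming the convention is (I_p − I_0)/I_0, with I_0 the peak intensity at 0.1 MPa and I_p the intensity at elevated pressure (an assumption; the abstract does not spell out the formula), the bookkeeping reduces to:

```python
def relative_intensity_change(i_ambient, i_pressure):
    """Fractional change of a fluorescence peak intensity under pressure.

    i_ambient : peak intensity at 0.1 MPa (the reference condition)
    i_pressure: peak intensity at elevated pressure (e.g. 60 MPa)
    Positive result = enhancement, negative result = reduction.
    """
    return (i_pressure - i_ambient) / i_ambient

# Example with made-up raw intensities: a peak growing from 100 to 159.4
# counts corresponds to the +0.594 enhancement quoted for naphthalene.
print(relative_intensity_change(100.0, 159.4))  # ≈ 0.594 (enhancement)
print(relative_intensity_change(100.0, 90.34))  # ≈ -0.0966 (reduction)
```

    The raw intensity values are illustrative; only the quoted fractional changes come from the records.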

  1. Scientific Digital Libraries, Interoperability, and Ontologies

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  2. The DFG Viewer for Interoperability in Germany

    Directory of Open Access Journals (Sweden)

    Ralf Goebel

    2010-02-01

    Full Text Available This article deals with the DFG Viewer for Interoperability, a free and open-source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First, the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background: it introduces the data formats and standards used, briefly illustrates how the viewer works, and includes a few examples.

  3. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software.

  4. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    CSIR Research Space (South Africa)

    Khan, ZC

    2013-06-01

    Full Text Available A purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  5. Interoperable web applications for sharing data and products of the International DORIS Service

    Science.gov (United States)

    Soudarin, L.; Ferrage, P.

    2017-12-01

    The International DORIS Service (IDS) was created in 2003 under the umbrella of the International Association of Geodesy (IAG) to foster scientific research related to the French satellite tracking system DORIS and to deliver scientific products, mostly related to the International Earth Rotation and Reference Systems Service (IERS). Since its start, the organization has continuously evolved, leading to additional and improved operational products from an expanded set of DORIS Analysis Centers. In addition, IDS has developed services for sharing data and products with users. Metadata and interoperable web applications are proposed to explore, visualize and download key products such as the position time series of the geodetic points materialized at the ground tracking stations. The Global Geodetic Observing System (GGOS) encourages the IAG Services to develop such interoperable facilities on their websites. The objective for GGOS is to set up an interoperable portal through which the data and products produced by the IAG Services can be served to the user community. We present the web applications proposed by IDS to visualize time series of geodetic observables and to get information about the tracking ground stations and the tracked satellites. We discuss IDS's future plans to meet the recommendations of GGOS. The presentation also addresses the need for the IAG Services to adopt a common metadata thesaurus to describe data and products, and interoperability standards to share them.

  6. Intercloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  7. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  8. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, in both the public and the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  9. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  10. An interoperable security framework for connected healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, Changjie

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency, and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks to the security and privacy of personal health

  11. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency, and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks to the security and privacy of personal health

  12. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges, including the various kinds of data and formats, inconsistency of metadata records, a variety of data service implementations, very large volumes of data, and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform where interoperability between systems can be leveraged to allow greater data discovery, access, visualization, and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards, and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types, including image, map, feature, and geoprocessing services, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to

  13. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications, or services to communicate, share, and exchange data, information, and knowledge in a precise, effective, and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which has not yet been investigated in depth. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology over the content materials. The ontology is further utilized as a searching tool to match users’ queries against available courses. It is concluded that LMS interoperability in the semantic web is possible to perform.
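The ontology-backed query matching described above can be sketched in a few lines. All concept and course names below are hypothetical, and the dictionaries stand in for an ontology store; a real Moodle deployment would keep the ontology in RDF/OWL and query it, for example via SPARQL:

```python
# Minimal sketch of ontology-backed course search (hypothetical data).
# A query term is expanded with related ontology concepts before matching.

ONTOLOGY = {
    # query concept -> related/broader concepts
    "sparql": {"semantic web", "rdf"},
    "rdf": {"semantic web"},
    "moodle": {"learning management system"},
}

COURSES = {
    "Introduction to the Semantic Web": {"semantic web", "rdf"},
    "Database Systems": {"sql", "relational model"},
}

def expand(term: str) -> set[str]:
    """Expand a query term with its ontology neighbours."""
    term = term.lower()
    return {term} | ONTOLOGY.get(term, set())

def match_courses(query_terms: list[str]) -> list[str]:
    """Return courses whose concept annotations overlap the expanded query."""
    expanded: set[str] = set()
    for t in query_terms:
        expanded |= expand(t)
    return [name for name, concepts in COURSES.items() if concepts & expanded]

print(match_courses(["SPARQL"]))  # matches a course via the ontology expansion
```

A plain keyword search for "SPARQL" would find nothing here; the ontology expansion is what connects the query to the course's concept annotations.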

  14. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  15. Regulatory Barriers Blocking Standardization of Interoperability

    OpenAIRE

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-01-01

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and Continua Health Alliance, are striving for this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers over the ad...

  16. UGV Control Interoperability Profile (IOP), Version 0

    Science.gov (United States)

    2011-12-21

    a tracked vehicle to climb stairs, traverse ditches/ruts, etc. The operator should be able to control the position of the flippers via the OCU and…UGV Control Interoperability Profile (IOP), Version 0. Robotic Systems Joint Project Office (RS JPO), SFAE-GCS-UGV MS 266, 6501 East 11 Mile Road

  17. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project was established as a project of the European Defence Agency, based on an initiative of Germany and France. The goal of the project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the camp protection systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in; automatic adjustment of the HMI, Human Machine Interface) without costly and time-consuming validation and test at system level (validation and test can be done at equipment level). Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the guideline document, French and German demonstration systems, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed, including remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR), and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was demonstrated successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  18. Interoperability in the e-Government Context

    Science.gov (United States)

    2012-01-01

    …e-government systems focus primarily on these technical challenges [UNDP 2007a, p. 10; CS Transform 2009, p. 3]. More recently…Thailand’s government hits its own wall. Responding agencies and non-governmental groups are unable to share information vital to the rescue effort…“Interoperability and Open Standards for e-Governance.” egov (Sep. 1, 2007): 17–19. [Secretary General, United Nations 2010] Secretary General, United

  19. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
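The verb-noun pattern described above can be made concrete with a short sketch. The service root, resource names, and parameters below are hypothetical illustrations, not an actual agency API:

```python
# Sketch of the REST verb-noun pattern: the HTTP method (verb) acts on a
# resource URL (noun). Endpoint and parameters are hypothetical.
import json
from urllib.parse import urlencode

BASE = "https://models.example.org/api/v1"  # hypothetical service root

def build_request(verb, resource, params=None, body=None):
    """Assemble the parts of a RESTful call without sending it."""
    url = f"{BASE}/{resource}"
    if params:
        url += "?" + urlencode(params)
    return {
        "method": verb,  # GET, POST, PUT, ...
        "url": url,      # the noun (resource) being acted on
        "body": json.dumps(body) if body is not None else None,
    }

# GET retrieves an existing model run; POST on "runs" would create a new one.
req = build_request("GET", "runs/42", params={"format": "json"})
print(req["method"], req["url"])
```

In a self-documenting deployment, the same endpoint would also publish an Open API description of its verbs and resources, which is what lets clients discover the service without out-of-band documentation.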

  20. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  1. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Here, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol to an architecture-centric approach, mastering ontology coordination challenges.

  2. PACS/information systems interoperability using Enterprise Communication Framework.

    Science.gov (United States)

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  3. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing: stair climbing is not something that…IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing; also…interoperability can enable management of different poses or modes, one of which may be stair climbing.

  4. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    National Research Council Canada - National Science Library

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  5. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  6. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    …efforts and/or through modifications to the Commission's technical rules or other regulatory measures. The…regulatory measures. The Commission has a longstanding interest in promoting the interoperability of…standards for Long-Term Evolution (LTE) wireless broadband technology are developed by the 3rd Generation…

  7. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators, and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  8. RFID in libraries a step toward interoperability

    CERN Document Server

    Ayre, Lori Bowen

    2012-01-01

    The approval by the National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard

  9. Extended biorthogonal matrix polynomials

    Directory of Open Access Journals (Sweden)

    Ayman Shehata

    2017-01-01

    The pair of biorthogonal matrix polynomials for commutative matrices was first introduced by Varma and Tasdelen in [22]. The main aim of this paper is to extend the properties of this pair of biorthogonal matrix polynomials. Certain generating matrix functions, finite series, some matrix recurrence relations, several important properties of matrix differential recurrence relations, biorthogonality relations, and the matrix differential equation for the pair of biorthogonal matrix polynomials J_n^(A,B)(x, k) and K_n^(A,B)(x, k) are discussed. For the matrix polynomials J_n^(A,B)(x, k), various families of bilinear and bilateral generating matrix functions are constructed in the sequel.
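For context, biorthogonality of two polynomial families means, schematically (the interval, weight w(x), and normalizing matrices specific to this paper are not given in the abstract, so this is only the generic shape of such a relation):

```latex
\int_a^b J_m^{(A,B)}(x,k)\, K_n^{(A,B)}(x,k)\, w(x)\, \mathrm{d}x \;=\; \delta_{mn}\, C_n ,
```

where \delta_{mn} is the Kronecker delta and C_n is some invertible matrix, so each J_m annihilates every K_n except its own index partner.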

  10. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise

  11. Towards E-Society Policy Interoperability

    Science.gov (United States)

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  12. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will “raise the interoperability bar” as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made of the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT’s impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase “code is king” underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  13. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  14. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  15. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents an on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure

  16. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  17. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    Science.gov (United States)

    2014-12-01

    multijurisdictional, interoperability is a key factor for success. Responses to 9/11, the Oso mudslides in Washington, the Boston Marathon bombing…Continuum. 2. Functional Interoperability: As demonstrated by the 9/11 attacks, the Oso mudslide in Washington, the Boston Marathon bombing, and other large

  18. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  19. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  20. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    …on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data.

  1. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Interoperability is not a new area of effort at NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, alongside the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, interoperability integration within the NATO Defense Planning Process will facilitate the timely identification, development, and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained, and supported to undertake the Alliance’s full spectrum of missions.

  2. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  3. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    Science.gov (United States)

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.

  4. Managing interoperability and complexity in health systems.

    Science.gov (United States)

    Bouamrane, M-M; Tao, C; Sarkar, I N

    2015-01-01

    In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis and improve the management of health organisations' resources. However, the vision of fully integrated health information ecosystems, which provide relevant information and useful knowledge at the point-of-care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches is proposed to address, harness and resolve some of the many remaining issues towards greater integration of health information systems and the extraction of useful or new knowledge from heterogeneous electronic data repositories.

  5. Interoperability science cases with the CDPP tools

    Science.gov (United States)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, to cross-compare observations and modeled data, and finally to perform in-depth analysis. Over the years these protocols, including SAMP from IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, and 3DView. This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  6. Interoperability of Standards for Robotics in CIME

    DEFF Research Database (Denmark)

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. The first main goal of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for the standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP. The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model...

  7. Interoperability between phenotype and anatomy ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
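The framework described above rests on decomposing a phenotype into an anatomical entity plus a quality (the "EQ" style of formalization), so that queries can traverse the anatomy ontology instead of matching opaque labels. A minimal stdlib sketch, with a made-up toy hierarchy and phenotype list:

```python
# Minimal sketch of entity-quality (EQ) phenotype formalization: each
# phenotype is an (anatomical entity, quality) pair, so it can be queried
# through the anatomy ontology's is-a structure. The tiny ontology and
# phenotype list below are invented for illustration.

subclass_of = {                      # toy anatomy hierarchy (child -> parent)
    "forelimb": "limb",
    "hindlimb": "limb",
    "limb": "appendage",
}

phenotypes = [                       # EQ pairs: (entity, quality)
    ("forelimb", "shortened"),
    ("hindlimb", "absent"),
    ("heart", "enlarged"),
]

def is_a(term, ancestor):
    """True if `term` equals `ancestor` or is a transitive subclass of it."""
    while term is not None:
        if term == ancestor:
            return True
        term = subclass_of.get(term)
    return False

# Query: all phenotypes affecting any kind of limb -- not expressible if
# phenotypes were opaque strings such as "short forelimb".
limb_phenotypes = [(e, q) for e, q in phenotypes if is_a(e, "limb")]
```

The payoff the abstract describes is exactly this kind of inference: "shortened forelimb" is retrieved by a query about limbs because the anatomical component is a first-class ontology term.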

  8. Flexible solution for interoperable cloud healthcare systems.

    Science.gov (United States)

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
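The mapping task described here, taking local database fields and emitting a standard-conformant document, can be illustrated with a schematic sketch. The fragment below is not a valid CDA instance (real CDA requires the HL7 v3 namespace, templates, and many mandatory elements); the element names are simplified and the field mapping is hypothetical.

```python
# Schematic sketch of emitting an HL7 CDA-style XML fragment from local
# database fields. NOT a valid, complete CDA instance: element names are
# simplified and the field mapping is hypothetical.
import xml.etree.ElementTree as ET

fields = {"patient_id": "12345", "given": "Ana", "family": "Pop"}  # local DB row

doc = ET.Element("ClinicalDocument")
target = ET.SubElement(doc, "recordTarget")
role = ET.SubElement(target, "patientRole")
ET.SubElement(role, "id", extension=fields["patient_id"])
patient = ET.SubElement(role, "patient")
name = ET.SubElement(patient, "name")
ET.SubElement(name, "given").text = fields["given"]
ET.SubElement(name, "family").text = fields["family"]

xml_text = ET.tostring(doc, encoding="unicode")
```

A configuration control like the one the abstract describes would, in effect, let administrators edit the `fields`-to-element mapping rather than the code.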

  9. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or "open" BIM is to facilitate information exchange between different digital systems, models and tools. There has been effort towards data interoperability with the development of open source standards and object-oriented models, such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for the construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, and a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits utilisation of integrated BIM for horizontal infrastructure rail projects.
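IFC models are exchanged as STEP physical files (ISO 10303-21), where each object is one `#id=ENTITY(...);` line. As a rough illustration of what "open data standards" buy a toolchain, here is a minimal parser for a single instance line; real IFC software validates against the full schema, and the example wall line (GUID included) is fabricated.

```python
# Illustrative parser for one instance line of a STEP physical file, the
# exchange format underlying IFC. Real IFC toolchains use the full EXPRESS
# schema; this regex only recovers the id, entity type, and raw attributes.
import re

STEP_LINE = re.compile(r"#(?P<id>\d+)\s*=\s*(?P<type>[A-Z0-9]+)\((?P<attrs>.*)\);")

def parse_step_line(line):
    m = STEP_LINE.match(line.strip())
    if m is None:
        raise ValueError("not a STEP instance line: %r" % line)
    return int(m.group("id")), m.group("type"), m.group("attrs")

# Fabricated example line in the style of an IFC wall instance.
inst_id, entity, attrs = parse_step_line(
    "#212=IFCWALL('2O2Fr$t4X7Zf8NOew3FNr2',$,$,'Wall-01',$,$,$,$,$);"
)
```

The interoperability gap the paper describes is precisely that no comparably standardized entity set exists for horizontal (linear, chainage-based) infrastructure, so such files cannot carry rail alignment data in a vendor-neutral way.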

  10. Managing Interoperability for GEOSS - A Report from the SIF

    Science.gov (United States)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating Organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS, the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts.
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  11. IoT interoperability : a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  12. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  13. A logical approach to semantic interoperability in healthcare.

    Science.gov (United States)

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  14. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of Spacecraft Monitor and Control (SM&C) operations for interoperability among space agencies. This particular prototype involves the German Space Agency (DLR) to test the ideas for interagency coordination.

  15. Interoperable Multimedia Annotation and Retrieval for the Tourism Sector

    NARCIS (Netherlands)

    Chatzitoulousis, Antonios; Efraimidis, Pavlos S.; Athanasiadis, I.N.

    2015-01-01

    The Atlas Metadata System (AMS) employs semantic web annotation techniques in order to create an interoperable information annotation and retrieval platform for the tourism sector. AMS adopts state-of-the-art metadata vocabularies, annotation techniques and semantic web technologies.

  16. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.
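A recurring pattern across such standards is translation through a shared canonical model: instead of N*(N-1) pairwise converters, each format maps to and from one common representation. The sketch below illustrates the idea with simplified stand-ins for cXML/UBL-style purchase orders; the field names are invented, not the real schemas.

```python
# Sketch of a canonical-model hub for document interoperability: each
# format needs only one inbound and one outbound mapping to the shared
# model. Field names are simplified stand-ins, not real cXML/UBL schemas.

CANONICAL_FIELDS = ("order_id", "sku", "qty")

def from_cxml_like(doc):
    """Hypothetical inbound mapping: cXML-style order -> canonical model."""
    return {"order_id": doc["OrderID"],
            "sku": doc["ItemOut"]["SupplierPartID"],
            "qty": doc["ItemOut"]["quantity"]}

def to_ubl_like(canonical):
    """Hypothetical outbound mapping: canonical model -> UBL-style order."""
    return {"ID": canonical["order_id"],
            "OrderLine": {"Item": canonical["sku"],
                          "Quantity": canonical["qty"]}}

inbound = {"OrderID": "PO-7", "ItemOut": {"SupplierPartID": "SKU-9", "quantity": 3}}
outbound = to_ubl_like(from_cxml_like(inbound))
```

Adding a new format to this hub costs two mappings, which is why canonical models dominate large B2B integration frameworks.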

  17. Radio Interoperability: There Is More to It Than Hardware

    National Research Council Canada - National Science Library

    Hutchins, Susan G; Timmons, Ronald P

    2007-01-01

    Radio Interoperability: The Problem. Superfluous radio transmissions contribute to auditory overload of first responders and obscure the development of an accurate operational picture for all involved; radio spectrum is a limited commodity once...

  18. A Cultural Framework for the Interoperability of C2 Systems

    National Research Council Canada - National Science Library

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  19. An Ontological Solution to Support Interoperability in the Textile Industry

    Science.gov (United States)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  20. ISAIA: Interoperable Systems for Archival Information Access

    Science.gov (United States)

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough to allow smaller, more focused communities within space science to use the same technologies and standards to provide more specialized services.
A major challenge to searching

  1. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  2. Visual Development Environment for Semantically Interoperable Smart Cities Applications

    OpenAIRE

    Roukounaki , Aikaterini; Soldatos , John; Petrolo , Riccardo; Loscri , Valeria; Mitton , Nathalie; Serrano , Martin

    2015-01-01

    This paper presents an IoT architecture for the semantic interoperability of diverse IoT systems and applications in smart cities. The architecture virtualizes diverse IoT systems and ensures their modelling and representation according to common standards-based IoT ontologies. Furthermore, based on this architecture, the paper introduces a first-of-a-kind visual development environment which eases the development of semantically interoperable applications in smart cit...

  3. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

    Scholl , Hans ,; Kubicek , Herbert; Cimander , Ralf

    2011-01-01

    Part 4: Architecture, Security and Interoperability; Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  4. Evaluating the Organizational Interoperability Maturity Level in ICT Research Center

    Directory of Open Access Journals (Sweden)

    Manijeh Haghighinasab

    2011-03-01

    Interoperability refers to the ability to provide services to, and to accept services from, other systems or devices. Collaborative enterprises face additional challenges to interoperate seamlessly within a networked organization. The major task here is to assess the maturity level of the interoperating organizations. For this purpose, maturity models for the enterprise were reviewed with respect to vendors' reliability and their advantages versus disadvantages. An interoperability maturity model, named EIMM, was deduced from the ATHENA project, a European Integrated Project, in 2005; this model was examined at the Iran Information and Communication Institute, a leading telecommunication organization. 115 questionnaires were distributed among the staff of four departments (Information Technology, Communication Technology, Security, and Strategic Studies) regarding six areas of concern: Enterprise Modeling, Business Strategy Process, Organization and Competences, Products and Services, Systems and Technology, and Legal Environment, Security and Trust, at five maturity levels: Performed, Modeled, Integrated, Interoperable and Optimizing. The findings showed different levels of maturity in this institute. To achieve the Interoperable level, appropriate practices are proposed for promotion to the higher levels.
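An assessment like the one above produces a level per area of concern, which must then be aggregated. The sketch below assumes one common convention, that the weakest area determines the overall level; that aggregation rule and the sample scores are illustrative assumptions, not data from the study.

```python
# Hedged sketch of aggregating per-area assessments into an overall
# EIMM-style maturity level. The rule "overall = minimum across areas" is
# one common convention, assumed here for illustration; the sample scores
# are invented, not the study's results.

LEVELS = ["Performed", "Modeled", "Integrated", "Interoperable", "Optimizing"]

def overall_maturity(area_levels):
    """area_levels: dict of area name -> level name; weakest area dominates."""
    indices = [LEVELS.index(level) for level in area_levels.values()]
    return LEVELS[min(indices)]

assessment = {                      # hypothetical per-area results
    "Enterprise Modeling": "Integrated",
    "Business Strategy Process": "Modeled",
    "Systems and Technology": "Interoperable",
}
```

Under this rule the hypothetical organization sits at "Modeled" overall, because one lagging area caps the whole enterprise.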

  5. Regulatory barriers blocking standardization of interoperability.

    Science.gov (United States)

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-07-12

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. Standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and the Continua Health Alliance, are striving for this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers in the way of the adoption of technical standards throughout the industry. These barriers have significantly impaired the motivation of consumer device vendors who wish to enter the personal health market, and the overall success of the personal health industry ecosystem. In this paper, we present the effect that these barriers have had on the health ecosystem. This requires immediate action from policy makers and other stakeholders. Current regulatory policy needs to be updated to reflect the reality and demands of the consumer health industry. Our hope is that this paper will draw wide consensus amongst its readers, policy makers, and other stakeholders.

  6. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base is translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  7. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  8. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  9. The advanced microgrid. Integration and interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United States); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition, which has evolved and is used in multiple references, is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  10. AliEn - EDG Interoperability in ALICE

    CERN Document Server

    Bagnasco, S; Buncic, P; Carminati, F; Cerello, P G; Saiz, P

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large-scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Storage Element and Computing Element), and act as interface nodes between the systems. An EDG Resource Broker is seen by the AliEn server as a single Computing Element, while the EDG storage is seen by AliEn as a single, large Storage Element; files produced in EDG sites are registered in both the EDG Replica Catalogue and the AliEn Data Catalogue, thus ensuring accessibility from both worlds. In fact, both registrations are required: the AliEn one is used for data management, the EDG one to guarantee the integrity and...

  11. Mixed nickel-gallium tellurides Ni{sub 3−x}GaTe{sub 2} as a matrix for incorporating magnetic cations: A Ni{sub 3−x}Fe{sub x}GaTe{sub 2} series

    Energy Technology Data Exchange (ETDEWEB)

    Kuznetsov, Alexey N., E-mail: alexei@inorg.chem.msu.ru [Department of Chemistry, Lomonosov Moscow State University, Leninskie Gory 1-3, GSP-1, 119991 Moscow (Russian Federation); N.S. Kurnakov Institute of General and Inorganic Chemistry, RAS, Leninsky pr. 31, GSP-1, 119991 Moscow (Russian Federation); Stroganova, Ekaterina A.; Zakharova, Elena Yu; Solopchenko, Alexander V.; Sobolev, Alexey V.; Presniakov, Igor A. [Department of Chemistry, Lomonosov Moscow State University, Leninskie Gory 1-3, GSP-1, 119991 Moscow (Russian Federation); Kirdyankin, Denis I.; Novotortsev, Vladimir M. [N.S. Kurnakov Institute of General and Inorganic Chemistry, RAS, Leninsky pr. 31, GSP-1, 119991 Moscow (Russian Federation)

    2017-06-15

    Using a high-temperature ampoule technique, a series of mixed nickel-iron-gallium metal-rich tellurides with layered structures, Ni{sub 3-x}Fe{sub x}GaTe{sub 2}, were prepared and characterized based on X-ray powder diffraction, energy-dispersive spectroscopy, and {sup 57}Fe Mössbauer spectroscopy data. These compounds may be regarded as the result of partial substitution of nickel by iron in the recently reported ternary Ni{sub 3-x}GaTe{sub 2} series, which is based on the NiAs/Ni{sub 2}In type of structure. The compositional boundary for the substitution was found to be at x~1. According to the Mössbauer spectroscopy data, the substitution is not statistical; with increasing x, iron atoms tend to preferentially occupy those nickel positions that are partially vacant in the initial ternary compound. Magnetic measurement data for the Ni{sub 3-x}Fe{sub x}GaTe{sub 2} series show a dramatic change in behavior, from temperature-independent paramagnetism of the initial matrix to low-temperature (~75 K) ferromagnetic ordering in Ni{sub 2}FeGaTe{sub 2}. - Graphical abstract: Ordered substitution of nickel by iron in the Ni{sub 3−x}GaTe{sub 2} series leading to ferromagnetic ordering. - Highlights: • A series of Ni{sub 3−x}Fe{sub x}GaTe{sub 2} compounds were synthesized. • They adopt the NiAs/Ni{sub 2}In type of structure with ordered iron distribution. • The distribution of iron was studied using {sup 57}Fe Mössbauer spectroscopy. • An increase in iron content leads to strong ferromagnetic coupling.

  12. Model and Interoperability using Meta Data Annotations

    Science.gov (United States)

    David, O.

    2011-12-01

    Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data-access approach. Here, programming-language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data-flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
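    The annotation-driven integration style described above can be sketched in a few lines. The sketch below is Python (OMS3 itself uses Java annotations); the decorator name, unit names and wiring logic are all invented for illustration, showing only how declarative metadata can replace explicit framework API calls.

```python
def component(inputs=(), outputs=()):
    """Attach declarative I/O metadata to a model component class,
    playing the role of OMS-style annotations (names are hypothetical)."""
    def wrap(cls):
        cls._inputs = dict(inputs)    # variable name -> unit
        cls._outputs = dict(outputs)
        return cls
    return wrap

@component(inputs={"precip": "mm/d"}, outputs={"runoff": "mm/d"})
class RunoffModel:
    def run(self, precip):
        return {"runoff": 0.4 * precip}   # toy process, not a real model

@component(inputs={"runoff": "mm/d"}, outputs={"flow": "m3/s"})
class Routing:
    def run(self, runoff):
        return {"flow": runoff * 0.011}

def assemble(*components):
    """Framework side: read the annotations to wire components together
    (and to generate documentation), instead of requiring API calls."""
    wiring = []
    for up, down in zip(components, components[1:]):
        shared = set(up._outputs) & set(down._inputs)
        wiring.append((type(up).__name__, type(down).__name__, sorted(shared)))
    return wiring

print(assemble(RunoffModel(), Routing()))  # [('RunoffModel', 'Routing', ['runoff'])]
```

    Because the metadata lives on the component itself, the same annotations can serve assembly, documentation and testing without the component depending on framework interfaces.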

  13. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, quality-controlled processed data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and management of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. Following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. These assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses.
This presentation will provide an overview of TEAM Engine, an introduction to how to test via the OGC testing web site and
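    The assertion style used by such compliance tests can be illustrated with a stdlib-only Python sketch. This is not CTL or TEAM Engine itself; the sample response and assertions are hypothetical, but they mirror the kinds of checks the abstract mentions (root element, attribute values, error codes).

```python
import xml.etree.ElementTree as ET

# Hypothetical OGC-style exception response; a real CITE test would fetch
# this over HTTP from a live endpoint.
response = """<?xml version="1.0"?>
<ServiceExceptionReport xmlns="http://www.opengis.net/ogc" version="1.2.0">
  <ServiceException code="InvalidParameterValue">unknown typeName</ServiceException>
</ServiceExceptionReport>"""

root = ET.fromstring(response)
ns = "{http://www.opengis.net/ogc}"

# Each named assertion checks one conformance requirement of the response.
assertions = {
    "root element is ServiceExceptionReport": root.tag == ns + "ServiceExceptionReport",
    "version attribute present": root.get("version") == "1.2.0",
    "exception carries an error code": root.find(ns + "ServiceException").get("code") is not None,
}
failed = [name for name, ok in assertions.items() if not ok]
print("PASS" if not failed else "FAIL: " + ", ".join(failed))  # PASS
```

    A real suite bundles hundreds of such assertions, adds Schematron rules for content checks, and reports each failure against the clause of the standard it violates.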

  14. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT / eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  15. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT / eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  16. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

    Full Text Available Interoperability remains a significant burden to the developers of Internet of Things (IoT) systems. This is because IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. Furthermore, due to the lack of worldwide accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status. Information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for semantic annotation of data from heterogeneous IoT devices is proposed to provide annotations for the data. The Resource Description Framework (RDF) is a Semantic Web framework that relates things using triples, making statements semantically meaningful. RDF-annotated patient data is thereby made semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used the Tableau, Gruff 6.2.0, and MySQL tools.
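    A toy illustration of the triple-based annotation idea, in pure Python rather than a real RDF library: patient data is stored as subject-predicate-object triples and retrieved by pattern matching, the way a SPARQL engine matches triple patterns. All identifiers, predicates and values below are invented.

```python
# Hypothetical patient/device triples, mimicking an RDF graph.
triples = {
    ("patient:42", "hasDevice", "device:bp-07"),
    ("device:bp-07", "deviceType", "BloodPressureMonitor"),
    ("device:bp-07", "reading", "128/82 mmHg"),
    ("patient:42", "attendingPhysician", "physician:9"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts like a SPARQL variable."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# Analogue of: SELECT ?d ?r WHERE { patient:42 hasDevice ?d . ?d reading ?r }
for _, _, device in match(s="patient:42", p="hasDevice"):
    for _, _, value in match(s=device, p="reading"):
        print(device, "->", value)   # device:bp-07 -> 128/82 mmHg
```

    In a real deployment the triples would carry full URIs and be stored in a triple store such as AllegroGraph (which Gruff fronts), with SPARQL handling the pattern matching.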

  17. Interoperable Cloud Networking for intelligent power supply; Interoperables Cloud Networking fuer intelligente Energieversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Invensys Operations Management, Foxboro, MA (United States)

    2010-09-15

    Intelligent power supply via a so-called Smart Grid will make it possible to control consumption through market-based pricing and signals for load reduction. This necessitates that both energy rates and energy information are distributed reliably and in real time to automation systems in domestic and other buildings and in industrial plants, over a wide geographic range and across the most varied grid infrastructures. Effective communication at this level of complexity requires computing and network resources that are normally available only in the data centers of large enterprises. Cloud computing technology, which is described here in some detail, has all the features needed to provide reliability, interoperability and efficiency for large-scale smart grid applications, at lower cost than traditional data centers. (orig.)

  18. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing both the basic and the extended features of the two technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields, especially Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all these problems becomes an important requirement. The Grid promotes and facilitates secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between computational grid and

  19. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

    Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes and support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes in a WSN's status inferred from metadata elements, which in turn leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of context, identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
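    The contextualising/bridging rule pattern can be sketched as follows. The metadata field names, thresholds and adaptation actions below are illustrative inventions, not the paper's actual rules; the sketch only shows the two-stage inference the abstract describes.

```python
def node_context(meta):
    """Contextualising rule: infer a node-level context from raw metadata
    elements (battery level, link quality). Thresholds are hypothetical."""
    if meta["battery_pct"] < 15:
        return "degraded"
    if meta["packet_loss"] > 0.3:
        return "unreliable-link"
    return "nominal"

def bridge(context):
    """Bridging rule: map an inferred context to a network-level
    adaptation that preserves interoperability."""
    return {
        "degraded": "lower sampling rate, elect new cluster head",
        "unreliable-link": "reroute via neighbour",
        "nominal": "no change",
    }[context]

meta = {"battery_pct": 9, "packet_loss": 0.02}
ctx = node_context(meta)
print(ctx, "->", bridge(ctx))  # degraded -> lower sampling rate, elect new cluster head
```

    The point of the separation is that contextualising rules depend only on metadata, while bridging rules depend only on contexts, so either side can evolve independently.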

  20. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine.

    Science.gov (United States)

    King, H Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2010-05-07

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.
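    The abstract does not give the ITP wire format, but the general shape of such a network data specification can be sketched with a fixed-layout datagram. The field layout below (timestamp, sequence number, 6-DOF pose) is entirely hypothetical and serves only to show how a master could serialize a pose update for a heterogeneous slave.

```python
import struct
import time

# Hypothetical fixed layout: network byte order, double timestamp,
# uint32 sequence number, six float32 pose values (x y z roll pitch yaw).
FMT = "!dI6f"

def pack_pose(seq, pose, t=None):
    """Serialize one pose update; pose is a 6-tuple of floats."""
    return struct.pack(FMT, t if t is not None else time.time(), seq, *pose)

def unpack_pose(datagram):
    t, seq, *pose = struct.unpack(FMT, datagram)
    return t, seq, tuple(pose)

msg = pack_pose(7, (0.1, 0.2, 0.3, 0.0, 0.0, 1.57), t=0.0)
t, seq, pose = unpack_pose(msg)
print(seq, len(msg))  # 7 36
```

    A shared, versioned layout like this is what lets a master built in one lab drive a slave built in another, regardless of each side's internal controller.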

  1. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Science.gov (United States)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability-profiling methodology that were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability-profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
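    The matching of a capability profile against application requirements can be sketched as follows. ISO 16100 expresses profiles in a standardized (XML-based) form; this Python sketch with invented unit names and field values only illustrates the comparison logic the paper describes.

```python
# Hypothetical capability profile of one software unit, following the
# paper's description: unit name, manufacturing functions, class properties.
profile = {
    "unit_name": "SchedulerUnit-A",
    "functions": {"job_sequencing", "resource_allocation"},
    "properties": {"time_model": "discrete", "interface": "ISO16100-CPP"},
}

# Hypothetical requirements of a manufacturing application, expressed
# in the same profile vocabulary.
requirement = {
    "functions": {"job_sequencing"},
    "properties": {"time_model": "discrete"},
}

def conforms(profile, requirement):
    """A unit conforms if it offers every required function and matches
    every required property value."""
    return (requirement["functions"] <= profile["functions"]
            and all(profile["properties"].get(k) == v
                    for k, v in requirement["properties"].items()))

print(conforms(profile, requirement))  # True
```

    Expressing both sides in the same profile vocabulary is what makes the match mechanical, and hence usable for automated software selection.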

  2. An Architecture for Semantically Interoperable Electronic Health Records.

    Science.gov (United States)

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they understand the underlying clinical meaning; therefore, shared meaning cannot simply be assumed or treated as given. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Moreover, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Finally, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  3. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcoming past communication barriers and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased across the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with their different partners or, in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are defined only once, and the morphisms that represent them are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  4. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, comprising the ISO/IEEE 11073 standard for interoperability of medical devices in the patient environment and the EN13606 standard for interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.

  5. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  6. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology and, in doing so, improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in EPR systems, research reports conflicting results on the use of structuring and standardization as measures of success. To elucidate this, the paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-folded benefit for interoperability.

  7. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years, with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now Community Group of the Open Grid Forum organizes and manages interoperation efforts among those production Grid infrastructures, with the goal of reaching a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  8. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources increase, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, both in their creation and in their management. The reuse of similar organizational structures is a necessary element in this context. The paper analyses experiences of KOS reuse and points out how the new standards bear on this aspect.

  9. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  10. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  11. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  12. ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.

    Science.gov (United States)

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2017-02-15

    ImageJ-MATLAB is a lightweight Java library facilitating bidirectional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Availability: freely available extension to ImageJ2 (http://imagej.net/Downloads); installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. Contact: eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
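    One core translation problem such a bridge must solve is converting between MATLAB's column-major matrices and row-major image buffers. This stdlib-only Python sketch (not the ImageJ-MATLAB API) shows the index arithmetic involved:

```python
def to_column_major(matrix):
    """Flatten a row-major nested list in column-major (MATLAB) order."""
    rows, cols = len(matrix), len(matrix[0])
    return [matrix[r][c] for c in range(cols) for r in range(rows)]

def from_column_major(flat, rows, cols):
    """Rebuild a row-major nested list from a column-major flat buffer."""
    return [[flat[c * rows + r] for c in range(cols)] for r in range(rows)]

img = [[1, 2, 3],
       [4, 5, 6]]            # 2x3 row-major "image"
flat = to_column_major(img)  # [1, 4, 2, 5, 3, 6]
assert from_column_major(flat, 2, 3) == img
print(flat)
```

    Getting this ordering wrong silently transposes or scrambles images, which is why a shared, standardized translation layer matters for round-tripping data between the two tools.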

  13. Special topic interoperability and EHR: Combining openEHR, SNOMED, IHE, and continua as approaches to interoperability on national ehealth

    DEFF Research Database (Denmark)

    Bestek, M.; Stanimirovi, D.

    2017-01-01

    into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. Methods: The paper represents an in-depth analysis regarding...... the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research...... could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field...

  14. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

    Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that effective interoperability between IT products and services is needed to build a truly digital society. The Digital Agenda can only be effective if all the elements and applications are interoperable and based on open standards and platforms. In this context, this article proposes a specific architecture for developing the Romanian National Interoperability Framework.

  15. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership". Principal Investigator: ... Reporting period: Sept 2016 - 20 Sept 2017. The project promotes efficiency through interoperable medical technologies; we played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  16. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely, the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods with the missing background information.

  17. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    For a website, achieving system interoperability is very important. The use of MySQL, SQL Server or Oracle databases is very common in website-based systems, but using such a database cannot guarantee that the interoperability of the system is achieved. Implementation is also quite difficult from the standpoint of data security. One solution for achieving the interoperability of a website-based system is...

  18. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  19. Datacube Interoperability, Encoding Independence, and Analytics

    Science.gov (United States)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistics and OLAP, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally fall under the concept of coverages, which encompasses regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable yet format-independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract's authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as its Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both the OGC and INSPIRE Reference Implementation. In the global EarthServer initiative, rasdaman database sizes exceed 250 TB today, heading for the Petabyte frontier within 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are the values?), a range set (what are the values?), and a range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled
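
    The coverage structure described here (domain set, range set, range type, plus a metadata "bag") can be sketched minimally. This is an illustrative Python model under assumed simplifications, not the normative CIS encoding.

```python
# Toy sketch of the coverage idea: a domain set (where the values sit),
# a range set (the values), a range type (what the values mean), and a
# free-form metadata "bag". All field contents are invented examples.
from dataclasses import dataclass, field

@dataclass
class Coverage:
    domain_set: list                               # grid points, e.g. (time, lat, lon)
    range_set: list                                # one value per domain point
    range_type: dict                               # semantics: quantity, unit
    metadata: dict = field(default_factory=dict)   # arbitrary metadata "bag"

    def value_at(self, point):
        """Look up the range value recorded for a domain point."""
        return self.range_set[self.domain_set.index(point)]

cube = Coverage(
    domain_set=[("2017-01-01", 0, 0), ("2017-01-01", 0, 1)],
    range_set=[281.4, 282.0],
    range_type={"quantity": "air_temperature", "unit": "K"},
)
```

    The point of the separation is that clients can reason about where, what, and meaning independently, which is what makes the model encoding-independent.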

  20. Matrix theory

    CERN Document Server

    Franklin, Joel N

    2003-01-01

    Mathematically rigorous introduction covers vector and matrix norms, the condition-number of a matrix, positive and irreducible matrices, much more. Only elementary algebra and calculus required. Includes problem-solving exercises. 1968 edition.

  1. Waveform Diversity and Design for Interoperating Radar Systems

    Science.gov (United States)

    2013-01-01

    Università di Pisa, Dipartimento di Ingegneria dell'Informazione (Elettronica, Informatica, Telecomunicazioni), Via Girolamo Caruso 16, Pisa, Italy 56122. Performing organization for the report "Waveform Diversity and Design for Interoperating Radar Systems".

  2. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone´s or any organization´s everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  3. Look who's talking. A guide to interoperability groups and resources.

    Science.gov (United States)

    2011-06-01

    There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.

  4. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve McKeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years has seen the rise of very cheap digital storage both on and off site. With the advent of the 'Internet of Things' people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  5. A development framework for semantically interoperable health information systems.

    Science.gov (United States)

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  6. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    Science.gov (United States)

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  7. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations for the est...

  8. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Many enterprises are still faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  9. Ontologies for interaction : enabling serendipitous interoperability in smart environments

    NARCIS (Netherlands)

    Niezen, G.

    2012-01-01

    The thesis describes the design and development of an ontology and software framework to support user interaction in ubiquitous computing scenarios. The key goal of ubiquitous computing is "serendipitous interoperability", where devices that were not necessarily designed to work together should be

  10. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  11. Enterprise interoperability with SOA: a survey of service composition approaches

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; Goncalves da Silva, Eduardo; van Sinderen, Marten J.; Quartel, Dick; Ferreira Pires, Luis

    Service-oriented architecture (SOA) claims to facilitate the construction of flexible and loosely coupled business applications, and therefore is seen as an enabling factor for enterprise interoperability. The concept of service, which is central to SOA, is very convenient to address the matching of

  12. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Science.gov (United States)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  13. The role of markup for enabling interoperability in health informatics.

    Science.gov (United States)

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years has seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  14. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  15. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  16. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

    Title 47 (Telecommunication), Section 0.192 (2010-10-01 edition): Emergency Response Interoperability Center. Federal Communications Commission, General Commission Organization ..., industry representatives, and service providers. [75 FR 28207, May 20, 2010]

  17. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between the ICD-11 (International Classification of Diseases-11th revision Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification, characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10 on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed by the compositional grammar of SCT, together with queries on axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.

  18. Interoperability And Value Added To Earth Observation Data

    Science.gov (United States)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
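
    The request pattern behind services such as WMS can be sketched concretely: a map is fetched through a plain key/value URL. The endpoint and layer name below are hypothetical; the parameter names are the standard WMS 1.3.0 GetMap ones.

```python
# Sketch of building an OGC WMS 1.3.0 GetMap request URL.
# The endpoint and layer are invented for illustration.
from urllib.parse import urlencode

def getmap_url(endpoint, layer, bbox, width, height):
    """Assemble a GetMap URL from the standard WMS 1.3.0 parameters."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "flood_extent",
                 (43.0, 1.0, 44.0, 2.0), 512, 512)
```

    Because every conforming server understands the same parameters, a client like mapshup can overlay maps from unrelated providers without custom adapters, which is the interoperability gain the abstract describes.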

  19. Interoperable mesh and geometry tools for advanced petascale simulations

    International Nuclear Information System (INIS)

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications

  20. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Science.gov (United States)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have discussed that there are four levels in data interoperability issues: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches have been significantly studied addressing schematic and semantic interoperability issues of geodata in the last decade. There are different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers are maintaining their identified local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared with the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location
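
    The semantic-mediation step described above (mapping local ontology terms to a common ontology) can be reduced to a toy sketch. All terms and provider names below are invented for illustration.

```python
# Toy semantic mediation: two local application ontologies name the same
# concept differently; a mapping to a common ontology reconciles them.
# Providers and vocabulary are hypothetical examples.
LOCAL_TO_COMMON = {
    "survey_a": {"qtz": "Quartz", "fsp": "Feldspar"},
    "survey_b": {"quarz": "Quartz", "feldspat": "Feldspar"},
}

def mediate(source, term):
    """Translate a local term into its common-ontology concept, or None."""
    return LOCAL_TO_COMMON.get(source, {}).get(term)

# Records from both providers now align on the same common concept:
a = mediate("survey_a", "qtz")
b = mediate("survey_b", "quarz")
```

    Real mediation works over formal ontology axioms rather than flat dictionaries, and, as the abstract argues, a purely term-level mapping like this one is exactly what loses the local pragmatic context.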

  1. Building Future Transatlantic Interoperability Around a Robust NATO Response Force

    Science.gov (United States)

    2012-10-01

    than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming...and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force-East at Kogalniceanu...operations. Increase U.S. and Allied exchange students at national and NATO military schools. Austerity measures may eventually affect the investment

  2. Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    Science.gov (United States)

    Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh

    2014-01-01

    The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a certain suite of related industry standards for capabilities of individual benefit and interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.

  3. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important up-to-date computing concept for NEs, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  4. Interoperability between Fingerprint Biometric Systems: An Empirical Study

    OpenAIRE

    Gashi, I.; Mason, S.; Lugini, L.; Marasco, E.; Cukic, B.

    2014-01-01

    Fingerprints are likely the most widely used biometric in commercial as well as law enforcement applications. With the expected rapid growth of fingerprint authentication in mobile devices, their importance justifies increased demands for dependability. An increasing number of new sensors, applications and a diverse user population also intensify concerns about the interoperability in fingerprint authentication. In most applications, fingerprints captured for user enrollment with one device may...

  5. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  6. Smart hospitality—Interconnectivity and interoperability towards an ecosystem

    OpenAIRE

    Buhalis, Dimitrios; Leung, Rosanna

    2018-01-01

    The Internet and cloud computing changed the way businesses operate. Standardised web-based applications simplify data interchange, which allows internal applications and business partners' systems to become interconnected and interoperable. This study conceptualises the smart and agile hospitality enterprises of the future, and proposes a smart hospitality ecosystem that adds value to all stakeholders. Internal data from applications among all stakeholders, consolidated with external environment ...

  7. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    OpenAIRE

    Dassisti, Michele; Panetto, Hervé; Tursi, Angela

    2006-01-01

    International audience; The "Babel tower effect", induced by the heterogeneity of applications available in the operation of enterprises, leads to a consistent lack of "exchangeability" and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ma...

  8. Secure and interoperable communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and adequately to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, interagency and cross-border operations with emphasis on interoperability between users in PMR and LTE.

  9. Enabling IoT ecosystems through platform interoperability

    OpenAIRE

    Bröring, Arne; Schmid, Stefan; Schindhelm, Corina-Kim; Khelil, Abdelmajid; Kabisch, Sebastian; Kramer, Denis; Le Phuoc, Danh; Mitic, Jelena; Anicic, Darko; Teniente López, Ernest

    2017-01-01

    Today, the Internet of Things (IoT) comprises vertically oriented platforms for things. Developers who want to use them need to negotiate access individually and adapt to the platform-specific API and information models. Having to perform these actions for each platform often outweighs the possible gains from adapting applications to multiple platforms. This fragmentation of the IoT and the missing interoperability result in high entry barriers for developers and prevent the emergence of broa...

  10. The Internet of Things: New Interoperability, Management and Security Challenges

    OpenAIRE

    Elkhodr, Mahmoud; Shahrestani, Seyed; Cheung, Hon

    2016-01-01

    The Internet of Things (IoT) brings connectivity to almost every object found in the physical space. It extends connectivity to everyday objects. From connected fridges, cars and cities, the IoT creates opportunities in numerous domains. However, this increase in connectivity creates many prominent challenges. This paper provides a survey of some of the major issues challenging the widespread adoption of the IoT. Particularly, it focuses on the interoperability, management, securi...

  11. Variation in interoperability across clinical laboratories nationwide.

    Science.gov (United States)

    Patel, Vaishali; McNamara, Lauren; Dullabh, Prashila; Sawchuk, Megan E; Swain, Matthew

    2017-12-01

    To characterize nationwide variation and factors associated with clinical laboratories': (1) capabilities to send structured test results electronically to ordering practitioners' EHR systems; and (2) their levels of exchange activity, as measured by whether they sent more than three-quarters of their test results as structured data to ordering practitioners' EHR systems. A national survey of all independent and hospital laboratories was conducted in 2013. Using an analytic weighted sample of 9382 clinical laboratories, a series of logistic regression analyses were conducted to identify organizational and area characteristics associated with clinical laboratories' exchange capability and activity. Hospital-based clinical laboratories (71%) and larger clinical laboratories (80%) had significantly higher levels of capability compared to independent (58%) and smaller laboratories (48%), respectively, though all had similar levels of exchange activity, with 30% of clinical laboratories sending 75% or more of their test results electronically. In multivariate analyses, hospital and the largest laboratories had 1.87 and 4.40 higher odds, respectively, of possessing the capability to send results electronically compared to independent laboratories (p…). Laboratories located in areas with a higher share of potential exchange partners had a small but significantly greater capability to send results electronically and higher levels of exchange activity (p…). Clinical laboratories' capability to exchange varied by size and type; however, all clinical laboratories had relatively low levels of exchange activity. The role of exchange partners potentially played a small but significant role in driving exchange capability and activity. Published by Elsevier B.V.
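
    The reported odds ratios (e.g. 1.87 for hospital laboratories) come directly from the fitted logistic-regression coefficients. The arithmetic linking the two can be sketched as follows; the coefficient value is implied by the reported ratio, not taken from the study's model.

```python
# Sketch of the odds-ratio arithmetic behind a logistic regression:
# a coefficient beta corresponds to an odds ratio of exp(beta) for a
# one-unit change in the predictor. Values here are illustrative.
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit change in a logistic-regression predictor."""
    return math.exp(beta)

def probability(logit):
    """Convert a linear predictor (log-odds) to a probability."""
    return 1 / (1 + math.exp(-logit))

# Coefficient implied by the reported odds ratio of 1.87 for hospital labs:
beta_hospital = math.log(1.87)
```

    Weighting the sample (as the survey does with its 9382-laboratory analytic weights) changes the estimation but not this interpretation of the coefficients.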

  12. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  13. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  14. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    Full Text Available An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage conditions such as heart disease and diabetes by sending health parameters like blood pressure, heart rate, glucose levels, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here, different medical devices are connected to the patient for monitoring, and each kind of device is manufactured by a different vendor. Each device's information and communication needs require a different installation and network design, which causes design complexities and network overheads when moving patients for diagnostic examinations. This problem can be solved by interoperability among devices. ISO/IEEE 11073 is an international standard which provides an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.

  15. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  16. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems and the provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design), and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering (CIME). · The development of a STEP based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems and for the extension of the inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work: · The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies. · The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface...

  17. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standards, models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  18. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  19. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    Science.gov (United States)

    Schaap, Dick M. A.; Glaves, Helen

    2016-04-01

    Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these efforts are resulting in standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. To this end it facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops, and fostering the development of common standards and interoperability solutions. These are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and

  20. IEEE 1547 and 2030 Standards for Distributed Energy Resources Interconnection and Interoperability with the Electricity Grid

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.

    2014-12-01

    Public-private partnerships have been a mainstay of the U.S. Department of Energy and the National Renewable Energy Laboratory (DOE/NREL) approach to research and development. These partnerships also include technology development that enables grid modernization and distributed energy resources (DER) advancement, especially renewable energy systems integration with the grid. Through DOE/NREL and industry support of Institute of Electrical and Electronics Engineers (IEEE) standards development, the IEEE 1547 series of standards has helped shape the way utilities and other businesses have worked together to realize increasing amounts of DER interconnected with the distribution grid. And more recently, the IEEE 2030 series of standards is helping to further realize greater implementation of communications and information technologies that provide interoperability solutions for enhanced integration of DER and loads with the grid. For these standards development partnerships, for approximately $1 of federal funding, industry partnering has contributed $5. In this report, the status update is presented for the American National Standards IEEE 1547 and IEEE 2030 series of standards. A short synopsis of the history of the 1547 standards is first presented, then the current status and future direction of the ongoing standards development activities are discussed.

  1. The Exopolysaccharide Matrix

    Science.gov (United States)

    Koo, H.; Falsetta, M.L.; Klein, M.I.

    2013-01-01

    Many infectious diseases in humans are caused or exacerbated by biofilms. Dental caries is a prime example of a biofilm-dependent disease, resulting from interactions of microorganisms, host factors, and diet (sugars), which modulate the dynamic formation of biofilms on tooth surfaces. All biofilms have a microbial-derived extracellular matrix as an essential constituent. The exopolysaccharides formed through interactions between sucrose- (and starch-) and Streptococcus mutans-derived exoenzymes present in the pellicle and on microbial surfaces (including non-mutans) provide binding sites for cariogenic and other organisms. The polymers formed in situ enmesh the microorganisms while forming a matrix facilitating the assembly of three-dimensional (3D) multicellular structures that encompass a series of microenvironments and are firmly attached to teeth. The metabolic activity of microbes embedded in this exopolysaccharide-rich and diffusion-limiting matrix leads to acidification of the milieu and, eventually, acid-dissolution of enamel. Here, we discuss recent advances concerning spatio-temporal development of the exopolysaccharide matrix and its essential role in the pathogenesis of dental caries. We focus on how the matrix serves as a 3D scaffold for biofilm assembly while creating spatial heterogeneities and low-pH microenvironments/niches. Further understanding on how the matrix modulates microbial activity and virulence expression could lead to new approaches to control cariogenic biofilms. PMID:24045647

  2. Latest developments for the IAGOS database: Interoperability and metadata

    Science.gov (United States)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by open access policy based on the submission of research requests which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is in continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and the real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We also are implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface) which combines model outputs with in situ data for
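The IAGOS record above describes standardizing metadata against the NetCDF-CF conventions. As a minimal, illustrative sketch of what such a compliance check can look like (not IAGOS code; the required-attribute list follows the CF convention's recommended global attributes, and the sample values are invented):

```python
# Required CF-convention global attributes (per the NetCDF-CF recommendations).
REQUIRED_CF_ATTRS = {"Conventions", "title", "institution", "source", "history"}

def missing_cf_attributes(global_attrs: dict) -> set:
    """Return the required CF global attributes absent from a file's metadata."""
    return REQUIRED_CF_ATTRS - global_attrs.keys()

# Hypothetical metadata for a single flight's time series.
sample = {
    "Conventions": "CF-1.6",
    "title": "IAGOS time series (example)",
    "institution": "Example institute",
}
print(sorted(missing_cf_attributes(sample)))  # ['history', 'source']
```

In practice the attribute dictionary would be read from the NetCDF file itself; the pure-dict form keeps the sketch self-contained.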

  3. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Amongst others, these include: DCAT, an RDF vocabulary designed to facilitate interoperability between Web data catalogs; and CKAN, a data management system for data distribution, particularly used by

  4. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current state IS models. Building an interoperable IS, i. e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. To link IHE concepts to 3LGM² concepts within the 3LGM² tool. To describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models. To describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors, and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i. e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE

  5. Matrix calculus

    CERN Document Server

    Bodewig, E

    1959-01-01

    Matrix Calculus, Second Revised and Enlarged Edition focuses on systematic calculation with the building blocks of a matrix and rows and columns, shunning the use of individual elements. The publication first offers information on vectors, matrices, further applications, measures of the magnitude of a matrix, and forms. The text then examines eigenvalues and exact solutions, including the characteristic equation, eigenrows, extremum properties of the eigenvalues, bounds for the eigenvalues, elementary divisors, and bounds for the determinant. The text ponders on approximate solutions, as well

  6. 2016 MATRIX annals

    CERN Document Server

    Praeger, Cheryl; Tao, Terence

    2018-01-01

    MATRIX is Australia’s international, residential mathematical research institute. It facilitates new collaborations and mathematical advances through intensive residential research programs, each lasting 1-4 weeks. This book is a scientific record of the five programs held at MATRIX in its first year, 2016: Higher Structures in Geometry and Physics (Chapters 1-5 and 18-21); Winter of Disconnectedness (Chapter 6 and 22-26); Approximation and Optimisation (Chapters 7-8); Refining C*-Algebraic Invariants for Dynamics using KK-theory (Chapters 9-13); Interactions between Topological Recursion, Modularity, Quantum Invariants and Low-dimensional Topology (Chapters 14-17 and 27). The MATRIX Scientific Committee selected these programs based on their scientific excellence and the participation rate of high-profile international participants. Each program included ample unstructured time to encourage collaborative research; some of the longer programs also included an embedded conference or lecture series. The artic...

  7. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    Generic rights expression language allowing interoperability across different computing environments including resource usage of different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, including areas free of standards that necessitate choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.

  8. Combinatorial matrix theory

    CERN Document Server

    Mitjana, Margarida

    2018-01-01

    This book contains the notes of the lectures delivered at an Advanced Course on Combinatorial Matrix Theory held at Centre de Recerca Matemàtica (CRM) in Barcelona. These notes correspond to five series of lectures. The first series is dedicated to the study of several matrix classes defined combinatorially, and was delivered by Richard A. Brualdi. The second one, given by Pauline van den Driessche, is concerned with the study of spectral properties of matrices with a given sign pattern. Dragan Stevanović delivered the third one, devoted to describing the spectral radius of a graph as a tool to provide bounds of parameters related with properties of a graph. The fourth lecture was delivered by Stephen Kirkland and is dedicated to the applications of the Group Inverse of the Laplacian matrix. The last one, given by Ángeles Carmona, focuses on boundary value problems on finite networks with special in-depth on the M-matrix inverse problem.

  9. Interoperability after deployment: persistent challenges and regional strategies in Denmark.

    Science.gov (United States)

    Kierkegaard, Patrick

    2015-04-01

    The European Union has identified Denmark as one of the countries who have the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between their public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities concerning the future of eHealth implementations in hospitals took place, which granted the Danish regions the full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  10. Fourier series

    CERN Document Server

    Tolstov, Georgi P

    1962-01-01

    Richard A. Silverman's series of translations of outstanding Russian textbooks and monographs is well-known to people in the fields of mathematics, physics, and engineering. The present book is another excellent text from this series, a valuable addition to the English-language literature on Fourier series.This edition is organized into nine well-defined chapters: Trigonometric Fourier Series, Orthogonal Systems, Convergence of Trigonometric Fourier Series, Trigonometric Series with Decreasing Coefficients, Operations on Fourier Series, Summation of Trigonometric Fourier Series, Double Fourie

  11. UMTS network planning, optimization, and inter-operation with GSM

    CERN Document Server

    Rahnema, Moe

    2008-01-01

    UMTS Network Planning, Optimization, and Inter-Operation with GSM is an accessible, one-stop reference to help engineers effectively reduce the time and costs involved in UMTS deployment and optimization. Rahnema includes detailed coverage, from both a theoretical and practical perspective, of the planning and optimization aspects of UMTS, and a number of other new techniques to help operators get the most out of their networks. Provides an end-to-end perspective, from network design to optimization. Incorporates the hands-on experiences of numerous researchers. Single...

  12. Creating XML/PHP Interface for BAN Interoperability.

    Science.gov (United States)

    Fragkos, Vasileios; Katzis, Konstantinos; Despotou, Georgios

    2017-01-01

    Recent advances in medical and electronic technologies have introduced the use of Body Area Networks as a part of e-health, for constant and accurate monitoring of patients and the transmission as well as processing of the data to develop a holistic Electronic Health Record. The rising global population, different BAN manufacturers and a variety of medical systems pose the issue of interoperability between BANs and systems as well as the proper way to propagate medical data in an organized and efficient manner. In this paper, we describe BANs and propose the use of certain web technologies to address this issue.
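The BAN record above proposes XML as the interchange format between heterogeneous body area networks and health record systems. A minimal sketch of such an interface, using only Python's standard library (the element and attribute names here are invented for illustration, not taken from the paper):

```python
import xml.etree.ElementTree as ET

def vitals_to_xml(patient_id: str, vitals: dict) -> str:
    """Serialize one BAN reading (name -> value) to an XML string."""
    root = ET.Element("banReading", attrib={"patient": patient_id})
    for name, value in vitals.items():
        ET.SubElement(root, "vital", attrib={"name": name}).text = str(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_vitals(doc: str) -> dict:
    """Parse a reading back into a name -> value mapping (values as text)."""
    root = ET.fromstring(doc)
    return {v.get("name"): v.text for v in root.iter("vital")}

doc = vitals_to_xml("p-001", {"heart_rate": 72, "spo2": 98})
print(xml_to_vitals(doc))  # {'heart_rate': '72', 'spo2': '98'}
```

Because the payload is plain XML, a receiving system from a different vendor only needs to agree on the element vocabulary, which is exactly the interoperability point the paper makes.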

  13. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    elsewhere. To add a further layer of complexity, there are also global initiatives providing marine data infrastructures (e.g. IOC-IODE, POGO) as well as those with a wider remit that includes environmental data (e.g. GEOSS, Copernicus). Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems, and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures, along with other relevant experts, in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures in order to establish interoperability between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  14. Summability of alterations of convergent series

    Directory of Open Access Journals (Sweden)

    T. A. Keagy

    1981-01-01

    Full Text Available The effect of splitting, rearrangement, and grouping series alterations on the summability of a convergent series by ℓ−ℓ and cs−cs matrix methods is studied. Conditions are determined that guarantee the existence of alterations that are transformed into divergent series and into series with preassigned sums.
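The alterations studied in this record have a classical antecedent worth recalling. As a standard textbook fact (Riemann's rearrangement theorem, not a result of this paper), a rearrangement of a conditionally convergent series can attain a different preassigned sum:

```latex
% The alternating harmonic series converges to ln 2 ...
\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = \ln 2,
% ... yet rearranging it to take two positive terms for each negative one yields
1 + \tfrac{1}{3} - \tfrac{1}{2} + \tfrac{1}{5} + \tfrac{1}{7} - \tfrac{1}{4} + \cdots
  = \tfrac{3}{2}\ln 2 .
```

The paper's contribution is to ask the analogous question for matrix summability methods (ℓ−ℓ and cs−cs) rather than ordinary convergence.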

  15. Matrix thermalization

    International Nuclear Information System (INIS)

    Craps, Ben; Evnin, Oleg; Nguyen, Kévin

    2017-01-01

    Matrix quantum mechanics offers an attractive environment for discussing gravitational holography, in which both sides of the holographic duality are well-defined. Similarly to higher-dimensional implementations of holography, collapsing shell solutions in the gravitational bulk correspond in this setting to thermalization processes in the dual quantum mechanical theory. We construct an explicit, fully nonlinear supergravity solution describing a generic collapsing dilaton shell, specify the holographic renormalization prescriptions necessary for computing the relevant boundary observables, and apply them to evaluating thermalizing two-point correlation functions in the dual matrix theory.

  16. Matrix thermalization

    Science.gov (United States)

    Craps, Ben; Evnin, Oleg; Nguyen, Kévin

    2017-02-01

    Matrix quantum mechanics offers an attractive environment for discussing gravitational holography, in which both sides of the holographic duality are well-defined. Similarly to higher-dimensional implementations of holography, collapsing shell solutions in the gravitational bulk correspond in this setting to thermalization processes in the dual quantum mechanical theory. We construct an explicit, fully nonlinear supergravity solution describing a generic collapsing dilaton shell, specify the holographic renormalization prescriptions necessary for computing the relevant boundary observables, and apply them to evaluating thermalizing two-point correlation functions in the dual matrix theory.

  17. Matrix thermalization

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Theoretische Natuurkunde, Vrije Universiteit Brussel (VUB), and International Solvay Institutes, Pleinlaan 2, B-1050 Brussels (Belgium); Evnin, Oleg [Department of Physics, Faculty of Science, Chulalongkorn University, Thanon Phayathai, Pathumwan, Bangkok 10330 (Thailand); Theoretische Natuurkunde, Vrije Universiteit Brussel (VUB), and International Solvay Institutes, Pleinlaan 2, B-1050 Brussels (Belgium); Nguyen, Kévin [Theoretische Natuurkunde, Vrije Universiteit Brussel (VUB), and International Solvay Institutes, Pleinlaan 2, B-1050 Brussels (Belgium)

    2017-02-08

    Matrix quantum mechanics offers an attractive environment for discussing gravitational holography, in which both sides of the holographic duality are well-defined. Similarly to higher-dimensional implementations of holography, collapsing shell solutions in the gravitational bulk correspond in this setting to thermalization processes in the dual quantum mechanical theory. We construct an explicit, fully nonlinear supergravity solution describing a generic collapsing dilaton shell, specify the holographic renormalization prescriptions necessary for computing the relevant boundary observables, and apply them to evaluating thermalizing two-point correlation functions in the dual matrix theory.

  18. Sociotechnical Challenges of Developing an Interoperable Personal Health Record

    Science.gov (United States)

    Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.

    2011-01-01

Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, gathered over a 28-week period of developing and adapting a vendor-based PHR at Lucile Packard Children’s Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during data collection that were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between the hospital and the software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373

  19. Interoperability prototype between hospitals and general practitioners in Switzerland.

    Science.gov (United States)

    Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar

    2010-01-01

Interoperability in data exchange has the potential to improve care processes and decrease the costs of the health care system. Many countries have eHealth initiatives in preparation or already implemented; in this area, Switzerland has yet to catch up. Its health system is fragmented because of the federated nature of the cantons, which makes it more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French-speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, medium-sized hospitals and general practitioners in particular were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with a high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging release letters reveals small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high, and it is now important to show that interoperability can work in practice.

  20. Grid Interoperation with ARC middleware for the CMS experiment

    International Nuclear Information System (INIS)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva; Field, Laurence; Qing, Di; Frey, Jaime; Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti

    2010-01-01

The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This makes it possible to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  1. Grid Interoperation with ARC middleware for the CMS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This makes it possible to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  2. Grid Interoperation with ARC Middleware for the CMS Experiment

    CERN Document Server

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This makes it possible to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developi...

  3. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
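As a concrete illustration of the OGC services mentioned above, the sketch below builds a WMS 1.3.0 GetMap request URL of the kind such mapping services accept. The endpoint and layer name are hypothetical placeholders, not a real planetary service.

```python
# Minimal sketch: composing an OGC WMS 1.3.0 GetMap request URL.
# The endpoint "https://example.org/wms" and layer "mars_mola_shade"
# are invented placeholders for illustration only.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512,
                   crs="CRS:84", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap query string for a single layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "mars_mola_shade",
                     (-180, -90, 180, 90))
```

Fetching the resulting URL from a conforming server would return a rendered map image; the same parameter set works unchanged against any WMS-compliant endpoint, which is the interoperability point made above.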

  4. PyMOOSE: interoperable scripting in Python for MOOSE

    Directory of Open Access Journals (Sweden)

    Subhasis Ray

    2008-12-01

Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.

  5. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
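To make the interchange-format idea concrete, the sketch below assembles a simplified BioC-style XML fragment for a single annotated passage using only the standard library. Element names follow the BioC convention (collection/document/passage/annotation), but the field details and the identifier are simplified, invented examples rather than a conformant BioC document.

```python
# Simplified BioC-style XML for one annotated passage (illustration only;
# not a complete, schema-valid BioC document).
import xml.etree.ElementTree as ET

collection = ET.Element("collection")
doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "doc-001"          # hypothetical identifier
passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "Aspirin inhibits cyclooxygenase."

# One named-entity annotation over "Aspirin" (characters 0-6).
ann = ET.SubElement(passage, "annotation", id="T1")
ET.SubElement(ann, "infon", key="type").text = "chemical"
ET.SubElement(ann, "location", offset="0", length="7")
ET.SubElement(ann, "text").text = "Aspirin"

xml_str = ET.tostring(collection, encoding="unicode")
```

A Web service advertising BioC input/output would consume and produce documents shaped like this, which is what allows independently developed analytics to be chained into one pipeline.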

  6. Designing for Change: Interoperability in a scaling and adapting environment

    Science.gov (United States)

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  7. Language interoperability for high-performance parallel scientific components

    International Nuclear Information System (INIS)

    Elliot, N; Kohn, S; Smolinski, B

    1999-01-01

With the increasing complexity and interdisciplinary nature of scientific applications, code reuse is becoming increasingly important in scientific computing. One method for facilitating code reuse is the use of component technologies, which have been used widely in industry. However, components have only recently worked their way into scientific computing. Language interoperability is an important underlying technology for these component architectures. In this paper, we present an approach to language interoperability for a high-performance parallel component architecture being developed by the Common Component Architecture (CCA) group. Our approach is based on Interface Definition Language (IDL) techniques. We have developed a Scientific Interface Definition Language (SIDL), as well as bindings to C and Fortran. We have also developed a SIDL compiler and run-time library support for reference counting, reflection, object management, and exception handling (Babel). Results from using Babel to call a standard numerical solver library (written in C) from C and Fortran show that the cost of using Babel is minimal, whereas the savings in development time and the benefits of object-oriented development support for C and Fortran far outweigh the costs.
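The run-time support mentioned above includes reference counting across language boundaries. The sketch below shows, in Python, the general shape of a reference-counted proxy that IDL-based tools generate for each binding; the method names follow common IDL conventions and the class is a hypothetical stand-in, not Babel's actual generated code.

```python
# Hypothetical sketch of an IDL-style reference-counted proxy object.
# Not Babel's real generated code; illustrates the lifecycle convention only.
class SIDLProxy:
    """Language-neutral handle with explicit reference counting."""

    def __init__(self, impl):
        self._impl = impl          # stand-in for a native object handle
        self._refcount = 1         # creator holds the first reference

    def addRef(self):
        self._refcount += 1

    def deleteRef(self):
        self._refcount -= 1
        if self._refcount == 0:
            self._impl = None      # last reference gone: release resource

solver = SIDLProxy(impl="native-solver-handle")
solver.addRef()      # a second component now shares the object
solver.deleteRef()   # that component releases it
solver.deleteRef()   # original owner releases; native handle is freed
```

The point of the convention is that components written in different languages can share one object without either side knowing which language allocated it.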

  8. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

Full Text Available High quality and comfortable online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes, as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in E-Government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers, etc.

  9. Enabling Interoperable and Selective Data Sharing among Social Networking Sites

    Science.gov (United States)

    Shin, Dongwan; Lopes, Rodrigo

With the widespread use of social networking (SN) sites and even the introduction of a social component in non-social-oriented services, there is a growing concern over user privacy in general, and over how to handle and share user profiles across SN sites in particular. Although there have been several proprietary or open source-based approaches to unifying the creation of third party applications, the availability and retrieval of user profile information are still limited to the site where the third party application is run, mostly devoid of support for data interoperability. In this paper we propose an approach to enabling interoperable and selective data sharing among SN sites. To support selective data sharing, we discuss an authenticated dictionary (ADT)-based credential which enables a user to share only a subset of her information certified by external SN sites with applications running on an SN site. For interoperable data sharing, we propose an extension to the OpenSocial API so that it can provide an open source-based framework for allowing the ADT-based credential to be used seamlessly among different SN sites.
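The selective-disclosure idea behind such a credential can be sketched with per-field digests: the issuing site certifies a hash of each profile field, and the user later reveals only chosen fields, which a verifier re-hashes against the certified values. This is an illustrative simplification (signatures omitted, plain salted hashes instead of an authenticated dictionary), not the paper's exact scheme.

```python
# Illustrative selective-disclosure sketch using salted per-field digests.
# Simplified: a real scheme would sign the digest set and use an
# authenticated dictionary rather than a flat hash list.
import hashlib

def field_digest(key, value, salt):
    return hashlib.sha256(f"{salt}|{key}|{value}".encode()).hexdigest()

profile = {"name": "Alice", "email": "alice@example.org", "age": "30"}
salts = {k: f"salt-{i}" for i, k in enumerate(profile)}  # per-field salts

# Digests certified by the issuing SN site (signature omitted here):
certified = {k: field_digest(k, v, salts[k]) for k, v in profile.items()}

# The user discloses only "name"; the verifier re-hashes and compares.
disclosed = {"name": ("Alice", salts["name"])}
ok = all(field_digest(k, v, s) == certified[k]
         for k, (v, s) in disclosed.items())
```

Because each field is salted and hashed independently, revealing "name" leaks nothing about the undisclosed "email" and "age" fields, yet the verifier can still check it against the site-certified digests.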

  10. Interoperability Assets for Patient Summary Components: A Gap Analysis.

    Science.gov (United States)

    Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine

    2018-01-01

The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive patient summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.

  11. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

A variety of simulation tools, including optical design and analysis, have benefited from many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieving an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design, working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  12. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    R. Roller

    2015-08-01

Full Text Available Flooding represents a permanent risk to the Netherlands in general and to its power supply in particular. Data sharing is essential within this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
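The core Linked Data idea in this proposal is that facts are expressed as subject-predicate-object triples, which independent stakeholders can merge and query without first agreeing on a shared database schema. A minimal sketch, with invented example URIs rather than any vocabulary from the study:

```python
# Toy triple store illustrating the Linked Data pattern; all "ex:" terms
# are hypothetical examples, not the paper's actual vocabulary.
triples = {
    ("ex:Substation7", "ex:locatedIn", "ex:FloodZoneA"),
    ("ex:Substation7", "ex:supplies", "ex:Hospital3"),
    ("ex:FloodZoneA", "ex:alertLevel", "high"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Which assets does the substation in the flood zone supply?
affected = query(s="ex:Substation7", p="ex:supplies")
```

Because a grid operator's triples and an emergency service's triples use shared URIs rather than private column names, merging the two sets is a set union, which is the interoperability gain claimed above.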

  13. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic-level, and application-level interoperability. We define these types of interoperability and focus on semantic-level interoperability, the type most directly enabled by an information model.
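The vocabulary-versus-ontology distinction drawn above can be made concrete in a few lines: a controlled vocabulary only fixes the set of terms, while an information model additionally records relationships between terms that software can traverse. All terms and relations below are invented illustrations, not actual PDS4 classes.

```python
# Toy contrast: controlled vocabulary (terms only) vs. information
# model/ontology (terms plus relations). Terms are invented examples.
vocabulary = {"Product_Observational", "Instrument", "Target"}

ontology = {
    "Product_Observational": {"has_component": ["Instrument", "Target"]},
    "Instrument": {"is_a": ["Physical_Object"]},
    "Target": {"is_a": ["Physical_Object"]},
}

def related(term):
    """Terms reachable from `term` via any ontology relation."""
    return {t for targets in ontology.get(term, {}).values()
            for t in targets}

# A vocabulary can only answer "is this a valid term?"; the ontology can
# also answer "what does this term relate to?".
valid = "Instrument" in vocabulary
neighbours = related("Product_Observational")
```

That extra relational context is what the abstract means by semantic-level interoperability: two systems sharing the ontology agree not just on spellings but on how the concepts fit together.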

  14. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    Science.gov (United States)

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  15. Enabling interoperability-as-a-service for connected IoT infrastructures and Smart Objects

    DEFF Research Database (Denmark)

    Hovstø, Asbjørn; Guan, Yajuan; Quintero, Juan Carlos Vasquez

    2018-01-01

    Lack of interoperability is considered as the most important barrier to achieve the global integration of Internet-of-Things (IoT) ecosystems across borders of different disciplines, vendors and standards. Indeed, the current IoT landscape consists of a large set of non-interoperable infrastructu...

  16. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    Science.gov (United States)

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  17. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    Science.gov (United States)

    2011-01-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference January 13, 2011. On December 21, 2010, the Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability Standards will be held on Monday...

  18. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    Science.gov (United States)

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  19. From Networks to Time Series

    Science.gov (United States)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

In this Letter, we propose a framework to transform a complex network into a time series. The transformation from complex networks to time series is realized by classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed into periodic time series, small-world networks into noisy periodic time series, and random networks into random time series. We also show that these relationships hold analytically, using circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
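The classical-MDS step named in this abstract can be sketched in pure Python for a tiny ring lattice: square the shortest-path distance matrix, double-center it, and read coordinates off the leading eigenpair. How the 1-D coordinate then becomes the "time series" is an assumption of this sketch, not the authors' exact construction.

```python
# Classical MDS on a 4-node ring lattice (pure-Python sketch).
# B = -1/2 * J * D^2 * J with J = I - (1/n) 1 1^T; coordinates come
# from the leading eigenpair of B (found here by power iteration).

def ring_distances(n):
    """Shortest-path distances on a ring lattice with n nodes."""
    return [[min(abs(i - j), n - abs(i - j)) for j in range(n)]
            for i in range(n)]

def double_center(D):
    """Gram matrix B used by classical MDS."""
    n = len(D)
    D2 = [[D[i][j] ** 2 for j in range(n)] for i in range(n)]
    row = [sum(r) / n for r in D2]          # row means (= column means)
    grand = sum(row) / n
    return [[-0.5 * (D2[i][j] - row[i] - row[j] + grand)
             for j in range(n)] for i in range(n)]

def leading_eigen(B, iters=200):
    """Power iteration for the leading eigenpair of a symmetric matrix."""
    n = len(B)
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = [sum(B[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(B[i][j] * v[j] for j in range(n))
              for i in range(n))
    return lam, v

B = double_center(ring_distances(4))
lam, v = leading_eigen(B)
coords = [lam ** 0.5 * x for x in v]   # 1-D MDS embedding of the ring
```

For a ring, B is circulant, so its eigenvalues follow from the discrete Fourier transform of its first row (here the leading eigenvalue is 2), which is exactly the circulant-matrix argument the abstract invokes for the periodicity result.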

  20. The Influence of Information Systems Interoperability on Economic Activity in Poland

    Directory of Open Access Journals (Sweden)

    Ganczar Małgorzata

    2017-12-01

Full Text Available In this text, I discuss the capabilities and challenges of information systems interoperability. The anticipated and expected result of interoperability is to improve the provision of public utility services to citizens and companies, by facilitating the provision of these services on the basis of a "single window" principle and by reducing the costs incurred by public administrations, companies, and citizens through more efficient provision of public utility services. In the article, the conceptual framework of interoperability is elaborated upon. Moreover, information systems and public registers for entrepreneurs in Poland serve as examples of whether interoperability may be applied and, if so, whether it fulfils its targets with respect to e-Government services for entrepreneurs.

  1. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    Science.gov (United States)

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  2. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    Science.gov (United States)

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians of problematic situations, or decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of developments, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is represented by a stable and small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable and domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes for modelling the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the manner of a virtual health record (VHR), over the EHR whose contents need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools. We also describe a case study where the tools and methodology have been employed in a CDSS to support
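A minimal sketch of the mapping idea, with invented names (this is not openEHR archetype syntax): a clinical concept is bound, per institution, to the EHR-specific path it must be read from, so the CDSS queries concepts rather than raw records:

```python
# Toy archetype-style mapping layer. The concept name, institutions, and field
# paths below are all hypothetical; a real system would express these bindings
# in an archetype/mapping language rather than a Python dict.
MAPPINGS = {
    "systolic_bp": {
        "hospital_a": ["vitals", "bp", "systolic"],
        "hospital_b": ["observations", "blood_pressure_sys"],
    },
}

def resolve(concept, institution, ehr_record):
    """Follow the institution-specific path to the value of a clinical concept."""
    value = ehr_record
    for key in MAPPINGS[concept][institution]:
        value = value[key]
    return value

record_a = {"vitals": {"bp": {"systolic": 120}}}
record_b = {"observations": {"blood_pressure_sys": 118}}
```

The CDSS code asks for `systolic_bp` and never sees the two EHRs' divergent structures; adding a third institution means adding a mapping, not changing the CDSS.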

  3. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    Science.gov (United States)

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  4. Inverse Operation of Four-dimensional Vector Matrix

    OpenAIRE

    H J Bao; A J Sang; H X Chen

    2011-01-01

    This is a new series of study to define and prove multidimensional vector matrix mathematics, which includes four-dimensional vector matrix determinant, four-dimensional vector matrix inverse and related properties. There are innovative concepts of multi-dimensional vector matrix mathematics created by authors with numerous applications in engineering, math, video conferencing, 3D TV, and other fields.

  5. Test Protocols for Advanced Inverter Interoperability Functions - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as
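As an illustration of what such a test protocol checks, consider volt-var control, one of the IEC TR 61850-90-7 grid support functions: measured reactive power is compared against the commanded piecewise-linear curve. This is a sketch with assumed setpoints, measurements, and tolerance, not the SNL procedure itself:

```python
import numpy as np

def expected_var(voltage_pu, curve):
    """Interpolate the commanded volt-var curve, given as (V_pu, Q_pu) setpoints."""
    v, q = zip(*curve)
    return np.interp(voltage_pu, v, q)

# Illustrative setpoints: inject vars below 0.98 pu, absorb above 1.02 pu.
curve = [(0.95, 0.44), (0.98, 0.0), (1.02, 0.0), (1.05, -0.44)]

# Hypothetical measurements from an inverter under test.
measured_v = np.array([0.96, 1.00, 1.04])
measured_q = np.array([0.30, 0.0, -0.29])

errors = np.abs(measured_q - expected_var(measured_v, curve))
compliant = bool(np.all(errors < 0.05))   # 5% tolerance band, assumed
```

A real protocol would also sweep voltage across the full curve, check response times, and verify behaviour at the curve's breakpoints.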

  6. Test Protocols for Advanced Inverter Interoperability Functions – Main Document

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already

  7. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interface is not machine-friendly, (3) they use a non-standard format for data input and output, (4) they do not exploit standards to define application interface and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates
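The document-style messaging that such a workflow relies on can be illustrated with a toy encoder/decoder. The element names below are invented for illustration and are not the actual HAPI message schema:

```python
import xml.etree.ElementTree as ET

def encode_request(spot_ids):
    """Encode a list of microarray spot IDs as a document-style XML message."""
    root = ET.Element("hapiRequest")          # hypothetical root element
    for sid in spot_ids:
        ET.SubElement(root, "spotId").text = sid
    return ET.tostring(root, encoding="unicode")

def decode_request(xml_text):
    """Recover the spot IDs from the XML message."""
    return [e.text for e in ET.fromstring(xml_text).findall("spotId")]

msg = encode_request(["AA001", "AA002"])
```

Unlike scraping an HTML page, the receiving service parses this message with an ordinary XML parser, which is what makes the workflow machine-processable end to end.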

  8. Matrix inequalities

    CERN Document Server

    Zhan, Xingzhi

    2002-01-01

    The main purpose of this monograph is to report on recent developments in the field of matrix inequalities, with emphasis on useful techniques and ingenious ideas. Among other results this book contains the affirmative solutions of eight conjectures. Many theorems unify or sharpen previous inequalities. The author's aim is to streamline the ideas in the literature. The book can be read by research workers, graduate students and advanced undergraduates.

  9. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous independent cloud platforms. This paper first analyzes several trust models used in large and distributed environments and then introduces a novel cloud trust model to solve security issues in a cross-cloud environment, in which cloud customers can choose different providers' services and resources in heterogeneous domains can cooperate. The model is domain-based: it divides one cloud provider's resource nodes into the same domain and sets a trust agent. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for each. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in a cross-cloud environment.

  10. Focus for 3D city models should be on interoperability

    DEFF Research Database (Denmark)

    Bodum, Lars; Kjems, Erik; Jaegly, Marie Michele Helena

    2006-01-01

    that would make it useful for other purposes than visualisation. Time has come to try to change this trend and to convince the municipalities that interoperability and semantics are important issues for the future. It is important for them to see that 3D modelling, mapping and geographic information...... developments in Geographical Exploration Systems. Centralized and proprietary Geographical Exploration Systems only give us their own perspective on the world. On the contrary, GRIFINOR is decentralized and available for everyone to use, empowering people to promote their own world vision....... are subjects on the same agenda towards an integrated solution for an object-oriented mapping of multidimensional geographic objects in the urban environment. Many relevant subjects could be discussed regarding these matters, but in this paper we will narrow the discussion down to the ideas behind...

  11. Internet of Things Heterogeneous Interoperable Network Architecture Design

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art shows that there is no mature Internet of Things architecture available. The thesis contributes an abstract generic IoT system reference architecture development with specifications. Novelties of the thesis are the proposed solutions and implementations....... It is proved that reduction of data at a source will result in huge vertical scalability and, indirectly, horizontal scalability as well. The second non-functional feature contributes a heterogeneous interoperable network architecture for constrained Things. To eliminate the increasing number of gateways, a Wi-Fi access point...... with Bluetooth and Zigbee (the new access point is called BZ-Fi) is proposed. Co-existence of Wi-Fi, Bluetooth, and Zigbee network technologies results in interference. To reduce the interference, orthogonal frequency division multiplexing (OFDM) is proposed to be implemented in Bluetooth and Zigbee. The proposed...

  12. Web services for distributed and interoperable hydro-information systems

    Science.gov (United States)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on the output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.

  13. Connectivity, interoperability and manageability challenges in internet of things

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is about interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities for enhancing the lives of people through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability, and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper concludes that the IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  14. Operational Plan Ontology Model for Interconnection and Interoperability

    Science.gov (United States)

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    Aiming at the assistant decision-making system's bottleneck in processing operational plan data and information, this paper starts from an analysis of the problems of traditional expression and the technical advantages of ontology, then defines the elements of the operational plan ontology model and determines the basis of its construction. It then builds a semi-knowledge-level operational plan ontology model. Finally, it probes into operational plan expression based on the operational plan ontology model and the usage of the application software. Thus, this paper has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  15. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    Science.gov (United States)

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper comprise the characterization and examination of the potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. The paper represents an in-depth analysis regarding the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research and charting of existing experience in the field, and sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper will try to answer the following complementary inquiries: (1) scrutiny of the potential approaches that could alleviate the pertinent interoperability issues in the Slovenian eHealth context; (2) analysis of the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions; (3) identification and charting of the main success factors in the interoperability field that critically influence development and implementation of eHealth projects in an efficient manner. Provided insights and identified success factors could serve as a constituent of the strategic starting points for continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system, especially in the countries

  16. Middleware Interoperability for Robotics: A ROS-YARP Framework

    Directory of Open Access Journals (Sweden)

    Plinio Moreno

    2016-10-01

    Full Text Available Middlewares are fundamental tools for progress in research and applications in robotics. They enable the integration of multiple heterogeneous sensing and actuation devices, as well as providing general-purpose modules for key robotics functions (kinematics, navigation, planning). However, no existing middleware yet provides a complete set of functionalities for all robotics applications, and many robots may need to rely on more than one framework. This paper focuses on the interoperability between two of the most prevalent middlewares in robotics: YARP and ROS. Interoperability between middlewares should ideally allow users to execute existing software without the necessity of: (i) changing the existing code, and (ii) writing hand-coded "bridges" for each use-case. We propose a framework enabling the communication between existing YARP modules and ROS nodes for robotics applications in an automated way. Our approach generates the "bridging gap" code from a configuration file, connecting YARP ports and ROS topics through code-generated YARP Bottles. The configuration file must describe: (i) the sender entities, (ii) the way to group and convert the information read from the sender, (iii) the structure of the output message and (iv) the receiving entity. Our choice of many inputs to one output matches the most common use-case in robotics applications, where examples include filtering, decision making and visualization. We support YARP/ROS and ROS/YARP sender/receiver configurations, which are demonstrated in a humanoid-on-wheels robot that uses YARP for upper body motor control and visual perception, and ROS for mobile base control and navigation algorithms.
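The many-inputs-to-one-output pattern can be caricatured without either middleware installed. Plain callables stand in for ROS topics and YARP ports, and the "configuration" is reduced to the arguments of a factory function; all names below are illustrative, not ROS or YARP APIs:

```python
def make_bridge(senders, convert, receiver):
    """Read every sender, merge the readings, deliver one message to the receiver.

    senders:  mapping of name -> zero-argument read function (stand-in for a topic/port)
    convert:  merges the dict of readings into one outgoing message
    receiver: consumes the merged message (stand-in for the receiving entity)
    """
    def step():
        readings = {name: read() for name, read in senders.items()}
        receiver(convert(readings))
    return step

# Hypothetical example: fuse two "sensors" into one message for a "visualizer".
out = []
bridge = make_bridge(
    senders={"camera": lambda: 1.0, "lidar": lambda: 2.0},
    convert=lambda r: {"fused": r["camera"] + r["lidar"]},
    receiver=out.append,
)
bridge()
# out[0] == {"fused": 3.0}
```

The paper's framework generates the equivalent of `make_bridge`'s wiring from a configuration file, with YARP Bottles as the message container instead of a dict.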

  17. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were requested to evaluate the overall quality in use framework, 70% would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  18. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Science.gov (United States)

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  19. Infinite series

    CERN Document Server

    Hirschman, Isidore Isaac

    2014-01-01

    This text for advanced undergraduate and graduate students presents a rigorous approach that also emphasizes applications. Encompassing more than the usual amount of material on the problems of computation with series, the treatment offers many applications, including those related to the theory of special functions. Numerous problems appear throughout the book.The first chapter introduces the elementary theory of infinite series, followed by a relatively complete exposition of the basic properties of Taylor series and Fourier series. Additional subjects include series of functions and the app

  20. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    Science.gov (United States)

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
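The adjusted odds ratios (AORs) reported above come from exponentiating logistic-regression coefficients. A self-contained sketch on synthetic data (the binary predictor and its true odds ratio of 3.5 are invented for illustration, loosely echoing the "leadership support" finding) shows the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Hypothetical binary predictor (1 = has leadership support), true odds ratio 3.5.
x = rng.integers(0, 2, n)
logit = -1.0 + np.log(3.5) * x
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # binary outcome: "interoperable"

# Fit logistic regression by Newton-Raphson on the log-likelihood.
X = np.column_stack([np.ones(n), x])           # intercept + predictor
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))           # fitted probabilities
    grad = X.T @ (y - mu)                      # score vector
    H = (X * (mu * (1 - mu))[:, None]).T @ X   # observed information
    beta += np.linalg.solve(H, grad)

odds_ratio = np.exp(beta[1])                   # exp(coefficient) is the odds ratio
```

With several covariates in `X`, `np.exp(beta[k])` is the *adjusted* odds ratio for predictor `k`, holding the others fixed; the survey analysis reports exactly such quantities (e.g. AOR = 3.54 for leadership support).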

  1. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2009-05-01

    Full Text Available Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; thus they cannot coexist in isolation but must share their data so as to increase their productivity. Such systems' interoperability is normally accomplished through mark-up standards, query languages and web services. The literature contains work related to software system interoperability; however, it presents some difficulties, such as the need for using the same platforms and different programming languages, the use of read-only languages and the deficiencies in the formalism used for achieving it. This paper presents a critical review of the advances made regarding heterogeneous software systems' interoperability.

  2. Matrix analysis

    CERN Document Server

    Bhatia, Rajendra

    1997-01-01

    A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...

  3. Advances in HTR fuel matrix technology

    International Nuclear Information System (INIS)

    Voice, E.H.; Sturge, D.W.

    1974-02-01

    Progress in the materials and technology of matrix consolidation in recent years is summarised, noting especially the development of an improved resin and the introduction of a new graphite powder. An earlier irradiation programme, the Matrix Test Series, is recalled and the fabrication of the most recent experiment, the directly-cooled homogeneous Met. VI, is described. (author)

  4. An application of ETICS Co-Scheduling Mechanism to Interoperability and Compliance Validation of Grid Services

    CERN Document Server

    Ronchieri, Elisabetta; Diez-andino Sancho, Guillermo; DI Meglio, Alberto; Marzolla, Moreno

    2008-01-01

    Grid software projects require infrastructures in order to evaluate interoperability with other projects and compliance with predefined standards. Interoperability and compliance are quality attributes that are expected from all distributed projects. ETICS is designed to automate the investigation of this kind of problems. It integrates well-established procedures, tools and resources in a coherent framework and adapts them to the special needs of these projects. Interoperability and compliance to standards are important quality attributes of software developed for Grid environments where many different parts of an interconnected system have to interact. Compliance to standards is one of the major factors in making sure that interoperating parts of a distributed system can actually interconnect and exchange information. Taking the case of the Grid environment (Foster and Kesselman, 2003), most of the projects that are developing software have not reached the maturity level of other communities yet and have di...

  5. Rich services in interoperable Learning Designs: can the circle be squared?

    OpenAIRE

    Griffiths, David

    2009-01-01

    Griffiths, D. (2009). Rich services in interoperable Learning Designs: Can the circle be squared?. Presented at Opening Up Learning Design, European LAMS and Learning Design Conference 2009. July, 6-9, 2009, Milton Keynes, United Kingdom.

  6. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    National Research Council Canada - National Science Library

    Barlos, Fotis; Hunter, Dan; Krikeles, Basil; McDonough, James

    2007-01-01

    .... Semantic Interoperability (SI) encompasses a broad range of technologies such as data mediation and schema matching, ontology alignment, and context representation that attempt to enable systems to understand each other's semantics...

  7. Interoperability and future internet for next generation enterprises - editorial and state of the art

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Johnson, Pontus; Doumeingts, Guy

    2013-01-01

    Today’s global markets drive enterprises towards closer collaboration with customers, suppliers and partners. Interoperability problems constitute fundamental barriers to such collaboration. A characteristic of modern economic life is the requirement on continuous and rapid change and innovation.

  8. Public Key Infrastructure (PKI) Interoperability: A Security Services Approach to Support Transfer of Trust

    National Research Council Canada - National Science Library

    Hansen, Anthony

    1999-01-01

    .... This thesis defines interoperability as the capacity to support trust through retention of security services across PKI domains at a defined level of assurance and examines the elements of PKI...

  9. Analysis of Jordan's Proposed Emergency Communication Interoperability Plan (JECIP) for Disaster Response

    National Research Council Canada - National Science Library

    Alzaghal, Mohamad H

    2008-01-01

    ... country. It is essential to build a robust and interoperable Information and Communication Technology (ICT) infrastructure before the disaster, which will facilitate patching, restoring, and reconstructing it when and after a disaster hits...

  10. Interoperability requirements for a South African joint command and control test facility

    CSIR Research Space (South Africa)

    Le Roux, WH

    2008-06-01

    Full Text Available approach is followed to provide all the necessary services, mechanisms and functionalities. Since simulations and simulators form part of such a facility, interoperability standards are very important, as well as the underlying data model. The high...

  11. 78 FR 50075 - Statewide Communication Interoperability Plan Template and Annual Progress Report

    Science.gov (United States)

    2013-08-16

    ... Collection Request should be forwarded to DHS/NPPD/CS&C/OEC, 245 Murray Lane SW., Mail Stop 0640, Arlington... will assist states in their strategic planning for interoperable and emergency communications while...

  12. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    National Research Council Canada - National Science Library

    Tran, Tam

    2000-01-01

    There is a need for Commercial-off-the-shelf (COTS), Government-off-the-shelf (GOTS) and legacy components to interoperate in a secure distributed computing environment in order to facilitate the development of evolving applications...

  13. Matrix pentagons

    Science.gov (United States)

    Belitsky, A. V.

    2017-10-01

    The Operator Product Expansion for null polygonal Wilson loop in planar maximally supersymmetric Yang-Mills theory runs systematically in terms of multi-particle pentagon transitions which encode the physics of excitations propagating on the color flux tube ending on the sides of the four-dimensional contour. Their dynamics was unraveled in the past several years and culminated in a complete description of pentagons as an exact function of the 't Hooft coupling. In this paper we provide a solution for the last building block in this program, the SU(4) matrix structure arising from internal symmetry indices of scalars and fermions. This is achieved by a recursive solution of the Mirror and Watson equations obeyed by the so-called singlet pentagons and fixing the form of the twisted component in their tensor decomposition. The non-singlet, or charged, pentagons are deduced from these by a limiting procedure.

  14. Matrix pentagons

    Directory of Open Access Journals (Sweden)

    A.V. Belitsky

    2017-10-01

    Full Text Available The Operator Product Expansion for null polygonal Wilson loop in planar maximally supersymmetric Yang–Mills theory runs systematically in terms of multi-particle pentagon transitions which encode the physics of excitations propagating on the color flux tube ending on the sides of the four-dimensional contour. Their dynamics was unraveled in the past several years and culminated in a complete description of pentagons as an exact function of the 't Hooft coupling. In this paper we provide a solution for the last building block in this program, the SU(4) matrix structure arising from internal symmetry indices of scalars and fermions. This is achieved by a recursive solution of the Mirror and Watson equations obeyed by the so-called singlet pentagons and fixing the form of the twisted component in their tensor decomposition. The non-singlet, or charged, pentagons are deduced from these by a limiting procedure.

  15. Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)

    Science.gov (United States)

    Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.

    2013-12-01

    Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe is priced at around 15 € on the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138-1128 billion €, or 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the above- and belowground compartments of tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by Ca' Foscari University of Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot, Phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a Habitat Suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the expected average coverage). The presentation discusses this framework, explaining how the input data are discovered, accessed and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize results.

  16. The National Flood Interoperability Experiment: Bridging Research and Operations

    Science.gov (United States)

    Salas, F. R.

    2015-12-01

    The National Weather Service's new National Water Center, located on the University of Alabama campus in Tuscaloosa, will become the nation's hub for comprehensive water resources forecasting. In conjunction with its federal partners (the US Geological Survey, the Army Corps of Engineers, and the Federal Emergency Management Agency), the National Weather Service will operationally support both short-term flood prediction and long-term seasonal forecasting of water resource conditions. By summer 2016, the National Water Center will begin evaluating four streamflow data products at the scale of the NHDPlus river reaches (approximately 2.67 million). In preparation for the release of these products, from September 2014 to August 2015, the National Weather Service partnered with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. to support the National Flood Interoperability Experiment, which included a seven-week in-residence Summer Institute in Tuscaloosa for university students interested in learning about operational hydrology and flood forecasting. As part of the experiment, 15-hour forecasts from the operational High Resolution Rapid Refresh atmospheric model were used to drive a three-kilometer Noah-MP land surface model loosely coupled to a RAPID river routing model operating on the NHDPlus dataset. This workflow was run every three hours during the Summer Institute, and the results were made available to participants pursuing a range of research topics focused on flood forecasting (e.g., reservoir operations, ensemble forecasting, probabilistic flood inundation mapping, rainfall product evaluation). Although the National Flood Interoperability Experiment was finite in length, it provided a platform through which the academic community could engage federal agencies and vice versa to narrow the gap between research and operations and demonstrate how state-of-the-art research infrastructure, models, services, datasets etc. could be utilized

  17. Author identities an interoperability problem solved by a collaborative solution

    Science.gov (United States)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is crowded, and the right choice is becoming more and more complicated. Even though more than 15 different systems are available, some are still under development and proposed to launch by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond a single research institute, at the scale of a scientific site including a university with a student education program, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the quite high frequency of changes in position during a scientist's life. The required system needed to already contain preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author ID marketplace revealed a high risk of additional workload for the researchers themselves or the administration, because individuals need to register an ID for themselves, or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they already have high-quality catalogs of person identities available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to reach, at the first matching, a success rate of 60% of all scientists. An additional advantage is that librarians can finalize the identity system in a kind of background process.
The Kiel Data Management Infrastructure initiated a web service
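The catalog-matching idea in this record can be sketched as a toy name-normalization lookup. The authority records, identifier values, and match logic below are hypothetical and far simpler than real VIAF matching:

```python
# Toy sketch: match incoming author name variants against a small "authority
# file" via a normalized key. Identifiers like "viaf:0001" are invented here.

def norm(name):
    # "Fleischer, D." and "D. Fleischer" normalize to the same key
    parts = name.replace(",", " ").replace(".", " ").split()
    return " ".join(sorted(p.lower() for p in parts))

# Hypothetical authority records keyed by normalized name
authority = {
    norm("Fleischer, D."): "viaf:0001",
    norm("Czerniak, A."): "viaf:0002",
}

def match(name):
    return authority.get(norm(name))  # None when no catalog entry exists

print(match("D. Fleischer"))
print(match("Unknown, X."))
```

Real matching must also handle homonyms and affiliation evidence, which is why the record reports only a 60% first-pass success rate.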

  18. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    OpenAIRE

    González, Laura; Echevarría, Andrés; Morales, Dahiana; Ruggia, Raúl

    2016-01-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have ...

  19. Collaborative ocean resource interoperability - multi-use of ocean data on the semantic web

    OpenAIRE

    Tao, Feng; Campbell, Jon; Pagnani, Maureen; Griffiths, Gwyn

    2009-01-01

    Earth Observations (EO) collect various characteristics of the objective environment using sensors which often have different measuring, spatial and temporal coverage. Making individual observational data interoperable becomes equally important when viewed in the context of its expensive and time-consuming EO operations. Interoperability will improve reusability of existing observations in both the broader context, and with other observations. As a demonstration of the potential offered by se...

  20. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    OpenAIRE

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured on all levels of the interoperability pyramid. This includes common agreements about the visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the included business processes and rules, the types and roles of the documents, a commonly understandable vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  1. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    Science.gov (United States)

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as well as those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzíbar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  2. Reflections on the role of open source in health information system interoperability.

    Science.gov (United States)

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  3. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    Science.gov (United States)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account the different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then, the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (presumably, a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption of the awareness of co-existence of two interoperating systems. Thus, it establishes links between the research on interoperability of systems and on intelligent software agents, as one of the systems' digital identities.

  4. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Science.gov (United States)

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, in this paper a unified methodology and the supporting framework are introduced which bring together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources enabling semantic links with other CDEs, terminology systems and implementation-dependent content models, hence facilitating semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
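As a rough illustration of the LOD principle this record describes, the sketch below renders one Common Data Element as a list of triples with a dereferenceable URI and a cross-registry link. All URIs and property names are invented for illustration; they are not the SALUS or ISO/IEC 11179 vocabulary:

```python
# Minimal sketch: an ISO/IEC 11179-style Common Data Element (CDE) expressed
# as Linked Data triples. URIs and predicates below are hypothetical.

def cde_triples(cde_uri, label, datatype, exact_matches):
    """Return (subject, predicate, object) triples for one CDE."""
    triples = [
        (cde_uri, "rdf:type", "mdr:DataElement"),
        (cde_uri, "rdfs:label", label),
        (cde_uri, "mdr:datatype", datatype),
    ]
    # skos:exactMatch links interconnect equivalent CDEs across registries,
    # which is what enables federated semantic queries.
    for other in exact_matches:
        triples.append((cde_uri, "skos:exactMatch", other))
    return triples

triples = cde_triples(
    "http://example.org/mdr/cde/systolic-bp",
    "Systolic blood pressure",
    "xsd:integer",
    ["http://example.org/other-registry/cde/12345"],
)
for s, p, o in triples:
    print(s, p, o)
```

Because every CDE is itself a resource, a registry can link terminology bindings and content models the same way, simply by adding more triples.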

  5. An approach to define semantics for BPM systems interoperability

    Science.gov (United States)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them for enriching the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  6. Adaptation of interoperability standards for cross domain usage

    Science.gov (United States)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economical, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to retrieve relevant, readable information applicable to a specific problem out of a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions like specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is making civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapting standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  7. An open, interoperable, and scalable prehospital information technology network architecture.

    Science.gov (United States)

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  8. Health level seven interoperability strategy: big data, incrementally structured.

    Science.gov (United States)

    Dolin, R H; Rogers, B; Jaffe, C

    2015-01-01

    Describe how the HL7 Clinical Document Architecture (CDA), a foundational standard in US Meaningful Use, contributes to a "big data, incrementally structured" interoperability strategy, whereby data structured incrementally gets large amounts of data flowing faster. We present cases showing how this approach is leveraged for big data analysis. To support the assertion that semi-structured narrative in CDA format can be a useful adjunct in an overall big data analytic approach, we present two case studies. The first assesses an organization's ability to generate clinical quality reports using coded data alone vs. coded data supplemented by CDA narrative. The second leverages CDA to construct a network model for referral management, from which additional observations can be gleaned. The first case shows that coded data supplemented by CDA narrative resulted in significant variances in calculated performance scores. In the second case, we found that the constructed network model enables the identification of differences in patient characteristics among different referral work flows. The CDA approach goes after data indirectly, by focusing first on the flow of narrative, which is then incrementally structured. A quantitative assessment of whether this approach will lead to a greater flow of data and ultimately a greater flow of structured data vs. other approaches is planned as a future exercise. Along with growing adoption of CDA, we are now seeing the big data community explore the standard, particularly given its potential to supply analytic engines with volumes of data previously not possible.
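The "incrementally structured" idea can be illustrated with a simplified parse: a CDA-style section always carries narrative text, while coded entries are picked up whenever they happen to be present. The XML fragment below is a hypothetical sketch, not a schema-valid HL7 CDA document:

```python
# Sketch of incremental structuring: start analysis from narrative, and use
# coded entries as they appear. The fragment is hypothetical, not valid CDA.
import xml.etree.ElementTree as ET

doc = """
<section>
  <text>Patient reports chest pain; ECG shows sinus rhythm.</text>
  <entry><code code="29857009" codeSystem="SNOMED-CT"/></entry>
</section>
"""

root = ET.fromstring(doc)
narrative = root.findtext("text")                    # always present
codes = [e.get("code") for e in root.iter("code")]   # present only when structured

print(narrative)
print(codes)
```

An analytic pipeline built this way degrades gracefully: documents with no coded entries still contribute narrative, and each added code enriches the structured layer.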

  9. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Directory of Open Access Journals (Sweden)

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.

  10. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Science.gov (United States)

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
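The HRV indices mentioned in these two records can be illustrated with two standard time-domain measures computed from RR intervals. This is a minimal sketch of the textbook formulas (SDNN and RMSSD), not the system's actual processing pipeline, and the RR series is invented:

```python
# Minimal sketch of two standard time-domain HRV indices from RR intervals (ms):
# SDNN  = standard deviation of the intervals
# RMSSD = root mean square of successive differences
import math

def sdnn(rr):
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 800, 790, 805, 820, 815]  # illustrative RR series in ms
print(round(sdnn(rr), 2), round(rmssd(rr), 2))
```

Clinical thresholds, artifact rejection, and the HRT computation are deliberately outside this sketch.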

  11. Interoperability In The New Planetary Science Archive (PSA)

    Science.gov (United States)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly being used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example by the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris as well as other international institutions. The protocols and standards currently being supported by the new Planetary Science Archive at this time are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA consists of a Geoserver (an open-source map server), the goal of which is to support use cases such as the distribution of search results, sharing and processing data through a OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats like Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to be able to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
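A request against a GeoServer WFS endpoint of the kind described above can be sketched as follows. The endpoint and layer name are hypothetical; only the standard WFS 2.0 GetFeature parameters, including `outputFormat` for selecting KML, GML, JSON, CSV, etc., are assumed:

```python
# Sketch: building an OGC WFS 2.0 GetFeature request URL, with the output
# format chosen via the standard `outputFormat` parameter. The endpoint and
# layer below are hypothetical, not the real PSA service.
from urllib.parse import urlencode

def wfs_getfeature_url(base, layer, output_format, max_features=10):
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": layer,
        "outputFormat": output_format,
        "count": str(max_features),
    }
    return base + "?" + urlencode(params)

url = wfs_getfeature_url(
    "https://example.org/geoserver/wfs",  # hypothetical endpoint
    "psa:footprints",                     # hypothetical layer
    "application/json",                   # GeoJSON; KML/GML/CSV also typical
)
print(url)
```

Offering the same feature data under several `outputFormat` values is what lets third-party tools such as Google Mars or NASA World Wind consume archive search results directly.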

  12. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
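    The multi-level governance described above can be pictured as attribute rules defined at the common level and specialized at the discipline and project levels. The sketch below is an invented simplification of that idea; the attribute names and rules are illustrative, not from the actual ontology.

```python
# Invented example of multi-level governance: more specific levels (discipline,
# project) refine or extend rules set at the common level.
common     = {"identifier": {"required": True}, "title": {"max_len": 255}}
discipline = {"title": {"max_len": 120}}          # tightens the common rule
project    = {"target_name": {"required": True}}  # adds a project attribute

def effective_rules(*levels):
    """Merge governance levels; later (more specific) levels win per attribute."""
    merged = {}
    for level in levels:
        for attr, rule in level.items():
            merged.setdefault(attr, {}).update(rule)
    return merged

rules = effective_rules(common, discipline, project)
print(rules["title"])  # the discipline-level limit overrides the common one
```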

  13. Digital Motion Imagery, Interoperability Challenges for Space Operations

    Science.gov (United States)

    Grubbs, Rodney

    2012-01-01

    With advances in available bandwidth from spacecraft and between terrestrial control centers, digital motion imagery and video are becoming more practical as data gathering tools for science and engineering, as well as for sharing missions with the public. The digital motion imagery and video industry has done a good job of creating standards for compression, distribution, and physical interfaces. Compressed data streams can easily be transmitted or distributed over radio frequency, Internet Protocol, and other data networks. All of these standards, however, can make sharing video between spacecraft and terrestrial control centers a frustrating and complicated task when different standards and protocols are used by different agencies. This paper will explore the challenges presented by the abundance of motion imagery and video standards, interfaces and protocols, with suggestions for common formats that could simplify interoperability between spacecraft and ground support systems. Real-world examples from the International Space Station will be examined. The paper will also discuss recent trends in the development of new video compression algorithms, as well as the likely expanded use of Delay (or Disruption) Tolerant Networking nodes.

  14. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  15. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
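    A small taste of what CF compliance checking involves: verifying that each variable carries the attributes the convention expects. The attribute names below (`units`, `standard_name`) are genuine CF attribute names, but the required set and the sample variable are chosen for illustration.

```python
# Illustrative check of a few Climate and Forecast (CF) convention attributes
# on a variable's attribute dictionary. The required set is a simplification.
REQUIRED = {"units", "standard_name"}

def missing_cf_attributes(var_attrs):
    """Return the sorted list of required CF attributes a variable lacks."""
    return sorted(REQUIRED - set(var_attrs))

temp = {"units": "degree_Celsius", "long_name": "sea water temperature"}
print(missing_cf_attributes(temp))  # ['standard_name']
```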

  16. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  17. Standardized headings as a foundation for semantic interoperability in EHR

    Directory of Open Access Journals (Sweden)

    Halilovic Amra

    2016-01-01

    Full Text Available The new Swedish Patient Act, which allows patients to choose health care in county councils other than their own, creates the need to share health-related information contained in electronic health records (EHRs) across county councils. This demands interoperability in terms of structured and standardized data. Headings in EHRs can also be part of structured and standardized data. The aim was to study to what extent terminology is shared and standardized across county councils in Sweden. Headings from three county councils were analyzed to see to what extent they were shared and to what extent they corresponded to concepts in SNOMED CT and the National Board of Health and Welfare's term dictionary (NBHW's TD). In total, 41% of the headings were shared across two or three county councils. A third of the shared headings corresponded to concepts in SNOMED CT, and an eighth of the shared headings corresponded to concepts in NBHW's TD. The results showed that the extent of shared and standardized terminology, in terms of headings, across the three studied county councils was negligible.
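    The overlap analysis behind figures like "41% of headings were shared" reduces to set intersection across councils. The heading lists below are invented stand-ins for the study's data.

```python
from collections import Counter

# Invented heading lists for three councils, to illustrate the overlap metric.
councils = {
    "A": {"diagnosis", "medication", "allergy", "social background"},
    "B": {"diagnosis", "medication", "care plan"},
    "C": {"diagnosis", "nutrition"},
}

# Count how many councils use each distinct heading.
counts = Counter(h for headings in councils.values() for h in headings)
shared = {h for h, n in counts.items() if n >= 2}

# Fraction of distinct headings shared by at least two councils.
print(len(shared) / len(counts))
```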

  18. Making Interoperability Easier with the NASA Metadata Management Tool

    Science.gov (United States)

    Shum, D.; Reese, M.; Pilone, D.; Mitchell, A. E.

    2016-12-01

    ISO 19115 has enabled interoperability amongst tools, yet many users find it hard to build ISO metadata for their collections because the standard can be large and overly flexible for their needs. The Metadata Management Tool (MMT), part of NASA's Earth Observing System Data and Information System (EOSDIS), offers users a modern, easy-to-use, browser-based tool to develop ISO-compliant metadata. Through a simplified UI experience, metadata curators can create and edit collections without any understanding of the complex ISO 19115 format, while still generating compliant metadata. The MMT is also able to assess the completeness of collection-level metadata by evaluating it against a variety of metadata standards, and provides users with clear guidance on how to change their metadata to improve its quality and compliance. It is based on NASA's Unified Metadata Model for Collections (UMM-C), a simpler metadata model that can be cleanly mapped to ISO 19115. This allows metadata authors and curators to meet ISO compliance requirements faster and more accurately. The MMT and UMM-C have been developed in an agile fashion, with recurring end-user tests and reviews to continually refine the tool, the model and the ISO mappings. This process allows for continual improvement and evolution to meet the community's needs.
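    A completeness assessment of the kind the MMT performs can be sketched as scoring a record against a list of expected fields. The field names below resemble UMM-C element names but are used here as a hypothetical, simplified subset, not the actual schema or the MMT's real scoring rules.

```python
# Hypothetical, simplified field list inspired by UMM-C; not the real schema.
EXPECTED = ["ShortName", "Version", "Abstract", "DataCenters", "TemporalExtents"]

def completeness(record):
    """Return (fraction of expected fields filled, list of missing fields)."""
    filled = [f for f in EXPECTED if record.get(f)]
    return len(filled) / len(EXPECTED), [f for f in EXPECTED if f not in filled]

score, gaps = completeness(
    {"ShortName": "EXAMPLE01", "Version": "1", "Abstract": "A sample record."}
)
print(score, gaps)  # 0.6 ['DataCenters', 'TemporalExtents']
```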

  19. Providing trust and interoperability to federate distributed biobanks.

    Science.gov (United States)

    Lablans, Martin; Bartholomäus, Sebastian; Uckert, Frank

    2011-01-01

    Biomedical research requires large numbers of well-annotated, quality-assessed samples which often cannot be provided by a single biobank. Connecting biobanks, researchers and service providers raises numerous challenges, including trust among partners and towards the infrastructure, as well as interoperability problems. We therefore develop a holistic, open-source and easy-to-use IT infrastructure. Our federated approach allows partners to reflect their organizational structures and protect their data sovereignty. The search service and the contact arrangement processes increase data sovereignty without stigmatizing partners who decline a specific cooperation. The infrastructure supports daily processes with an integrated basic sample manager and user-definable electronic case report forms. Interfaces to existing IT systems avoid re-entering data. Moreover, resource virtualization is supported to make underutilized resources of some partners accessible to those with insufficient equipment, for mutual benefit. The functionality of the resulting infrastructure is outlined in a use case demonstrating collaboration within a translational research network. Compared to other existing or upcoming infrastructures, our approach ultimately has the same goals, but relies on gentle incentives rather than top-down imposed progress.

  20. Measuring interoperable EHR adoption and maturity: a Canadian example.

    Science.gov (United States)

    Gheorghiu, Bobby; Hagens, Simon

    2016-01-25

    An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum.  Each Canadian province and territory has implemented such a system and for all, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals-approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users-indicated that they electronically access data, such as those found in provincial/territorial lab or drug information systems, in 2015.  Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, how they are used, where and when.  Stakeholders such as government, program leads, and health system administrators must critically assess progress and achievement of benefits, to inform future strategic and operational decisions.

  1. MPEG-4 IPMP Extension for Interoperable Protection of Multimedia Content

    Directory of Open Access Journals (Sweden)

    Zeng Wenjun

    2004-01-01

    Full Text Available To ensure secure content delivery, the Motion Picture Experts Group (MPEG) has dedicated significant effort to digital rights management (DRM) issues. MPEG is now moving from defining only hooks to proprietary systems (e.g., in MPEG-2 and MPEG-4 Version 1) to specifying a more encompassing standard in intellectual property management and protection (IPMP). MPEG feels that this is necessary in order to achieve MPEG's most important goal: interoperability. The design of the IPMP Extension framework also considers the complexity of the MPEG-4 standard and the diversity of its applications. This architecture leaves the details of the design of IPMP tools in the hands of application developers, while ensuring maximum flexibility and security. This paper first briefly describes the background of the development of the MPEG-4 IPMP Extension. It then presents an overview of the MPEG-4 IPMP Extension, including its architecture, the flexible protection signaling, and the secure messaging framework for communication between the terminal and the tools. Two sample usage scenarios are also provided to illustrate how an MPEG-4 IPMP Extension compliant system works.

  2. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garica, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to the functionality that a single vendor provides and must often pay high prices to obtain it, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  3. A Proposed Information Architecture for Telehealth System Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
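    The black-box, plug-and-play idea in these two records can be sketched as components that implement one standardized interface and snap together into a pipeline. The interface, component names, and message format below are invented for illustration, not the architecture's actual specification.

```python
from abc import ABC, abstractmethod

# Invented standardized interface: any conforming component can be plugged in.
class TelemedicineComponent(ABC):
    @abstractmethod
    def handle(self, message: dict) -> dict: ...

class VitalsAcquisition(TelemedicineComponent):
    def handle(self, message):
        # Pretend to read a medical peripheral and attach vitals data.
        return {**message, "vitals": {"hr_bpm": 72}}

class RecordStore(TelemedicineComponent):
    def handle(self, message):
        # Pretend to persist the message to a patient record system.
        return {**message, "stored": True}

def pipeline(components, message):
    """Components plug together in a lego-like fashion; each is a black box."""
    for component in components:
        message = component.handle(message)
    return message

result = pipeline([VitalsAcquisition(), RecordStore()], {"patient_id": "p1"})
print(result)
```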

  4. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals vary broadly, there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach which is able to generate delineations of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy above 70% was achieved. However, it has to be highlighted that the delineation of the ground-truth data itself carries a high degree of uncertainty, which is discussed in this study.
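    One common form of kernel-based spatial reclassification is a majority filter: each cell of a class raster takes the most frequent class in its neighbourhood, smoothing isolated pixels before delineation. The sketch below shows that operation on a toy 3x3 grid; the class labels and window size are invented and this is not the study's actual algorithm.

```python
from collections import Counter

def majority_filter(grid):
    """Replace each cell by the majority class in its 3x3 neighbourhood."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            neighbourhood = [grid[i][j]
                             for i in range(max(r - 1, 0), min(r + 2, rows))
                             for j in range(max(c - 1, 0), min(c + 2, cols))]
            out[r][c] = Counter(neighbourhood).most_common(1)[0][0]
    return out

# Toy class raster: the isolated "grass" cell in the centre gets reclassified.
grid = [["heath", "heath", "grass"],
        ["heath", "grass", "grass"],
        ["heath", "heath", "grass"]]
print(majority_filter(grid)[1][1])  # heath
```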

  5. Technical Interoperability for Machine Connectivity on the Shop Floor

    Directory of Open Access Journals (Sweden)

    Magnus Åkerman

    2018-06-01

    Full Text Available This paper presents a generic technical solution that can increase Industry 4.0 maturity by collecting data from sensors and control systems on the shop floor. Within the research project "5G-Enabled Manufacturing", an LTE (Long-Term Evolution) network with 5G technologies was deployed on the shop floor to enable fast and scalable connectivity. This network was used to connect a grinding machine to a remote private cloud where data was stored and streamed to a data analytics center. This enabled visibility and transparency of the production data, which is the basis for Industry 4.0 and smart manufacturing. The solution is described with a focus on high-level communication technologies above wireless communication standards. These technologies are discussed regarding technical interoperability, focusing on the system layout, communication standards, and open systems. From the discussion, it can be derived that generic solutions such as this are possible, but manufacturing end-users must expand and further internalize knowledge of future information and communication technologies to reduce their dependency on equipment and technology providers.

  6. Chart Series

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) offers several different Chart Series with data on beneficiary health status, spending, operations, and quality...

  7. Eight reasons payer interoperability and data sharing are essential in ACOs. Interoperability standards could be a prerequisite to measuring care.

    Science.gov (United States)

    Mookencherry, Shefali

    2012-01-01

    It makes strategic and business sense for payers and providers to collaborate on how to take substantial cost out of the healthcare delivery system. Acting independently, neither medical groups, hospitals nor health plans have the optimal mix of resources and incentives to significantly reduce costs. Payers have core assets such as marketing, claims data, claims processing, reimbursement systems and capital. It would be cost prohibitive for all but the largest providers to develop these capabilities in order to compete directly with insurers. Likewise, medical groups and hospitals are positioned to foster financial interdependence among providers and coordinate the continuum of patient illnesses and care settings. Payers and providers should commit to reasonable clinical and cost goals, and share resources to minimize expenses and financial risks. It is in the interest of payers to work closely with providers on risk-management strategies because insurers need synergy with ACOs to remain cost competitive. It is in the interest of ACOs to work collaboratively with payers early on to develop reasonable and effective performance benchmarks. Hence, it is essential to have payer interoperability and data sharing integrated in an ACO model.

  8. Interoperable Access to NCAR Research Data Archive Collections

    Science.gov (United States)

    Schuster, D.; Ji, Z.; Worley, S. J.; Manross, K.

    2014-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA) provides free access to 600+ observational and gridded dataset collections. The RDA is designed to support atmospheric and related sciences research, is updated frequently where datasets have ongoing production, and serves data to 10,000 unique users annually. The traditional data access options include web-based direct archive file downloads, user-selected data subsets and format conversions produced by server-side computations, and client- and cURL-based APIs for routine scripted data retrieval. To enhance user experience and utility, the RDA now also offers THREDDS Data Server (TDS) access for many highly valued dataset collections. TDS-offered datasets are presented as aggregations, enabling users to access an entire dataset collection, which can comprise thousands of files, through a single virtual file. The OPeNDAP protocol, supported by the TDS, allows compatible tools to open and access these virtual files remotely, making the native data file format transparent to the end user. The combined functionality (TDS/OPeNDAP) gives users the ability to browse, select, visualize, and download data from a complete dataset collection without having to transfer archive files to a local host. This presentation will review the TDS basics and describe the specific TDS implementation on the RDA's diverse archive of GRIB-1, GRIB-2, and gridded NetCDF formatted dataset collections. Potential future TDS implementation on in-situ observational dataset collections will be discussed. Illustrative sample cases will be used to highlight the benefits end users gain from this interoperable data access to the RDA.
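    Subsetting a TDS aggregation over OPeNDAP amounts to appending a constraint expression to the dataset URL. The `[start:stride:stop]` index syntax below is standard DAP2; the dataset URL and variable name are hypothetical examples, not actual RDA identifiers.

```python
def dap_subset(base_url, var, *dims):
    """Build an OPeNDAP (DAP2) subset URL: var[start:stride:stop] per dimension."""
    constraint = var + "".join(f"[{a}:{s}:{b}]" for a, s, b in dims)
    return f"{base_url}.dods?{constraint}"

# Hypothetical aggregation URL and variable name, for illustration only:
# one time step, the first 181 latitudes, all 360 longitudes.
url = dap_subset("https://example.ucar.edu/thredds/dodsC/some/agg",
                 "TMP_ISBL", (0, 1, 0), (0, 1, 180), (0, 1, 359))
print(url)
```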

  9. A Prototype Ontology Tool and Interface for Coastal Atlas Interoperability

    Science.gov (United States)

    Wright, D. J.; Bermudez, L.; O'Dea, L.; Haddad, T.; Cummins, V.

    2007-12-01

    While significant capacity has been built in the field of web-based coastal mapping and informatics in the last decade, little has been done to take stock of the implications of these efforts or to identify best practice in terms of taking lessons learned into consideration. This study reports on the second of two transatlantic workshops that bring together key experts from Europe, the United States and Canada to examine state-of-the-art developments in coastal web atlases (CWA), based on web-enabled geographic information systems (GIS), along with future needs in mapping and informatics for the coastal practitioner community. While multiple benefits are derived from these tailor-made atlases (e.g. speedy access to multiple sources of coastal data and information; economic use of time by avoiding individual contact with different data holders), the potential exists to derive added value from the integration of disparate CWAs, to optimize decision-making at a variety of levels and across themes. The second workshop focused on the development of a strategy to make coastal web atlases interoperable by way of controlled vocabularies and ontologies. The strategy is based on a web service oriented architecture and an implementation of Open Geospatial Consortium (OGC) web services, such as Web Feature Services (WFS) and Web Map Services (WMS). Atlases publish catalogues via the Catalogue Service for the Web (CSW) using ISO 19115 metadata and controlled vocabularies encoded as Uniform Resource Identifiers (URIs). URIs allow the terminology of each atlas to be uniquely identified and facilitate mapping of terminologies using semantic web technologies. A domain ontology was also created to formally represent coastal erosion terminology as a use case, with a test linkage of those terms between the Marine Irish Digital Atlas and the Oregon Coastal Atlas. A web interface is being developed to discover coastal hazard themes in distributed coastal atlases as part of a broader International Coastal

  10. Improving the interoperability of biomedical ontologies with compound alignments.

    Science.gov (United States)

    Oliveira, Daniela; Pesquita, Catia

    2018-01-09

    Ontologies are commonly used to annotate and help process life sciences data. Although their original goal is to facilitate integration and interoperability among heterogeneous data sources, when these sources are annotated with distinct ontologies, bridging this gap can be challenging. In the last decade, ontology matching systems have been evolving and are now capable of producing high-quality mappings for life sciences ontologies, usually limited to the equivalence between two ontologies. However, life sciences research is becoming increasingly transdisciplinary and integrative, fostering the need to develop matching strategies that are able to handle multiple ontologies and more complex relations between their concepts. We have developed ontology matching algorithms that are able to find compound mappings between multiple biomedical ontologies, in the form of ternary mappings, finding for instance that "aortic valve stenosis" (HP:0001650) is equivalent to the intersection of "aortic valve" (FMA:7236) and "constricted" (PATO:0001847). The algorithms take advantage of search space filtering based on partial mappings between ontology pairs, to be able to handle the increased computational demands. The evaluation of the algorithms has shown that they are able to produce meaningful results, with precision in the range of 60-92% for new mappings. The algorithms were also applied to the potential extension of logical definitions of the OBO and the matching of several plant-related ontologies. This work is a first step towards finding more complex relations between multiple ontologies. The evaluation shows that the results produced are significant and that the algorithms could satisfy specific integration needs.
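    A ternary compound mapping like the abstract's example can be represented as a source term paired with an intersection of target terms. The data structure and confidence value below are invented for illustration; only the three ontology identifiers come from the abstract.

```python
# Invented representation of a ternary compound mapping:
# (source term, (target terms whose intersection it equals), confidence).
compound_mappings = [
    ("HP:0001650", ("FMA:7236", "PATO:0001847"), 0.85),  # confidence is made up
]

def translate(term, mappings):
    """Rewrite a term as an intersection of terms from other ontologies."""
    for source, targets, _confidence in mappings:
        if source == term:
            return " AND ".join(targets)
    return term  # no compound mapping known; keep the original term

print(translate("HP:0001650", compound_mappings))  # FMA:7236 AND PATO:0001847
```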

  11. Characterization of supercapacitors matrix

    Energy Technology Data Exchange (ETDEWEB)

    Sakka, Monzer Al, E-mail: Monzer.Al.Sakka@vub.ac.b [Vrije Universiteit Brussel, pleinlaan 2, B-1050 Brussels (Belgium); FEMTO-ST Institute, ENISYS Department, FCLAB, UFC-UTBM, bat.F, 90010 Belfort (France); Gualous, Hamid, E-mail: Hamid.Gualous@unicaen.f [Laboratoire LUSAC, Universite de Caen Basse Normandie, Rue Louis Aragon - BP 78, 50130 Cherbourg-Octeville (France); Van Mierlo, Joeri [Vrije Universiteit Brussel, pleinlaan 2, B-1050 Brussels (Belgium)

    2010-10-30

    This paper deals with the characterization of a supercapacitor matrix. In order to cut off transient power peaks and to compensate for the intrinsic limitations of embedded sources, supercapacitors are well suited as a storage system because of their appropriate electrical characteristics (huge capacitance, small series resistance, high specific energy, high specific power), direct storage (energy ready for use), and easy control by power electronic conversion. This use requires supercapacitor modules in which several cells are connected in series and/or in parallel, and thus a bypass system to balance the charging and discharging of the supercapacitors. In the matrix of supercapacitors considered here, six series-connected elements, each consisting of three parallel BCAP0350 supercapacitors, have been used. This topology reduces the number of bypass circuits and can work in degraded mode: it makes the system more reliable by continuing to supply the load even when one or more cells fail. Simulation and experimental results are presented and discussed.

  12. Characterization of supercapacitors matrix

    International Nuclear Information System (INIS)

    Sakka, Monzer Al; Gualous, Hamid; Van Mierlo, Joeri

    2010-01-01

    This paper deals with the characterization of a supercapacitor matrix. In order to cut off transient power peaks and to compensate for the intrinsic limitations of embedded sources, supercapacitors are well suited as a storage system because of their appropriate electrical characteristics (huge capacitance, small series resistance, high specific energy, high specific power), direct storage (energy ready for use), and easy control by power electronic conversion. This use requires supercapacitor modules in which several cells are connected in series and/or in parallel, and thus a bypass system to balance the charging and discharging of the supercapacitors. In the matrix of supercapacitors considered here, six series-connected elements, each consisting of three parallel BCAP0350 supercapacitors, have been used. This topology reduces the number of bypass circuits and can work in degraded mode: it makes the system more reliable by continuing to supply the load even when one or more cells fail. Simulation and experimental results are presented and discussed.
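    The equivalent values of the 6-series x 3-parallel matrix follow from the usual series/parallel rules: parallel capacitances add, and series capacitances divide (the reverse holds for resistance). The 350 F cell capacitance is implied by the BCAP0350 part number; the ESR figure below is a placeholder, not a datasheet value.

```python
def matrix_equivalent(c_cell, esr_cell, n_series=6, n_parallel=3):
    """Equivalent capacitance and ESR of an n_series x n_parallel cell matrix."""
    c_element = n_parallel * c_cell      # parallel capacitances add
    r_element = esr_cell / n_parallel    # parallel resistances divide
    # Series connection: capacitance divides, resistance adds.
    return c_element / n_series, r_element * n_series

# 350 F per cell (BCAP0350); the 3.2 mOhm ESR is an invented placeholder.
c_eq, r_eq = matrix_equivalent(c_cell=350.0, esr_cell=3.2e-3)
print(c_eq)  # 175.0
```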

  13. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study of a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that establishing appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends current knowledge on business interoperability and on an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the wider network of companies that the two companies belong to (the network effect). In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and to better understand how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.
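
    The mechanism described above, in which interoperability gaps add non-value-added interaction steps, can be illustrated with a toy agent-based sketch. This is not the paper's Valorpneu model; the network, the rework rule, and the throughput measure are invented for illustration.

```python
import random

# Toy agent-based sketch (invented, not the paper's model): pairs of agents
# process orders; low business interoperability (0..1) raises the chance of
# non-value-added rework steps per order, lowering a simple throughput measure.
def simulate(network_size, interoperability, orders=1000, seed=0):
    rng = random.Random(seed)
    total_steps = 0
    for _ in range(orders):
        sender, receiver = rng.sample(range(network_size), 2)  # pick a dyad
        steps = 1  # the value-added exchange itself
        # each failed "understanding" adds a clarification/re-entry step
        while rng.random() > interoperability:
            steps += 1
        total_steps += steps
    return orders / total_steps  # share of steps that were value-added

low = simulate(10, 0.5)
high = simulate(10, 0.9)
print(round(low, 2), round(high, 2))  # higher interoperability -> higher throughput
```

    Even this crude rule reproduces the qualitative finding: raising the dyadic interoperability level raises network-level operational performance.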

  14. Interoperability as a quality label for portable & wearable health monitoring systems.

    Science.gov (United States)

    Chronaki, Catherine E; Chiarugi, Franco

    2005-01-01

    Advances in ICT, promising universal access to high-quality care, reduction of medical errors, and containment of health-care costs, have renewed interest in electronic health record (EHR) standards and resulted in comprehensive EHR adoption programs in many European states. Health cards, and in particular the European health insurance card, present an opportunity for instant cross-border access to emergency health data including allergies, medication, even a reference ECG. At the same time, research and development in miniaturized medical devices and wearable medical sensors promise continuous health monitoring in a comfortable, flexible, and fashionable way. These trends call for the seamless integration of medical devices and intelligent wearables into an active EHR, exploiting the vast information available to increase medical knowledge and establish personal wellness profiles. In a mobile connected world with empowered health consumers and fading barriers between health and healthcare, interoperability has a strong impact on consumer trust. As a result, current interoperability initiatives are extending the traditional standardization process to embrace implementation, validation, and conformance testing. In this paper, starting from the OpenECG initiative, which promotes the consistent implementation of interoperability standards in electrocardiography and supports a worldwide community with data sets, open source tools, specifications, and online conformance testing, we discuss EHR interoperability as a quality label for personalized health monitoring systems. Such a quality label would support big players and small enterprises in creating interoperable eHealth products, while opening the way for pervasive healthcare and the take-up of the eHealth market.

  15. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) supports different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications, helping data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  16. Interoperability in practice: case study of the Slovenian independence war of 1991

    Directory of Open Access Journals (Sweden)

    Vladimir Prebilič

    2015-08-01

    Full Text Available The paper examines the theory of the interoperability of armed forces through the case of the Slovenian Independence War of 1991. Although defense system interoperability is a well-established concept, there are many obstacles to its implementation, and some defense systems do not deliberately support the idea of interoperability. One such example is the total defense system of SFR Yugoslavia, which comprised two defense components: the Yugoslav People's Army (YPA) and the territorial defense structures organized by each federal republic. The question of interoperability is highly relevant since the war was fought between the YPA and the defense forces of the newly proclaimed independent state, Slovenia, which had been partners in the total defense concept. Given the clear asymmetry, interoperability offered a great advantage in the independence war. The Slovenian defense forces combined three structures: the former militia as an internal security element, the territorial defense as a military component, and the national protection forces as a "civil" defense element. Although each structure had its own command and organizational structure, during the Slovenian War they were combined into a well-structured and organized defense element that achieved victory against a much stronger, better equipped, and better supported army.

  17. Semantic and syntactic interoperability in online processing of big Earth observation data.

    Science.gov (United States)

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements for efficiently extracting valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler of (1) technical interoperability, through a standardised interface usable by all types of clients, and (2) collaboration, allowing experts from different domains to develop complex analyses together. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. This allows even non-experts to extract valuable information from EO data, because data management, low-level interactions and software-specific issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, demonstrating the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
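
    The data-cube idea above can be sketched minimally: a "world model" expressed as a formal rule applied cell-by-cell over a (time, y, x) cube of classified observations. The class codes and the change rule below are invented for illustration; a real system would run this server-side behind a WPS interface.

```python
# Minimal sketch of a semantic rule over a (t, y, x) data cube.
# Class codes and the "change" rule are invented for illustration.
cube = [  # two time steps of a 3x3 classified grid
    [[1, 1, 2],
     [1, 2, 2],
     [3, 3, 3]],
    [[1, 2, 2],
     [2, 2, 2],
     [3, 3, 1]],
]

def changed_cells(cube, t0, t1):
    """World-model rule: report cells whose class label differs between epochs."""
    out = []
    for y, row in enumerate(cube[t0]):
        for x, v in enumerate(row):
            if cube[t1][y][x] != v:
                out.append((y, x, v, cube[t1][y][x]))
    return out

print(changed_cells(cube, 0, 1))  # [(0, 1, 1, 2), (1, 0, 1, 2), (2, 2, 3, 1)]
```

    Keeping the rule separate from the storage layer is what lets domain experts contribute world models without touching data management.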

  18. Interoperability between OPC UA and AutomationML

    OpenAIRE

    Henßen, Robert; Schleipen, Miriam

    2014-01-01

    OPC UA (OPC Unified Architecture) is a platform-independent standard series (IEC 62541) [1], [2] for communication among industrial automation devices and systems. OPC UA is an advanced communication technology for process control, although the launching costs for the initial information model are quite high. AutomationML (Automation Markup Language) is an emerging open standard series (IEC 62714) [3], [4] for describing production plants and plant components. The goal of t...

  19. Case Series

    African Journals Online (AJOL)

    calciphylaxis is prevention through rigorous control of phosphate and calcium balance. We here present two ... The authors declared no conflict of interest. Introduction. Calciphylaxis is a rare but serious disorder .... were reported to resolve the calciphylaxis lesions in a chronic renal failure patient [20]. In a series of five.

  20. Fourier Series

    Indian Academy of Sciences (India)

    polynomials are dense in the class of continuous functions! The body of literature dealing with Fourier series has reached epic proportions over the last two centuries. We have only given the readers an outline of the topic in this article. For the full length episode we refer the reader to the monumental treatise of. A Zygmund.

  1. Case series

    African Journals Online (AJOL)

    abp

    13 Oct. 2017 ... This is an Open Access article distributed under the terms of the Creative Commons Attribution ... Bifocal leg fractures pose many challenges for the surgeon due to .... In our series, the infection rate remained within a

  2. Fourier Series

    Indian Academy of Sciences (India)

    The theory of Fourier series deals with periodic functions. By a periodic ..... including Dirichlet, Riemann and Cantor occupied themselves with the problem of ... to converge only on a set which is negligible in a certain sense (i.e. of measure ...

  3. case series

    African Journals Online (AJOL)

    Administrator

    Key words: Case report, case series, concept analysis, research design. African Health Sciences 2012; 12(4): 557 - 562 http://dx.doi.org/10.4314/ahs.v12i4.25. PO Box 17666 .... According to the latest version of the Dictionary of. Epidemiology ...

  4. Development and characterization of Powder Metallurgy (PM) 2XXX series Al alloy products and Metal Matrix Composite (MMC) 2XXX Al/SiC materials for high temperature aircraft structural applications

    Science.gov (United States)

    Chellman, D. J.; Gurganus, T. B.; Walker, J. A.

    1992-01-01

    The results of a series of material studies performed by the Lockheed Aeronautical Systems Company over the period from 1980 to 1991 are discussed. The technical objective of these evaluations was to develop and characterize advanced aluminum alloy materials with temperature capabilities extending to 350 F. An overview is given of the first five alloy development efforts under this contract; prior work conducted during the first five modifications of the alloy development program is listed. Recent developments based on the addition of high Zr levels to an optimum Al-Cu-Mg alloy composition by powder metallurgy processing are discussed. Both unreinforced alloys and SiC- or B4C-ceramic-reinforced alloys were explored to achieve specific target goals for high-temperature aluminum alloy applications.

  5. Enabling interoperability in Geoscience with GI-suite

    Science.gov (United States)

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each focusing on a specific functionality: discovery, access and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) required to flow seamlessly from the providers to the consumers. Many international and community standards are now supported by GI-suite, making possible its successful deployment in many international projects and initiatives (such as GEOSS, NSF BCube and several EU-funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, GeoNetwork, THREDDS Data Server, Hyrax Server, etc.); support for each individual standard is provided by specific GI-suite components called accessors. On the consumer side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, OpenLayers, OGC W*S, OAI-PMH clients, etc.); support for each individual standard is provided by specific profiler components. GI-suite can be used in different scenarios by different actors:
    - A data provider with a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access).
    - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that already publish data according to well-known standards.
    - A community can deploy and configure GI-suite to build a community- (or project-) specific broker: GI-suite can broker a set of community-related repositories and
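
    The accessor/profiler pattern described above can be sketched in a few lines. The class and function names here are invented for illustration and are not the GI-suite API: accessors normalize heterogeneous provider records into a common internal model, and profilers re-expose that model in consumer formats.

```python
# Sketch of the brokering pattern (invented names, not the GI-suite API):
# accessors map provider records into a common internal metadata model;
# profilers serialize that model for a given consumer protocol.
class Broker:
    def __init__(self):
        self.accessors = {}   # provider protocol -> normalizing function
        self.profilers = {}   # consumer protocol -> serializing function

    def register_accessor(self, protocol, fn):
        self.accessors[protocol] = fn

    def register_profiler(self, protocol, fn):
        self.profilers[protocol] = fn

    def mediate(self, src_protocol, record, dst_protocol):
        common = self.accessors[src_protocol](record)  # provider -> common model
        return self.profilers[dst_protocol](common)    # common model -> consumer

broker = Broker()
broker.register_accessor("oai_dc", lambda r: {"title": r["dc:title"]})
broker.register_profiler("json", lambda m: {"title": m["title"]})
print(broker.mediate("oai_dc", {"dc:title": "Sea surface temperature"}, "json"))
```

    Adding support for a new standard then means registering one more accessor or profiler, without touching the existing ones.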

  6. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantic aspects of 3D city models, their structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has come into use worldwide: tools from notable companies in the geospatial field provide CityGML interfaces, and many applications and projects use the standard. CityGML also has a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency response, or energy-related applications as well as for visualization, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research.
Furthermore, its
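
    The semantic (as opposed to purely graphical) nature of such models can be illustrated with a small parse. The fragment below is a simplified, namespace-free stand-in; real CityGML documents use GML namespaces and much richer geometry, so this is illustration only.

```python
import xml.etree.ElementTree as ET

# Simplified, namespace-free sketch of querying semantic attributes from a
# CityGML-like fragment (real CityGML uses GML namespaces; illustration only).
doc = """
<CityModel>
  <Building id="b1"><function>residential</function><measuredHeight>9.5</measuredHeight></Building>
  <Building id="b2"><function>office</function><measuredHeight>24.0</measuredHeight></Building>
</CityModel>
"""

root = ET.fromstring(doc)
heights = {b.get("id"): float(b.find("measuredHeight").text) for b in root.iter("Building")}
offices = [b.get("id") for b in root.iter("Building") if b.find("function").text == "office"]
print(heights)   # {'b1': 9.5, 'b2': 24.0}
print(offices)   # ['b2']
```

    Queries like "all office buildings above a given height" are exactly what a purely graphical format (KML, VRML, X3D) cannot answer, since it carries no thematic attributes.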

  7. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
    This makes it easy to join, extend, and combine datasets and hence work collectively, but
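
    The idea of a dataset whose descriptor setup is fully declared can be sketched as follows. The element and attribute names below are invented for illustration and are not the actual QSAR-ML schema; the point is that descriptor identity and implementation version travel with the data.

```python
import xml.etree.ElementTree as ET

# Hedged sketch of reading a QSAR-ML-style dataset: element and attribute
# names are invented, not the real schema. Versioned descriptor metadata
# is what makes the dataset setup reproducible.
doc = """
<qsardataset>
  <descriptor id="XLogP" implementation="cdk" version="1.2.3"/>
  <compound id="c1"><value descriptor="XLogP">2.17</value></compound>
  <compound id="c2"><value descriptor="XLogP">0.45</value></compound>
</qsardataset>
"""

root = ET.fromstring(doc)
descriptors = {d.get("id"): d.get("version") for d in root.iter("descriptor")}
rows = {c.get("id"): float(c.find("value").text) for c in root.iter("compound")}
print(descriptors)  # which descriptor implementation/version produced the values
print(rows)         # compound id -> descriptor value
```

    A plain CSV export would keep only `rows`; the XML form preserves `descriptors` too, which is the part needed to reproduce or validate the setup.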

  8. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. 
    This makes it easy to join

  9. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    A method for automatically translating a building architecture file format (Industry Foundation Classes) to a simulation file may, in one aspect, extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data stored in them. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function, converting the extracted data to the correct geometric values needed for the target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may then be generated.
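
    The extract-map-transform pipeline in the abstract can be sketched with simplified data structures. The entity records, the mapping table standing in for a Model View Definition, and the unit conversions below are all invented for illustration; real IFC and MVD formats are far richer.

```python
# Sketch of the translation pipeline (invented, simplified structures; not
# actual IFC or MVD formats): extract entities from an architecture model,
# map each through a Model-View-Definition-style table, and emit records
# carrying the geometric values a simulator needs.
building = [
    {"type": "IfcWall", "length_mm": 4000, "height_mm": 2700},
    {"type": "IfcWindow", "width_mm": 1200, "height_mm": 1400},
]

mvd = {  # source type -> (target type, transformation function)
    "IfcWall": ("Surface", lambda e: {"area_m2": e["length_mm"] * e["height_mm"] / 1e6}),
    "IfcWindow": ("Glazing", lambda e: {"area_m2": e["width_mm"] * e["height_mm"] / 1e6}),
}

def translate(entities, mvd):
    out = []
    for e in entities:
        target, fn = mvd[e["type"]]
        rec = {"type": target}
        rec.update(fn(e))  # convert mm dimensions to the m^2 values the simulator expects
        out.append(rec)
    return out

print(translate(building, mvd))
```

    Keeping the mapping in a data structure rather than in code is what lets one translator serve several target simulation tools.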

  10. Inter-operator Variability in Defining Uterine Position Using Three-dimensional Ultrasound Imaging

    DEFF Research Database (Denmark)

    Baker, Mariwan; Jensen, Jørgen Arendt; Behrens, Claus F.

    2013-01-01

    In radiotherapy the treatment outcome of gynecological (GYN) cancer patients is crucially related to reproducibility of the actual uterine position. The purpose of this study is to evaluate the inter-operator variability in addressing uterine position using a novel 3-D ultrasound (US) system. The study is initiated by US-scanning of a uterine phantom (CIRS 404, Universal Medical, Norwood, USA) by seven experienced US operators. The phantom represents a female pelvic region, containing a uterus, bladder and rectal landmarks readily definable in the acquired US-scans. The organs are subjected to significantly larger inter-fractional uterine positional displacement, in some cases up to 20 mm, which outweighs the magnitude of current inter-operator variations. Thus, the current US-phantom-study suggests that the inter-operator variability in addressing uterine position is clinically irrelevant.
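
    The comparison underlying the conclusion can be illustrated numerically. The seven operator readings below are invented for illustration; only the seven-operator setup and the up-to-20 mm inter-fractional displacement figure come from the abstract.

```python
import statistics

# Toy illustration (invented readings): each of seven operators reports the
# uterine reference position (mm) from the same phantom scan. The spread
# across operators is compared with the inter-fractional displacement scale.
operator_positions_mm = [101.8, 102.4, 101.5, 102.1, 101.9, 102.3, 101.6]

inter_operator_sd = statistics.stdev(operator_positions_mm)
inter_fractional_displacement_mm = 20.0  # worst case quoted in the abstract

print(round(inter_operator_sd, 2))
print(inter_operator_sd < inter_fractional_displacement_mm)  # variability is small by comparison
```

    The clinical argument is this comparison of scales: sub-millimetre operator disagreement is negligible next to centimetre-scale day-to-day organ motion.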

  11. Assessment of Collaboration and Interoperability in an Information Management System to Support Bioscience Research

    Science.gov (United States)

    Myneni, Sahiti; Patel, Vimla L.

    2009-01-01

    Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900

  12. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    Science.gov (United States)

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; McGilchrist, Mark; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipak

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross-border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from evaluating the EHR4CR platform, accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries, demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of cross-border semantic integration of data. PMID:27570649

  13. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    Science.gov (United States)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high-quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project seeks to address this requirement by bringing together selected regional marine e-infrastructures for the purpose of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of the individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them.

  14. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    Science.gov (United States)

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that allows several data models from different domains to be transformed, mainly by deploying inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
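
    The core of such a transformation can be sketched with a few common mapping rules: tables become OWL classes, columns become datatype properties, and foreign keys become object properties. The schema and the Turtle-style output below are invented for illustration and simplify the paper's algorithm (notably its inheritance rules).

```python
# Minimal sketch of a relational-schema-to-OWL mapping (invented schema,
# simplified rules; not the paper's full algorithm): tables -> classes,
# columns -> datatype properties, foreign keys -> object properties.
schema = {
    "Patient": {"columns": ["id", "name"], "fks": {}},
    "Encounter": {"columns": ["id", "date"], "fks": {"patient_id": "Patient"}},
}

def to_owl(schema, base="http://example.org/onto#"):
    lines = []
    for table, spec in schema.items():
        lines.append(f"<{base}{table}> rdf:type owl:Class .")
        for col in spec["columns"]:
            lines.append(f"<{base}{table}.{col}> rdf:type owl:DatatypeProperty ;"
                         f" rdfs:domain <{base}{table}> .")
        for fk, target in spec["fks"].items():
            lines.append(f"<{base}{table}.{fk}> rdf:type owl:ObjectProperty ;"
                         f" rdfs:domain <{base}{table}> ; rdfs:range <{base}{target}> .")
    return "\n".join(lines)

print(to_owl(schema))
```

    Once the schema is lifted into OWL this way, standard reasoners and mediation services can operate over it, which is where the semantic-interoperability benefit comes from.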

  15. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    Science.gov (United States)

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  16. Towards multi-layer interoperability of heterogeneous IoT platforms : the INTER-IoT approach

    NARCIS (Netherlands)

    Fortino, Giancarlo; Savaglio, Claudio; Palau, Carlos E.; de Puga, Jara Suarez; Ghanza, Maria; Paprzycki, Marcin; Montesinos, Miguel; Liotta, Antonio; Llop, Miguel; Gravina, R.; Palau, C.E.; Manso, M.; Liotta, A.; Fortino, G.

    2018-01-01

    Open interoperability delivers on the promise of enabling vendors and developers to interact and interoperate, without interfering with anyone’s ability to compete by delivering a superior product and experience. In the absence of global IoT standards, the INTER-IoT voluntary approach will support

  17. Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI

    NARCIS (Netherlands)

    Miao, Yongwu; Boon, Jo; Van der Klink, Marcel; Sloep, Peter; Koper, Rob

    2009-01-01

    Miao, Y., Boon, J., Van der Klink, M., Sloep, P. B., & Koper, R. (2011). Support interoperability and reusability of emerging forms of assessment: Some issues on integrating IMS LD with IMS QTI. In F. Lazarinis, S. Green, & E. Pearson (Eds.), E-Learning Standards and Interoperability: Frameworks

  18. Improved 206Pb/238U microprobe geochronology by the monitoring of a trace-element-related matrix effect; SHRIMP, ID-TIMS, ELA-ICP-MS and oxygen isotope documentation for a series of zircon standards

    Science.gov (United States)

    Black, L.P.; Kamo, S.L.; Allen, C.M.; Davis, D.W.; Aleinikoff, J.N.; Valley, J.W.; Mundil, R.; Campbell, I.H.; Korsch, R.J.; Williams, I.S.; Foudoulis, C.

    2004-01-01

    Precise isotope dilution-thermal ionisation mass spectrometry (ID-TIMS) documentation is given for two new Palaeozoic zircon standards (TEMORA 2 and R33). These data, in combination with results for previously documented standards (AS3, SL13, QGNG and TEMORA 1), provide the basis for a detailed investigation of inconsistencies in 206Pb/238U ages measured by microprobe. Although these ages are normally consistent between any two standards, their relative age offsets are often different from those established by ID-TIMS. This is true for both sensitive high-resolution ion-microprobe (SHRIMP) and excimer laser ablation-inductively coupled plasma-mass spectrometry (ELA-ICP-MS) dating, although the age offsets are in the opposite sense for the two techniques. Various factors have been investigated for possible correlations with age bias, in an attempt to resolve why the accuracy of the method is worse than the indicated precision. Crystallographic orientation, position on the grain-mount and oxygen isotopic composition are unrelated to the bias. There are, however, striking correlations between the 206Pb/238U age offsets and P, Sm and, most particularly, Nd abundances in the zircons. Although these are not believed to be the primary cause of this apparent matrix effect, they indicate that ionisation of 206Pb/238U is influenced, at least in part, by a combination of trace elements. Nd is sufficiently representative of the controlling trace elements that it provides a quantitative means of correcting for the microprobe age bias. This approach has the potential to reduce age biases associated with different techniques, different instrumentation and different standards within and between laboratories. Crown Copyright © 2004 Published by Elsevier B.V. All rights reserved.
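
    The shape of an Nd-based bias correction can be sketched as follows. This is a hypothetical toy model: the linear form, the coefficient, and the reference abundance are all invented for illustration and do not reproduce the paper's actual calibration.

```python
# Hypothetical sketch of a trace-element (Nd) based bias correction for
# microprobe 206Pb/238U ages. The linear form, coefficient k_per_ppm and
# reference abundance nd_ref_ppm are invented; the paper's real calibration
# is not reproduced here.
def corrected_age(measured_age_ma, nd_ppm, k_per_ppm=0.0004, nd_ref_ppm=10.0):
    """Scale a measured age by an Nd-dependent bias factor (toy model)."""
    bias = 1.0 + k_per_ppm * (nd_ppm - nd_ref_ppm)
    return measured_age_ma / bias

# In this toy model a high-Nd grain reads slightly old and is corrected down:
age = corrected_age(420.0, 60.0)
print(round(age, 2))  # below the measured 420 Ma
```

    The practical point of such a correction is that it can be applied per grain from a routinely measured trace-element abundance, harmonising ages across standards and instruments.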

  19. Exposures series

    OpenAIRE

    Stimson, Blake

    2011-01-01

    Reaktion Books’ Exposures series, edited by Peter Hamilton and Mark Haworth-Booth, is comprised of 13 volumes and counting, each less than 200 pages with 80 high-quality illustrations in color and black and white. Currently available titles include Photography and Australia, Photography and Spirit, Photography and Cinema, Photography and Literature, Photography and Flight, Photography and Egypt, Photography and Science, Photography and Africa, Photography and Italy, Photography and the USA, P...

  20. A versatile and interoperable network sensors for water resources monitoring

    Science.gov (United States)

    Ortolani, Alberto; Brandini, Carlo; Costantini, Roberto; Costanza, Letizia; Innocenti, Lucia; Sabatini, Francesco; Gozzini, Bernardo

    2010-05-01

Monitoring systems that assess water resource quantity and quality rely on extensive in-situ measurements, which have significant limitations: data can be difficult to access and share, and sensor networks are hard to customise and reconfigure to meet end-user needs during monitoring or crisis phases. To address these limitations, Sensor Web Enablement technologies for sensor management have been developed and applied to different environmental contexts under the EU-funded OSIRIS project (Open architecture for Smart and Interoperable networks in Risk management based on In-situ Sensors, www.osiris-fp6.eu). The main objective of OSIRIS was to create a monitoring system able to manage different environmental crisis situations through an efficient data-processing chain in which in-situ sensors are connected via an intelligent and versatile network infrastructure (based on web technologies) that enables end-users to remotely access multi-domain sensor information. One of the project applications focused on underground fresh-water monitoring and management. To this end, a monitoring system that continuously and automatically checks water quality and quantity was designed and built in a pilot test area, a portion of the Amiata aquifer feeding the Santa Fiora springs (Grosseto, Italy). This aquifer has characteristics that make it highly vulnerable under some conditions. It is a volcanic aquifer with a fractured structure. Its volcanic nature causes arsenic concentrations at Santa Fiora that are normally very close to the legal threshold but sometimes exceed it, for reasons still not fully understood. The presence of fractures makes the infiltration rate very inhomogeneous from place to place and very high near large fractures.
In case of liquid-pollutant spills (typically hydrocarbon spills from tanker accidents or leakage from household tanks containing fuel for heating), these fractures can act

  1. Towards Data Repository Interoperability: The Data Conservancy Data Packaging Specification

    Science.gov (United States)

    DiLauro, T.; Duerr, R.; Thessen, A. E.; Rippin, M.; Pralle, B.; Choudhury, G. S.

    2013-12-01

    description, the DCS instance will be able to provide default mappings for the directories and files within the package payload and enable support for deposited content at a lower level of service. Internally, the DCS will map these hybrid package serializations to its own internal business objects and their properties. Thus, this approach is highly extensible, as other packaging formats could be mapped in a similar manner. In addition, this scheme supports establishing the fixity of the payload while still supporting update of the semantic overlay data. This allows a data producer with scarce resources or an archivist who acquires a researcher's data to package the data for deposit with the intention of augmenting the resource description in the future. The Data Conservancy is partnering with the Sustainable Environment Actionable Data[4] project to test the interoperability of this new packaging mechanism. [1] Data Conservancy: http://dataconservancy.org/ [2] BagIt: https://datatracker.ietf.org/doc/draft-kunze-bagit/ [3] OAI-ORE: http://www.openarchives.org/ore/1.0/ [4] SEAD: http://sead-data.net/

  2. Use of Annotations for Component and Framework Interoperability

    Science.gov (United States)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
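The annotation idea described above can be illustrated with a minimal sketch. The sketch below uses Python decorators as a stand-in for the framework's Java annotations; the `component` decorator, the metadata keys, and the `RunoffComponent` class are all hypothetical names invented for illustration, not taken from OMS 3.0 or PRMS.

```python
def component(inputs=None, outputs=None):
    """Attach framework-readable metadata to a model component class,
    mimicking annotation-based declaration of inputs and outputs."""
    def wrap(cls):
        cls._meta = {"inputs": inputs or {}, "outputs": outputs or {}}
        return cls
    return wrap

@component(inputs={"precip_mm": "mm/day", "temp_c": "degC"},
           outputs={"runoff_mm": "mm/day"})
class RunoffComponent:
    def execute(self, precip_mm, temp_c):
        # Toy water-balance rule, purely illustrative
        return {"runoff_mm": max(0.0, 0.4 * precip_mm - 0.1 * max(temp_c, 0.0))}

def document(cls):
    """A framework can generate documentation from the metadata alone,
    without running the component (cf. automated model documentation)."""
    m = cls._meta
    lines = [f"Component {cls.__name__}"]
    lines += [f"  in : {k} [{u}]" for k, u in m["inputs"].items()]
    lines += [f"  out: {k} [{u}]" for k, u in m["outputs"].items()]
    return "\n".join(lines)

print(document(RunoffComponent))
print(RunoffComponent().execute(10.0, 5.0))
```

Because the declarations are data attached to the class rather than code, the same metadata can drive model assembly, dataflow analysis, documentation, and testing, which is the multi-purpose value the abstract describes.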

  3. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  4. An ontology for regulating eHealth interoperability in developing African countries

    CSIR Research Space (South Africa)

    Moodley, D

    2013-08-01

eHealth governance and regulation are necessary in low resource African countries to ensure effective and equitable use of health information technology and to realize national eHealth goals such as interoperability, adoption of standards and data...

  5. A Proposed Engineering Process and Prototype Toolset for Developing C2-to-Simulation Interoperability Solutions

    NARCIS (Netherlands)

    Gautreau, B.; Khimeche, L.; Reus, N.M. de; Heffner, K.; Mevassvik, O.M.

    2014-01-01

    The Coalition Battle Management Language (C-BML) is an open standard being developed for the exchange of digitized military information among command and control (C2), simulation and autonomous systems by the Simulation Interoperability Standards Organization (SISO). As the first phase of the C-BML

  6. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

Purpose: The present paper aims to describe an approach for building Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one use of these rules: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  7. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    Science.gov (United States)

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

The objective is to introduce the 'clinical archetype', a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also presents the challenges of building quality-labelled clinical archetypes and of achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes, to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.

  8. Analysis of interoperability requirements and of relevant activities in standards bodies and fora

    DEFF Research Database (Denmark)

    Guarneri, R.; Skouby, Knud Erik; Falch, Morten

    1999-01-01

    The aim of this deliverable is to provide the summary of the standardisation activities considered of most relevance for the work of the PRIME project with respect to interoperability; this information is of prime importance for the planning of further PRIME technical work in this area....

  9. OpenICE medical device interoperability platform overview and requirement analysis.

    Science.gov (United States)

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  10. Model-based prototyping of an interoperability protocol for mobile ad-hoc networks

    NARCIS (Netherlands)

    Kristensen, L.M.; Westergaard, M.; Norgaard, P.C.; Romijn, J.; Smith, G.; Pol, van de J.

    2005-01-01

    We present an industrial project conducted at Ericsson Danmark A/S, Telebit where formal methods in the form of Coloured Petri Nets (CP-nets or CPNs) have been used for the specification of an interoperability protocol for routing packets between fixed core networks and mobile ad-hoc networks. The

  11. Interoperability Issues for Formal Authoring Processes, Community Efforts, and the Creation of Mashup PLE

    NARCIS (Netherlands)

    Klemke, Roland; Schmitz, Birgit

    2009-01-01

    Klemke, R., & Schmitz, B. (2009). Interoperability Issues for Formal Authoring Processes, Community Efforts, and the Creation of Mashup PLE. In F. Wild, M. Kalz, M. Palmér & D. Müller (Eds.), Proceedings of 2nd Workshop Mash-Up Personal Learning Environments (MUPPLE'09). Workshop in conjunction with

  12. Chief Information Officer's Role in Adopting an Interoperable Electronic Health Record System for Medical Data Exchange

    Science.gov (United States)

    Akpabio, Akpabio Enebong Ema

    2013-01-01

Despite huge growth in hospital technology systems, there remains a dearth of literature examining health care administrators' perceptions of the efficacy of interoperable EHR systems. A qualitative research methodology was used in this multiple-case study to investigate the application of diffusion of innovations theory and the technology…

  13. CopperCore Service Integration, Integrating IMS Learning Design and IMS Question and Test Interoperability

    NARCIS (Netherlands)

    Vogten, Hubert; Martens, Harrie; Nadolski, Rob; Tattersall, Colin; Van Rosmalen, Peter; Koper, Rob

    2006-01-01

    Vogten, H., Martens, H., Nadolski, R., Tattersall, C., Rosmalen, van, P., Koper, R., (2006). CopperCore Service Integration, Integrating IMS Learning Design and IMS Question and Test Interoperability. Proceedings of the 6th IEEE International Conference on Advanced Learning Technologies (pp.

  14. Integrating IMS Learning Design and IMS Question and Test Interoperability using CopperCore Service Integration

    NARCIS (Netherlands)

    Vogten, Hubert; Martens, Harrie; Nadolski, Rob; Tattersall, Colin; Van Rosmalen, Peter; Koper, Rob

    2006-01-01

    Please, cite this publication as: Vogten, H., Martens, H., Nadolski, R., Tattersall, C., van Rosmalen, P., & Koper, R. (2006). Integrating IMS Learning Design and IMS Question and Test Interoperability using CopperCore Service Integration. Proceedings of International Workshop in Learning Networks

  15. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    Science.gov (United States)

    2018-01-01

The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all mutually interoperable is starting, including automated workflows to process related data from different sources.

  16. A Review of Interoperability Standards in E-health and Imperatives for their Adoption in Africa

    Directory of Open Access Journals (Sweden)

    Funmi Adebesin

    2013-07-01

The ability of healthcare information systems to share and exchange information (interoperate) is essential to facilitate the quality and effectiveness of healthcare services. Although standardization is considered key to addressing the fragmentation currently challenging the healthcare environment, e-health standardization can be difficult for many reasons, one of which is making sense of the e-health interoperability standards landscape. Specifically aimed at the African health informatics community, this paper aims to provide an overview of e-health interoperability and the significance of standardization in its achievement. We conducted a literature study of e-health standards, their development, and the degree of participation by African countries in the process. We also provide a review of a selection of prominent e-health interoperability standards that have been widely adopted especially by developed countries, look at some of the factors that affect their adoption in Africa, and provide an overview of ongoing global initiatives to address the identified barriers. Although the paper is specifically aimed at the African community, its findings would be equally applicable to many other developing countries.

  17. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

Background: With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems; interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results: In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also present the challenges encountered and discuss the chosen design solutions. Conclusions: EHR is being deployed in several countries, and the EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  18. 75 FR 81605 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Science.gov (United States)

    2010-12-28

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference December 21, 2010. Take notice that the Federal Energy Regulatory Commission will hold a Technical Conference on Monday, January 31, 2011 at the Commission's headquarters at 888 First Street,...

  19. 75 FR 66752 - Smart Grid Interoperability Standards; Notice of Technical Conference

    Science.gov (United States)

    2010-10-29

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Notice of Technical Conference October 22, 2010. Take notice that the Federal Energy Regulatory Commission (Commission) will convene a conference on November 14, 2010, from 10:30 a.m. to 11:30 a.m. Eastern time in conjunction...

  20. Proceedings of the 1st Interoperability of Enterprise Software and Applications conference

    NARCIS (Netherlands)

    Konstantas, D.; Bourrieres, J-P.; Leonard, M.; Boudjlida, N.; Unknown, [Unknown

    2005-01-01

Interoperability, the ability of a system or a product to work with other systems or products without special effort from the user, is a key issue in manufacturing and industrial enterprise generally. It is fundamental to the production of goods and services quickly and at low cost at the same time

  1. Efficiency criterion for teleportation via channel matrix, measurement matrix and collapsed matrix

    Directory of Open Access Journals (Sweden)

    Xin-Wei Zha

In this paper, three kinds of coefficient matrices (channel matrix, measurement matrix, collapsed matrix) associated with the pure state for teleportation are presented, and the general relation among channel matrix, measurement matrix and collapsed matrix is obtained. In addition, a criterion for judging whether a state can be teleported successfully is given, depending on the relation between the number of parameters of the unknown state and the rank of the collapsed matrix. Keywords: Channel matrix, Measurement matrix, Collapsed matrix, Teleportation

  2. Matrix completion by deep matrix factorization.

    Science.gov (United States)

    Fan, Jicong; Cheng, Jieyu

    2018-02-01

Conventional methods of matrix completion are linear and are not effective in handling data with nonlinear structure. Recently a few researchers have attempted to incorporate nonlinear techniques into matrix completion, but considerable limitations remain. In this paper, a novel method called deep matrix factorization (DMF) is proposed for nonlinear matrix completion. Unlike conventional matrix completion methods, which are based on linear latent variable models, DMF is based on a nonlinear latent variable model. DMF is formulated as a deep-structured neural network in which the inputs are the low-dimensional unknown latent variables and the outputs are the partially observed variables. In DMF, the inputs and the parameters of the multilayer neural network are optimized simultaneously to minimize the reconstruction error on the observed entries. The missing entries can then be recovered by propagating the latent variables to the output layer. DMF is compared with state-of-the-art linear and nonlinear matrix completion methods on toy matrix completion, image inpainting and collaborative filtering tasks. The experimental results verify that DMF provides higher matrix completion accuracy than existing methods and is applicable to large matrices. Copyright © 2017 Elsevier Ltd. All rights reserved.
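The core idea in the abstract can be sketched in a few lines of numpy: both the latent inputs and the network weights are updated by gradient descent to fit only the observed entries of a partially observed matrix, and missing entries are read off the network output. This toy single-hidden-layer version (all sizes, the learning rate, and the synthetic data are invented for illustration) is a sketch of the technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with a nonlinear low-dimensional structure
n, m, k = 40, 25, 3
X = np.tanh(rng.normal(size=(n, k)) @ rng.normal(size=(k, m)))
mask = rng.random((n, m)) < 0.7              # ~70% of entries observed
n_obs = mask.sum()

# One-hidden-layer "deep" factorization: latent Z -> tanh layer -> outputs.
# Z and the weights are trained together, fitting only observed entries.
d, h, lr = 3, 16, 0.1
Z = rng.normal(scale=0.1, size=(n, d))
W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, m))

def forward(Z, W1, b1, W2):
    H = np.tanh(Z @ W1 + b1)
    return H, H @ W2

def masked_mse(Xhat):
    return float(np.sum(mask * (Xhat - X) ** 2) / n_obs)

loss0 = masked_mse(forward(Z, W1, b1, W2)[1])
for _ in range(4000):
    H, Xhat = forward(Z, W1, b1, W2)
    E = 2 * mask * (Xhat - X) / n_obs        # d(loss)/d(Xhat), zero where unobserved
    dW2 = H.T @ E
    dA = (E @ W2.T) * (1 - H ** 2)           # backprop through tanh
    dW1, db1, dZ = Z.T @ dA, dA.sum(axis=0), dA @ W1.T
    W2 -= lr * dW2; W1 -= lr * dW1; b1 -= lr * db1; Z -= lr * dZ
loss1 = masked_mse(forward(Z, W1, b1, W2)[1])

# Missing entries are recovered by propagating Z to the output layer
recovered = forward(Z, W1, b1, W2)[1][~mask]
```

With a linear activation in place of `tanh` this reduces to classical matrix factorization, which is why the nonlinearity is the essential ingredient here.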

  3. Molecular dynamics simulations of matrix assisted laser desorption ionization: Matrix-analyte interactions

    International Nuclear Information System (INIS)

    Nangia, Shivangi; Garrison, Barbara J.

    2011-01-01

    There is synergy between matrix assisted laser desorption ionization (MALDI) experiments and molecular dynamics (MD) simulations. To understand analyte ejection from the matrix, MD simulations have been employed. Prior calculations show that the ejected analyte molecules remain solvated by the matrix molecules in the ablated plume. In contrast, the experimental data show free analyte ions. The main idea of this work is that analyte molecule ejection may depend on the microscopic details of analyte interaction with the matrix. Intermolecular matrix-analyte interactions have been studied by focusing on 2,5-dihydroxybenzoic acid (DHB; matrix) and amino acids (AA; analyte) using Chemistry at HARvard Molecular Mechanics (CHARMM) force field. A series of AA molecules have been studied to analyze the DHB-AA interaction. A relative scale of AA molecule affinity towards DHB has been developed.

  4. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    International Nuclear Information System (INIS)

    Tuominen, Janne; Rasi, Teemu; Mattila, Jouni; Siuko, Mikko; Esque, Salvador; Hamilton, David

    2013-01-01

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations

  5. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Rasi, Teemu; Mattila, Jouni [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Siuko, Mikko [VTT, Technical Research Centre of Finland, Tampere (Finland); Esque, Salvador [F4E, Fusion for Energy, Torres Diagonal Litoral B3, Josep Pla2, 08019, Barcelona (Spain); Hamilton, David [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.

  6. Towards an Interoperable Field Spectroscopy Metadata Standard with Extended Support for Marine Specific Applications

    Directory of Open Access Journals (Sweden)

    Barbara A. Rasaiah

    2015-11-01

This paper presents an approach to developing robust metadata standards for specific applications that serves to ensure a high level of reliability and interoperability for a spectroscopy dataset. The challenges of designing a metadata standard that meets the unique requirements of specific user communities are examined, including in situ measurement of reflectance underwater, using coral as a case in point. Metadata schema mappings from seven existing metadata standards demonstrate that they consistently fail to meet the needs of field spectroscopy scientists for general and specific applications (μ = 22%, σ = 32% conformance with the core metadata requirements, and μ = 19%, σ = 18% for the special case of a benthic (e.g., coral) reflectance metadataset). Issues such as field measurement methods, instrument calibration, and data representativeness for marine field spectroscopy campaigns are investigated within the context of submerged benthic measurements. The implications of semantics and syntax for a robust and flexible metadata standard are also considered. A hybrid standard that serves as a "best of breed", incorporating useful modules and parameters from within the existing standards, is proposed. This paper is Part 3 in a series of papers in this journal examining the issues central to a metadata standard for field spectroscopy datasets. The results presented in this paper are an important step towards field spectroscopy metadata standards that address the specific needs of field spectroscopy data stakeholders while facilitating dataset documentation, quality assurance, discoverability and data exchange within large-scale information sharing platforms.

  7. Generating series for GUE correlators

    Science.gov (United States)

    Dubrovin, Boris; Yang, Di

    2017-11-01

    We extend to the Toda lattice hierarchy the approach of Bertola et al. (Phys D Nonlinear Phenom 327:30-57, 2016; IMRN, 2016) to computation of logarithmic derivatives of tau-functions in terms of the so-called matrix resolvents of the corresponding difference Lax operator. As a particular application we obtain explicit generating series for connected GUE correlators. On this basis an efficient recursive procedure for computing the correlators in full genera is developed.

  8. Case Series.

    Science.gov (United States)

    Vetrayan, Jayachandran; Othman, Suhana; Victor Paulraj, Smily Jesu Priya

    2017-01-01

To assess the effectiveness and feasibility of behavioral sleep intervention for medicated children with ADHD. Six medicated children (five boys, one girl; aged 6-12 years) with ADHD participated in a 4-week sleep intervention program. The main behavioral strategies used were Faded Bedtime With Response Cost (FBRC) and positive reinforcement. Within a case-series design, an objective measure (the Sleep Disturbance Scale for Children [SDSC]) and a subjective measure (sleep diaries) were used to record changes in the children's sleep. For all six children, a significant decrease was found in the severity of sleep problems (based on SDSC data). Bedtime resistance and mean sleep onset latency were reduced following the 4-week intervention program, according to sleep diary data. Gains were generally maintained at follow-up. Parents perceived the intervention as helpful. Based on these initial data, the intervention shows promise as an effective and feasible treatment.

  9. The Matrix Cookbook

    DEFF Research Database (Denmark)

    Petersen, Kaare Brandt; Pedersen, Michael Syskind

Matrix identities, relations and approximations. A desktop reference for quick overview of mathematics of matrices.

  10. Improving Patient Safety with X-Ray and Anesthesia Machine Ventilator Synchronization: A Medical Device Interoperability Case Study

    Science.gov (United States)

    Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup

    When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine into a prototype plug-and-play medical device system. This work assists ongoing interoperability framework standards development efforts to define functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.

  11. Carbonate fuel cell matrix

    Science.gov (United States)

    Farooque, Mohammad; Yuh, Chao-Yi

    1996-01-01

    A carbonate fuel cell matrix comprising support particles and crack attenuator particles which are made platelet in shape to increase the resistance of the matrix to through cracking. Also disclosed is a matrix having porous crack attenuator particles and a matrix whose crack attenuator particles have a thermal coefficient of expansion which is significantly different from that of the support particles, and a method of making platelet-shaped crack attenuator particles.

  12. Matrix with Prescribed Eigenvectors

    Science.gov (United States)

    Ahmad, Faiz

    2011-01-01

    It is a routine matter for undergraduates to find eigenvalues and eigenvectors of a given matrix. But the converse problem of finding a matrix with prescribed eigenvalues and eigenvectors is rarely discussed in elementary texts on linear algebra. This problem is related to the "spectral" decomposition of a matrix and has important technical…
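    The converse construction the abstract describes reduces to a spectral-style product: given prescribed eigenvectors (the columns of an invertible V) and eigenvalues, form A = V diag(λ) V⁻¹. A small sketch with illustrative values:

```python
import numpy as np

# Prescribed eigenvectors (columns of V, chosen so V is invertible) and eigenvalues
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
lam = np.array([1.0, 2.0, 3.0])

# Construct the matrix with exactly those eigenpairs: A = V diag(lam) V^{-1}
A = V @ np.diag(lam) @ np.linalg.inv(V)

# Check: A v_i = lam_i v_i for each prescribed pair
for i in range(3):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])
```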

  13. Triangularization of a Matrix

    Indian Academy of Sciences (India)

    Much of linear algebra is devoted to reducing a matrix (via similarity or unitary similarity) to another that has lots of zeros. The simplest such theorem is the Schur triangularization theorem. This says that every matrix is unitarily similar to an upper triangular matrix. Our aim here is to show that though it is very easy to prove it ...
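    A hedged numerical sketch of the theorem: the (unshifted) QR iteration drives a matrix toward upper-triangular form under orthogonal similarity, illustrating Schur triangularization for a matrix with distinct real eigenvalues. This is a demonstration, not a proof or a production algorithm:

```python
import numpy as np

A0 = np.array([[3.0, 1.0, 0.0],
               [1.0, 2.0, 1.0],
               [0.0, 1.0, 1.0]])   # distinct eigenvalues 2 ± sqrt(3) and 2

A, U = A0.copy(), np.eye(3)
for _ in range(500):
    Q, R = np.linalg.qr(A)   # factor A = Q R
    A = R @ Q                # similarity step: R Q = Q^T A Q
    U = U @ Q                # accumulate the orthogonal factor

T = U.T @ A0 @ U             # ~ upper triangular (diagonal here, since A0 is symmetric)
assert np.allclose(T, np.triu(T), atol=1e-8)
assert np.allclose(U.T @ U, np.eye(3))   # U stays orthogonal
```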

  14. DCP Series

    Directory of Open Access Journals (Sweden)

    Philip Stearns

    2011-06-01

    Photo essay. A collection of images produced by intentionally corrupting the circuitry of a Kodak DC280 2 MP digital camera. By rewiring the electronics of a digital camera, glitched images are produced in a manner that parallels chemically processing unexposed film or photographic paper to produce photographic images without exposure to light. The DCP Series of digital images are direct visualizations of data generated by a digital camera as it takes a picture. Electronic processes associated with the normal operations of the camera, which are usually taken for granted, are revealed through an act of intervention. The camera is turned inside-out through complexes of short-circuits, selected by the artist, transforming the camera from a picture-taking device to a data-capturing device that renders raw data (electronic signals) as images. In essence, these images are snap-shots of electronic signals dancing through the camera's circuits, manually rerouted, written directly to the on-board memory device. Rather than seeing images of the world through a lens, we catch a glimpse of what the camera sees when it is forced to peer inside its own mind.

  15. A New Cyber-enabled Platform for Scale-independent Interoperability of Earth Observations with Hydrologic Models

    Science.gov (United States)

    Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.

    2017-12-01

    Despite the significant potential of remotely sensed earth observations, their application is still not full-fledged in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolutions among different platforms/sources of earth observations hinder the use of these data. Available web services can help with bulk data downloading and visualization, but they are not sufficiently tailored to the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, the most straightforward workflow for educators and watershed managers is to instantaneously obtain a time series for any watershed of interest without spending time and computational resources on data download and post-processing. To address this issue, an open-access online platform named HydroGlobe has been developed that minimizes these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe provides spatially averaged time series of earth observations from the following inputs: (i) the data source, (ii) the temporal extent in the form of a start/end date, and (iii) the geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of a GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources, including surface and root-zone soil moisture from SMAP (Soil Moisture Active Passive Mission), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurement). This presentation will demonstrate the HydroGlobe interface and its applicability using a few test cases on watersheds from different parts of the globe.
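    The core reduction described above, a spatially averaged time series over a user-supplied geographic unit, can be sketched as follows. The array layout and mask are hypothetical stand-ins; the real service derives the mask from a shapefile and reads gridded satellite products:

```python
import numpy as np

# Synthetic gridded observation with axes (time, rows, cols), e.g. soil moisture
data = np.arange(2 * 3 * 3, dtype=float).reshape(2, 3, 3)

# Boolean mask of grid cells inside the sub-basin (derived from a shapefile in practice)
mask = np.zeros((3, 3), dtype=bool)
mask[0, 0] = mask[1, 1] = True

# Spatially averaged time series: mean over the masked cells at each time step
series = data[:, mask].mean(axis=1)
print(series)  # [ 2. 11.]
```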

  16. Clinical data integration model. Core interoperability ontology for research using primary care data.

    Science.gov (United States)

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trial databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework based on the local-as-view mediation paradigm. Such an approach mandates that the global information model describe the domain of interest independently of the data sources to be explored. Following a requirements analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a

  17. XML interoperability standards for seamless communication: An analysis of industry-neutral and domain-specific initiatives

    NARCIS (Netherlands)

    Chituc, C.M.

    2017-01-01

    Attaining seamless interoperability among heterogeneous communication systems and technologies remains a great challenge in today's networked world. Real-time information exchange among heterogeneous and geographically distributed systems is required to support the execution of complex e-business

  18. Definition and implementation of a SAML-XACML profile for authorization interoperability across grid middleware in OSG and EGEE

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele; Alderman, Ian; Altunay, Mine; Anathakrishnan, Rachana; Bester, Joe; Chadwick, Keith; Ciaschini, Vincenzo; Demchenko, Yuri; Ferraro, Andrea; Forti, Alberto; Groep, David; /Fermilab /NIKHEF, Amsterdam /Brookhaven /Amsterdam U. /SWITCH, Zurich /Bergen U. /INFN, CNAF /Argonne /Wisconsin U., Madison

    2009-04-01

    In order to ensure interoperability between middleware and authorization infrastructures used in the Open Science Grid (OSG) and the Enabling Grids for E-sciencE (EGEE) projects, an Authorization Interoperability activity was initiated in 2006. The interoperability goal was met in two phases: first, agreeing on a common authorization query interface and protocol with an associated profile that ensures standardized use of attributes and obligations; and second, implementing, testing, and deploying, on OSG and EGEE, middleware that supports the interoperability protocol and profile. The activity has involved people from OSG, EGEE, the Globus Toolkit project, and the Condor project. This paper presents a summary of the agreed-upon protocol, profile and the software components involved.
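    The agreed-upon query interface can be pictured as a request carrying subject attributes and a decision carrying Permit/Deny plus obligations (such as the local account to map the user to). The sketch below is purely illustrative; the attribute names and policy layout are invented and do not reproduce the actual SAML-XACML profile schema:

```python
# Hypothetical policy table: VO/role attributes map to a decision plus obligations.
POLICIES = [
    {"vo": "cms", "role": "production", "action": "submit-job",
     "decision": "Permit", "obligations": {"map-to-account": "cmsprod"}},
]

def authorize(subject, action):
    """Return (decision, obligations) for a subject's attribute set and action."""
    for p in POLICIES:
        if (p["vo"] == subject.get("vo")
                and p["role"] == subject.get("role")
                and p["action"] == action):
            return p["decision"], p["obligations"]
    return "Deny", {}   # default-deny, no obligations

decision, obligations = authorize({"vo": "cms", "role": "production"}, "submit-job")
```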

  19. Introduction to Matrix Algebra, Student's Text, Unit 23.

    Science.gov (United States)

    Allen, Frank B.; And Others

    Unit 23 in the SMSG secondary school mathematics series is a student text covering the following topics in matrix algebra: matrix operations, the algebra of 2 X 2 matrices, matrices and linear systems, representation of column matrices as geometric vectors, and transformations of the plane. Listed in the appendix are four research exercises in…
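    The unit's topics, matrix operations and matrices applied to linear systems, come down to calculations like the following generic 2 × 2 example (values chosen for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# The algebra of 2 x 2 matrices: multiplication is generally non-commutative
assert not np.allclose(A @ B, B @ A)

# Matrices and linear systems: solve A x = b
b = np.array([5.0, 10.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```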

  20. Common business objects: Demonstrating interoperability in the oil and gas industry

    International Nuclear Information System (INIS)

    McLellan, S.G.; Abusalbi, N.; Brown, J.; Quinlivan, W.F.

    1997-01-01

    The Petrotechnical Open Software Corporation (POSC) was organized in 1990 to define technical methods that make it easier to design interoperable data solutions for oil and gas companies. When POSC rolls out seed implementations, oilfield service members must validate them, correct any errors or ambiguities, and champion these corrections into the original specifications before full integration into POSC-compliant commercial products. Organizations like POSC are assuming a new role of promoting the formation of projects in which E and P companies and vendors jointly test their pieces of the migration puzzle on small subsets of the whole problem. The authors describe three such joint projects. While confirming the value of open cross-company cooperation, these cases also help to redefine interoperability in terms of business objects that will be common across oilfield companies, their applications, access software, data, and data stores

  1. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, like the dual-model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual-model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify the approach by creating a testing data server that supports both FHIR resources and archetypes.
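    For context, a minimal sketch of what a FHIR resource instance looks like on the wire as JSON. The field names follow the FHIR Observation resource; the identifiers and values here are invented for illustration:

```python
import json

# Minimal, illustrative FHIR Observation instance (values invented;
# element names follow the FHIR Observation resource).
observation = {
    "resourceType": "Observation",
    "id": "example-bp",
    "status": "final",
    "code": {
        "coding": [{"system": "http://loinc.org", "code": "8480-6",
                    "display": "Systolic blood pressure"}]
    },
    "subject": {"reference": "Patient/example"},
    "valueQuantity": {"value": 120, "unit": "mmHg",
                      "system": "http://unitsofmeasure.org", "code": "mm[Hg]"},
}

# Serialize as a FHIR JSON payload
payload = json.dumps(observation)
```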

  2. Challenges and Approaches to Make Multidisciplinary Team Meetings Interoperable - The KIMBo Project.

    Science.gov (United States)

    Krauss, Oliver; Holzer, Karl; Schuler, Andreas; Egelkraut, Reinhard; Franz, Barbara

    2017-01-01

    Multidisciplinary team meetings (MDTMs) are already in use in certain areas of healthcare (e.g. cancer treatment). Due to the lack of common standards and accessibility in the applied IT systems, their potential is not yet fully exploited. Common requirements for MDTMs shall be identified and aggregated into a process definition to be automated by an application architecture utilizing modern standards in electronic healthcare, e.g. HL7 FHIR. To identify requirements, an extensive literature review as well as semi-structured expert interviews were conducted. Results showed that interoperability and flexibility in terms of the process are the key requirements to be addressed. An architecture blueprint as well as an aggregated process definition were derived from the insights gained. To evaluate the feasibility of the identified requirements, methods of explorative prototyping in software engineering were used. MDTMs will become an important part of modern and future healthcare, but the need for standardization in terms of interoperability is imminent.

  3. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  4. Towards Interoperable IoT Deployments inSmart Cities - How project VITAL enables smart, secure and cost- effective cities

    OpenAIRE

    Schiele , Gregor; Soldatos , John; Mitton , Nathalie

    2014-01-01

    IoT-based deployments in smart cities raise several challenges, especially in terms of interoperability. In this paper, we illustrate semantic interoperability solutions for IoT systems. Based on these solutions, we describe how the FP7 VITAL project aims to bridge numerous silo IoT deployments in smart cities through repurposing and reusing sensors and data streams across multiple applications without carelessly compromising citizens’ security and privacy. This approa...

  5. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability

    OpenAIRE

    Komatsoulis, George A.; Warzel, Denise B.; Hartel, Frank W.; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; de Coronado, Sherri; Reeves, Dianne M.; Hadfield, Jillaine B.; Ludet, Christophe; Covitz, Peter A.

    2007-01-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastr...

  6. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Science.gov (United States)

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  7. Electronic Health Records: VAs Efforts Raise Concerns about Interoperability Goals and Measures, Duplication with DOD, and Future Plans

    Science.gov (United States)

    2016-07-13

    Testimony before the … Agencies, Committee on Appropriations, U.S. Senate, July 13, 2016. Discusses VA initiatives with the Department of Defense (DOD) that were intended to advance the ability of the two departments to share electronic health records.

  8. Fundamental Data Standards for Science Data System Interoperability and Data Correlation

    Science.gov (United States)

    Hughes, J. Steven; Gopala Krishna, Barla; Rye, Elizabeth; Crichton, Daniel

    The advent of the Web and languages such as XML have brought an explosion of online science data repositories and the promise of correlated data and interoperable systems. However, there have been relatively few successes in meeting the expectations of science users in the internet age. For example, a Google-like search for images of Mars will return many highly derived and appropriately tagged images but largely ignore the majority of images in most online image repositories. Once data are retrieved, users are further frustrated by poor data descriptions, arcane formats, and badly organized ancillary information. A wealth of research indicates that shared information models are needed to enable system interoperability and data correlation. However, at a more fundamental level, data correlation and system interoperability are dependent on a relatively few shared data standards. A common data dictionary standard, for example, allows the controlled vocabulary used in a science repository to be shared with potential collaborators. Common data registry and product identification standards enable systems to efficiently find, locate, and retrieve data products and their metadata from remote repositories. Information content standards define categories of descriptive data that help make the data products scientifically useful to users who were not part of the original team that produced the data. The Planetary Data System (PDS) has a plan to move the PDS to a fully online, federated system. This plan addresses new demands on the system, including increasing data volume, numbers of missions, and complexity of missions. A key component of this plan is the upgrade of the PDS data standards. The adoption of the core PDS data standards by the International Planetary Data Alliance (IPDA) adds the element of international cooperation to the plan. This presentation will provide an overview of the fundamental data standards being adopted by the PDS that transcend science domains and that

  9. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2016-10-01

    death,” The Boston Globe, April 3, 2010. 27. Arney D, Pajic M, Goldman JM, Lee I, Mangharam R, Sokolsky O, “Toward Patient Safety in Closed-Loop Medical ...becoming increasingly clear. We have been providing medical device interoperability domain expertise to assist the Veterans Administration in a ...15. Wallroth C, Goldman J, Manigel J, Osborn D, Roellike T, Weininger S, Westenskow D, “Development of a Standard for Physiologic Closed Loop

  10. Profiling Fast Healthcare Interoperability Resources (FHIR) of Family Health History based on the Clinical Element Models

    OpenAIRE

    Lee, Jaehoon; Hulse, Nathan C.; Wood, Grant M.; Oniki, Thomas A.; Huff, Stanley M.

    2017-01-01

    In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full-pedigree-based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM ...

  11. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques.

    Directory of Open Access Journals (Sweden)

    Emi Kamimura

    The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique and a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience (3 or 16 years) using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double-mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method; PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, evaluated as the average discrepancy between corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed a significantly smaller average inter-operator discrepancy for the digital impression technique (0.014 ± 0.02 mm) than for the conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.
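    The best-fit superimposition step, a least-squares rigid alignment of two 3D datasets, can be sketched with the Kabsch algorithm. This is a generic implementation on synthetic points, not the PolyWorks code used in the study:

```python
import numpy as np

def best_fit(P, Q):
    """Least-squares rigid alignment (Kabsch): find R, t minimizing ||(P R^T + t) - Q||."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: rotate/translate a point cloud, then recover the transform
rng = np.random.default_rng(1)
P = rng.standard_normal((50, 3))
theta = np.pi / 6
R0 = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ R0.T + np.array([1.0, 2.0, 3.0])

R, t = best_fit(P, Q)
rmsd = np.sqrt(((P @ R.T + t - Q) ** 2).sum(axis=1).mean())  # ~0 for noise-free data
```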

  12. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2014-10-01

    Kaiser Permanente, Johns Hopkins Medicine, the VA, FDA, NIST, TATRC, computer and information science groups at the University of Pennsylvania ...for sharing the findings from our TATRC work and from our NIH Quantum work relative to the gaps in existing standards and recommendations on how they ...challenges facing the industry in areas such as interoperability, cybersecurity, data stewardship, and system reliability, and policies needed to accelerate

  13. Bringing Health and Fitness Data Together for Connected Health Care: Mobile Apps as Enablers of Interoperability

    OpenAIRE

    Gay, Valerie; Leijdekkers, Peter

    2015-01-01

    Background A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health ou...

  14. Autonomous Underwater Vehicle Data Management and Metadata Interoperability for Coastal Ocean Studies

    Science.gov (United States)

    McCann, M. P.; Ryan, J. P.; Chavez, F. P.; Rienecker, E.

    2004-12-01

    Data from over 1000 km of Autonomous Underwater Vehicle (AUV) surveys of Monterey Bay have been collected and cataloged in an ocean observatory data management system. The Monterey Bay Aquarium Research Institute's AUV is equipped with a suite of instruments that includes a conductivity, temperature, depth (CTD) instrument, transmissometers, a fluorometer, a nitrate sensor, and an inertial navigation system. Data are logged on the vehicle, and upon completion of a survey XML descriptions of the data are submitted to the Shore Side Data System (SSDS). Instrument data are then processed on shore to apply calibrations and produce scientifically useful data products. SSDS employs a data model that tracks data from the instrument that created it through all the consuming processes that generate derived products. SSDS employs OPeNDAP and netCDF to provide data set interoperability at the data level. The core of SSDS is the metadata catalog of these data sets and their relation to all other relevant data. The metadata is managed in a relational database and governed by an Enterprise JavaBeans (EJB) server application. Cross-platform Java applications have been written to manage and visualize these data. A Java Swing application, the Hierarchical Ocean Observatory Visualization and Editing System (HOOVES), has been developed to provide visualization of data set pedigree and data set variables. Because the SSDS data model is generalized according to "Data Producers" and "Data Containers", many different types of data can be represented in SSDS, allowing for interoperability at a metadata level. Comparisons of appropriate data sets, whether from an autonomous underwater vehicle or from a fixed mooring, are easily made using SSDS. The authors will present the SSDS data model and show examples of how the model helps organize data set metadata, allowing for data discovery and interoperability. With improved discovery and interoperability the system is helping us
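    The "Data Producer"/"Data Container" generalization with a pedigree walk can be sketched with plain dataclasses. Class names, field names, and the example instrument chain below are invented for illustration and do not reproduce the actual SSDS schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataContainer:
    """A file or data set in the catalog; knows which producer created it."""
    name: str
    producer: "DataProducer | None" = None

@dataclass
class DataProducer:
    """An instrument or processing step; consumes upstream containers."""
    name: str
    inputs: list = field(default_factory=list)

def pedigree(container):
    """Walk the metadata graph back through producers (first-input lineage)."""
    chain = []
    while container is not None and container.producer is not None:
        chain.append(container.producer.name)
        ins = container.producer.inputs
        container = ins[0] if ins else None
    return chain

# Example lineage: instrument -> raw log -> shore-side calibration -> product
ctd = DataProducer("AUV CTD instrument")
raw = DataContainer("raw CTD log", ctd)
calib = DataProducer("shore-side calibration", inputs=[raw])
product = DataContainer("calibrated netCDF product", calib)
```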

  15. Building a Global Earth Observation System of Systems (GEOSS) and Its Interoperability Challenges

    Science.gov (United States)

    Ryan, B. J.

    2015-12-01

    Launched in 2005 by industrialized nations, the Group on Earth Observations (GEO) began building the Global Earth Observation System of Systems (GEOSS). Consisting of both a policy framework and an information infrastructure, GEOSS was intended to link and/or integrate the multitude of Earth observation systems, primarily operated by its Member Countries and Participating Organizations, so that users could more readily benefit from global information assets in addressing a number of society's key environmental issues. It was recognized that ready access to observations from multiple systems is a prerequisite for both environmental decision-making and economic development. From the very start, it was also recognized that the sheer complexity of the Earth system cannot be captured by any single observation system, and that a federated, interoperable approach is necessary. While this international effort has met with much success, primarily in advancing broad, open data policies and practices, challenges remain. In 2014 (Geneva, Switzerland) and 2015 (Mexico City, Mexico), Ministers from GEO's Member Countries, including the European Commission, came together to assess progress made during the first decade (2005 to 2015) and to approve implementation strategies and mechanisms for the second decade (2016 to 2025), respectively. The approved implementation strategies and mechanisms are intended to advance GEOSS development, thereby facilitating the increased uptake of Earth observations for informed decision-making. Clearly there are interoperability challenges that are technological in nature, and several will be discussed in this presentation. There are, however, interoperability challenges that can be better characterized as economic, governmental and/or political in nature, and these will be discussed as well. With the emergence of the Sustainable Development Goals (SDGs), the World Conference on Disaster Risk Reduction (WCDRR), and the United Nations

  16. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  17. Data and Mined-Knowledge Interoperability in eHealth Systems

    OpenAIRE

    Sartipi, Kamran; Najafi, Mehran; Kazemzadeh, Reza S.

    2008-01-01

    Current healthcare infrastructures in advanced societies cannot fulfil the demand for quality public health services, which are characterized by patient-centricity, seamless interoperation of heterogeneous healthcare systems, and nation-wide electronic health record services. Consequently, governments and healthcare institutions are embracing new information and communication technologies to provide the necessary infrastructures for healthcare and medical services. In this chapter, we a...

  18. Beyond Sister City Agreements: Exploring the Challenges of Full International Interoperability

    Science.gov (United States)

    2016-03-01

    Mexican-American Attitudes toward Mexico,” International Migration Review 32, no. 2 (Summer 1998). 36 de la Garza and DeSipio, “Interests Not Passions ...Daily Times, October 7, 2014, 2. 44 Diana L. Haytko, John L. Kent, and Angela Hausman, “Mexican Maquiladoras: Helping or Hurting the US/Mexico ...standing of a local player in a regional game. These interoperable agreements make sense from a political 175 Lawrence Freedman, Strategy-A History (New

  19. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    OpenAIRE

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-01-01

    Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperabi...

  20. Semantic Interoperability in Czech Healthcare Environment Supported by HL7 Version 3

    Czech Academy of Sciences Publication Activity Database

    Nagy, Miroslav; Hanzlíček, Petr; Přečková, Petra; Říha, Antonín; Dioszegi, Matěj; Seidl, Libor; Zvárová, Jana

    2010-01-01

    Roč. 49, č. 2 (2010), s. 186-195 ISSN 0026-1270 R&D Projects: GA MŠk(CZ) 1M06014; GA AV ČR 1ET200300413 Institutional research plan: CEZ:AV0Z10300504 Keywords : information storage and retrieval * electronic health record * HL7 * semantic interoperability * communication standards Subject RIV: IN - Informatics, Computer Science Impact factor: 1.472, year: 2010

  1. Model for peace support operations: an overview of the ICT and interoperability requirements

    CSIR Research Space (South Africa)

    Leenen, L

    2009-03-01

    Full Text Available requires a reciprocal interdependence among these various elements, and this necessitates complex coordination and a great demand for ongoing and accurate communication (Chisholm 1986). Higher technological complexity requires higher levels... interoperability requirements thereof. Such methods, when fully developed, give the military planner the ability to rapidly assess the requirements as circumstances change. From interviews with SANDF staff (Ross 2007), we gathered that the SANDF planning...

  2. The development of a prototype level-three interoperable catalog system

    Science.gov (United States)

    Hood, Carroll A.; Howie, Randy; Verhanovitz, Rich

    1993-08-01

    The development of a level-three interoperable catalog system is defined by a new paradigm for metadata access. The old paradigm is characterized by a hierarchy of metadata layers, the transfer of control to target systems, and the requirement for the user to be familiar with the syntax and data dictionaries of several catalog system elements. Attributes of the new paradigm are exactly orthogonal: the directory and inventories are peer entities, there is a single user interface, and the system manages the complexity of interacting transparently with remote elements. We have designed and implemented a prototype level-three interoperable catalog system based on the new paradigm. Through a single intelligent interface, users can interoperably access a master directory, inventories for selected satellite datasets, and an in situ meteorological dataset inventory. This paper describes the development of the prototype system and three of the formidable challenges that were addressed in the process. The first involved the interoperable integration of satellite and in situ inventories, which to our knowledge, has never been operationally demonstrated. The second was the development of a search strategy for orbital and suborbital granules which preserves the capability to identify temporally or spatially coincident subsets between them. The third involved establishing a method of incorporating inventory-specific search criteria into user queries. We are working closely with selected science data users to obtain feedback on the system's design and performance. The lessons learned from this prototype will help direct future development efforts. Distributed data systems of the 1990s such as EOSDIS and the Global Change Data and Information System (GCDIS) will be able to build on this prototype.

  3. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, as evaluated by the average discrepancies of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Results Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Statistical analysis confirmed this, revealing a significantly smaller average inter-operator discrepancy with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
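The best-fit superimposition used in this study can be illustrated with a least-squares rigid alignment (the Kabsch algorithm). This is a generic sketch with synthetic point sets standing in for the STL scan data; it is not the PolyWorks implementation:

```python
import numpy as np

def best_fit_transform(P, Q):
    """Least-squares (Kabsch) rigid alignment of point set P onto Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # 3x3 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Two "scans" of the same surface, differing by a known rotation and shift.
rng = np.random.default_rng(0)
P = rng.normal(size=(200, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])

R, t = best_fit_transform(P, Q)
aligned = P @ R.T + t
# Mean point-to-point discrepancy after superimposition (the study's metric)
mean_disc = np.linalg.norm(aligned - Q, axis=1).mean()
```

With noise-free synthetic data the recovered rotation matches the true one and the residual discrepancy is numerically zero; with real scans the residual is the inter-operator discrepancy reported above.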

  4. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    Science.gov (United States)

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  5. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these elements are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
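The metadata-driven validation idea can be sketched as a layered check of a record against a registry. The registry entries and field names below are invented for illustration; the actual CCR+ registry holds 188 interlinked metadata elements:

```python
# Hypothetical metadata registry: element name -> (required, expected type).
# In CCR+ these entries would come from the metadata registry, not be hard-coded.
METADATA = {
    "PatientID":   (True, str),
    "DateOfBirth": (True, str),
    "StepCount":   (False, int),   # illustrative extended (CCR+-style) element
}

def validate(record):
    """Layered validation: required elements present, then values typed correctly."""
    errors = []
    for name, (required, typ) in METADATA.items():
        if name not in record:
            if required:
                errors.append(f"missing required element: {name}")
        elif not isinstance(record[name], typ):
            errors.append(f"wrong type for {name}")
    return errors

ok = validate({"PatientID": "p1", "DateOfBirth": "1980-01-01", "StepCount": 5000})
bad = validate({"PatientID": "p1"})
```

A standard record passes the base layer untouched, while extended elements are checked only when present, which is what lets the extension remain compliant with the base standard.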

  6. Breaking barriers to interoperability: assigning spatially and temporally unique identifiers to spaces and buildings.

    Science.gov (United States)

    Pyke, Christopher R; Madan, Isaac

    2013-08-01

The real estate industry routinely uses specialized information systems for functions including design, construction, facilities management, brokerage, tax assessment, and utilities. These systems are mature and effective within vertically integrated market segments. However, new questions are reaching across these traditional information silos. For example, buyers may be interested in evaluating the design, energy efficiency characteristics, and operational performance of a commercial building. This requires the integration of information across multiple databases held by different institutions. Today, this type of data integration is difficult to automate and prone to errors due, in part, to the lack of generally accepted building and space identifiers. Moving forward, the real estate industry needs a new mechanism to assign identifiers to whole buildings and interior spaces for the purposes of interoperability, data exchange, and integration. This paper describes a systematic process for identifying activities occurring at a building or within interior spaces to provide a foundation for exchange and interoperability. We demonstrate the application of the approach with a prototype Web application. This concept and demonstration illustrate the elements of a practical interoperability framework that can increase productivity, create new business opportunities, and reduce errors, waste, and redundancy. © 2013 New York Academy of Sciences.
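One hypothetical way to mint identifiers that are unique in both space and time, in the spirit of this proposal, is to hash a coarse spatial key together with a local space path and the date the designation took effect. The scheme, names, and coordinates below are illustrative assumptions, not the paper's method:

```python
import hashlib
from datetime import date

def space_id(lat, lon, space_path, valid_from):
    """Hypothetical globally unique identifier for a space within a building.

    Combines a coarse spatial key (rounded coordinates), a local space path
    (e.g. 'floor-3/room-301'), and the date the designation became valid, so
    the same footprint re-used by a later building yields a new identifier.
    """
    spatial_key = f"{lat:.5f},{lon:.5f}"
    raw = f"{spatial_key}|{space_path}|{valid_from.isoformat()}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

a = space_id(38.89767, -77.03655, "floor-3/room-301", date(2013, 8, 1))
b = space_id(38.89767, -77.03655, "floor-3/room-302", date(2013, 8, 1))
```

Because the identifier is a deterministic function of its inputs, any two systems that agree on the inputs derive the same identifier without a central registry; the trade-off is that renaming a space changes its identifier.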

  7. Interoperability challenges in river discharge modelling: A cross domain application scenario

    Science.gov (United States)

    Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2018-06-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.

  8. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    Science.gov (United States)

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the ability to map complex datasets and enabling interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.

  9. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations' specific needs. As a result, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), and zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access to it. Early in the nineties, work on the interoperability of geographic information was undertaken to resolve syntactic, structural, and semantic heterogeneities, as well as spatial and temporal heterogeneities, in order to facilitate the sharing and integration of such data. Recently, we proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.

  10. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential to making use of immunization CDS systems. Service-oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time was less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical across systems. CDS-generated reports accorded with immunization guidelines, and the calculated next-visit times were accurate. Interoperability between IISs is rare or nonexistent. Since inter-state data exchange is rare in the United States as well, this approach could serve as a prototype for achieving interoperability of immunization information.
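The next-visit calculation that the evaluation checked can be sketched as a lookup against a dose schedule. The offsets below are invented for illustration and are not a real immunization guideline:

```python
from datetime import date, timedelta

# Hypothetical schedule: vaccine -> day offsets of each dose from birth.
# Illustrative numbers only; a real CDS would load these from guideline data.
SCHEDULE = {"HepB": [0, 28, 112]}

def next_visit(vaccine, birth_date, doses_given):
    """Earliest date for the next dose, or None if the series is complete."""
    offsets = SCHEDULE[vaccine]
    if doses_given >= len(offsets):
        return None
    return birth_date + timedelta(days=offsets[doses_given])

nv = next_visit("HepB", date(2014, 1, 1), 1)   # one dose already given
done = next_visit("HepB", date(2014, 1, 1), 3)  # series complete
```

In an interoperable setting the value of `doses_given` comes from the exchanged immunization history, which is why identical histories across IISs are a precondition for accurate forecasts.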

  11. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Laura González

    2016-08-01

Full Text Available Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have developed some sort of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an E-government Interoperability Platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. the Enterprise Service Bus) as well as recognized security standards (e.g. the eXtensible Access Control Markup Language) and were completely prototyped leveraging the SwitchYard ESB product.

  12. The e-MapScholar project—an example of interoperability in GIScience education

    Science.gov (United States)

    Purves, R. S.; Medyckyj-Scott, D. J.; Mackaness, W. A.

    2005-03-01

    The proliferation of the use of digital spatial data in learning and teaching provides a set of opportunities and challenges for the development of e-learning materials suitable for use by a broad spectrum of disciplines in Higher Education. Effective e-learning materials must both provide engaging materials with which the learner can interact and be relevant to the learners' disciplinary and background knowledge. Interoperability aims to allow sharing of data and materials through the use of common agreements and specifications. Shared learning materials can take advantage of interoperable components to provide customisable components, and must consider issues in sharing data across institutional borders. The e-MapScholar project delivers teaching materials related to spatial data, which are customisable with respect to both context and location. Issues in the provision of such interoperable materials are discussed, including suitable levels of granularity of materials, the provision of tools to facilitate customisation and mechanisms to deliver multiple data sets and the metadata issues related to such materials. The examples shown make extensive use of the OpenGIS consortium specifications in the delivery of spatial data.

  13. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix Functions and Characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  14. Neutrino mass matrix

    International Nuclear Information System (INIS)

    Strobel, E.L.

    1985-01-01

    Given the many conflicting experimental results, examination is made of the neutrino mass matrix in order to determine possible masses and mixings. It is assumed that the Dirac mass matrix for the electron, muon, and tau neutrinos is similar in form to those of the quarks and charged leptons, and that the smallness of the observed neutrino masses results from the Gell-Mann-Ramond-Slansky mechanism. Analysis of masses and mixings for the neutrinos is performed using general structures for the Majorana mass matrix. It is shown that if certain tentative experimental results concerning the neutrino masses and mixing angles are confirmed, significant limitations may be placed on the Majorana mass matrix. The most satisfactory simple assumption concerning the Majorana mass matrix is that it is approximately proportional to the Dirac mass matrix. A very recent experimental neutrino mass result and its implications are discussed. Some general properties of matrices with structure similar to the Dirac mass matrices are discussed
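The Gell-Mann-Ramond-Slansky (seesaw) suppression invoked above can be stated compactly: with $m_D$ the Dirac mass matrix and $M$ the heavy Majorana mass matrix, the light-neutrino mass matrix is approximately

```latex
m_\nu \simeq -\, m_D \, M^{-1} \, m_D^{\mathsf{T}},
\qquad\text{so}\qquad
m_\nu \sim \frac{m_D^2}{M} \ll m_D \quad \text{for } M \gg m_D ,
```

so heavy Majorana masses $M$ drive the light-neutrino masses far below the quark and charged-lepton scale set by $m_D$, which is the smallness mechanism the abstract assumes.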

  15. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...
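A standard tool for the tail-index question discussed in this thesis is the Hill estimator. The sketch below uses simulated Pareto data as a stand-in for equity returns and is illustrative only:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha from the k largest order statistics."""
    s = np.sort(np.abs(x))[::-1]                  # descending order statistics
    logs = np.log(s[:k]) - np.log(s[k])           # log-spacings above the k-th
    return 1.0 / logs.mean()

rng = np.random.default_rng(1)
alpha = 3.0
# Classical Pareto(alpha) samples with x_m = 1 have tail index exactly alpha.
x = rng.pareto(alpha, size=100_000) + 1.0
est = hill_estimator(x, k=2000)
```

Comparing Hill estimates across the return series of many equities, with confidence bands of roughly alpha/sqrt(k), is one way to probe the shared-tail-index hypothesis the thesis formulates.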

  16. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

Full Text Available Objective: to show how standards-based approaches to content standardization, content management, and content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but are also of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221 ''Semantic Interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale'', adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements (based on advanced terminological principles) were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of multilinguality and multimodality on the basis of open standards makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content, as well as the methods, tools and services applied, can be subject to new kinds of certification schemes, which should also be based on standards.
Conclusions: Content must be totally reliable in some

  17. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    Science.gov (United States)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of

  18. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Science.gov (United States)

    Tomas, Robert; Lutz, Michael

    2013-04-01

The well-known heterogeneity and fragmentation of the data models, formats and controlled vocabularies of environmental data prevent potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation were identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonisation, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonised vocabulary can be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonised terms, utilizing either the extensibility options or additional terminological attributes. Encoding.
Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard

  19. SC-228 Inclusion of DAA Warning Alert for TCAS Interoperability

    Science.gov (United States)

    Fern, Lisa

    2016-01-01

    alerting system is to provide critical timing information to the pilot about the potential for a loss of well clear with another aircraft. This is done by employing both temporal and spatial thresholds that indicate to the pilot the likelihood and imminence of a loss of well clear. The design of the DAA alerting thresholds is a balancing act between eliciting the desired pilot response in real loss of well clear threat events and reducing excessive, unnecessary, and/or uncoordinated UAS maneuvering within the air traffic environment; larger thresholds, both spatially and temporally, may increase the likelihood of a pilot avoiding a loss of well clear, but it can also increase the frequency of maneuvering - especially in cases where a maneuver is not actually needed to maintain well clear. A series of human in the loop (HITL) simulations have been conducted as part of NASA's Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) project. The purpose of these HITLs has been to provide empirical results in order to inform development of the minimum human-machine interface requirements for the DAA system. This white paper will present those results which provide evidence of a human performance benefit (in terms of response times and ability to remain well clear of other aircraft) of the DAA warning alert both with and without a collision avoidance system on board the aircraft.

  20. vector bilinear autoregressive time series model and its superiority

    African Journals Online (AJOL)

    KEYWORDS: Linear time series, Autoregressive process, Autocorrelation function, Partial autocorrelation function,. Vector time .... important result on matrix algebra with respect to the spectral ..... application to covariance analysis of super-.

  1. Patience of matrix games

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Arnsfelt; Ibsen-Jensen, Rasmus; Podolskii, Vladimir V.

    2013-01-01

    For matrix games we study how small nonzero probability must be used in optimal strategies. We show that for image win–lose–draw games (i.e. image matrix games) nonzero probabilities smaller than image are never needed. We also construct an explicit image win–lose game such that the unique optimal...

  2. Matrix comparison, Part 2

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg; Borlund, Pia

    2007-01-01

    The present two-part article introduces matrix comparison as a formal means for evaluation purposes in informetric studies such as cocitation analysis. In the first part, the motivation behind introducing matrix comparison to informetric studies, as well as two important issues influencing such c...

  3. Unitarity of CKM Matrix

    CERN Document Server

    Saleem, M

    2002-01-01

    The unitarity of the CKM matrix is examined in the light of the latest available accurate data. The analysis shows that a conclusive result cannot be derived at present. Only more precise data can determine whether the CKM matrix opens new vistas beyond the standard model or not.

  4. Fuzzy risk matrix

    International Nuclear Information System (INIS)

    Markowski, Adam S.; Mannan, M. Sam

    2008-01-01

    A risk matrix is a mechanism to characterize and rank process risks that are typically identified through one or more multifunctional reviews (e.g., process hazard analysis, audits, or incident investigation). This paper describes a procedure for developing a fuzzy risk matrix that may be used for emerging fuzzy logic applications in different safety analyses (e.g., LOPA). The fuzzification of the frequency and severity of the consequences of the incident scenario, which are the basic inputs to the fuzzy risk matrix, is described. Subsequently, using different risk matrix designs, fuzzy rules are established, enabling the development of fuzzy risk matrices. Three types of fuzzy risk matrix have been developed (low-cost, standard, and high-cost), and using a distillation column case study, the effect of the design on the final defuzzified risk index is demonstrated.

  5. Fuzzy vulnerability matrix

    International Nuclear Information System (INIS)

    Baron, Jorge H.; Rivera, S.S.

    2000-01-01

    The so-called vulnerability matrix is used in the evaluation part of the probabilistic safety assessment for a nuclear power plant, during the containment event tree calculations. This matrix is established from what is known as Numerical Categories for Engineering Judgement, and is usually populated with numerical values obtained with traditional arithmetic based on set theory. Representing this matrix with fuzzy numbers is much more adequate, because the Numerical Categories for Engineering Judgement are better expressed as linguistic variables, such as 'highly probable', 'probable', 'impossible', etc. In the present paper a methodology to obtain a Fuzzy Vulnerability Matrix is presented, starting from the recommendations on the Numerical Categories for Engineering Judgement. (author)

  6. The nuclear reaction matrix

    International Nuclear Information System (INIS)

    Krenciglowa, E.M.; Kung, C.L.; Kuo, T.T.S.; Osnes, E. (Department of Physics, State University of New York at Stony Brook, Stony Brook, New York 11794)

    1976-01-01

    Different definitions of the reaction matrix G appropriate to the calculation of nuclear structure are reviewed and discussed. Qualitative physical arguments are presented in support of a two-step calculation of the G-matrix for finite nuclei. In the first step the high-energy excitations are included using orthogonalized plane-wave intermediate states, and in the second step the low-energy excitations are added in, using harmonic oscillator intermediate states. Accurate calculations of G-matrix elements for nuclear structure calculations in the A ≈ 18 region are performed following this procedure and treating the Pauli exclusion operator Q_2p by the method of Tsai and Kuo. The treatment of Q_2p, the effect of the intermediate-state spectrum and the energy dependence of the reaction matrix are investigated in detail. The present matrix elements are compared with various matrix elements given in the literature. In particular, close agreement is obtained with the matrix elements calculated by Kuo and Brown using approximate methods.

  7. Representation of the Coulomb Matrix Elements by Means of the Appell Hypergeometric Function F₂

    Science.gov (United States)

    Bentalha, Zine el abidine

    2018-06-01

    An exact analytical representation of the Coulomb matrix elements by means of Appell's double series F₂ is derived. The finite sum obtained for the Appell function F₂ allows us to evaluate explicitly the matrix elements of the two-body Coulomb interaction in the lowest Landau level. An application requiring the matrix elements of the Coulomb potential in the quantum Hall effect regime is presented.
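For context, Appell's second hypergeometric function F₂, in its standard textbook definition (not specific to this paper's derivation), is the double series

```latex
F_2(\alpha;\beta,\beta';\gamma,\gamma';x,y)
  = \sum_{m=0}^{\infty}\sum_{n=0}^{\infty}
    \frac{(\alpha)_{m+n}\,(\beta)_m\,(\beta')_n}{(\gamma)_m\,(\gamma')_n\,m!\,n!}\,x^m y^n,
\qquad |x|+|y|<1,
```

where $(a)_k = a(a+1)\cdots(a+k-1)$ denotes the Pochhammer symbol. For particular parameter values the series terminates, which is what makes the finite-sum evaluation of the matrix elements possible.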

  8. Nato Multinational Brigade Interoperability: Issues, Mitigating Solutions and is it Time for a Nato Multinational Brigade Doctrine?

    Directory of Open Access Journals (Sweden)

    Schiller Mark

    2016-06-01

    Full Text Available Multinational Brigade Operations involving NATO and its European Partners are the norm in the post-Cold War Era. Commonplace today are Multinational Brigades, composed of staffs and subordinate units representing almost every NATO Country and Partner, participating in training exercises or actual operations in both the European and Southwest Asian Theatres. Leadership challenges are prevalent for the Multinational Brigade Commander and his staff, especially those challenges they face in achieving an effective level of brigade interoperability in order to conduct successful operations in NATO’s present and future operating environments. The purpose of this paper is twofold: to examine the major interoperability obstacles a multinational brigade commander and his staff are likely to encounter during the planning and execution of brigade operations; and to recommend actions and measures a multinational brigade commander and his staff can implement to facilitate interoperability in a multinational brigade operating environment. Several key interoperability topics considered integral to effective multinational brigade operations are examined and analysed, including understanding partner unit capabilities and limitations facilitated by an integration plan, appropriate command and support relationships, compatible communications, synchronized intelligence and information collection, establishing effective liaison, and fratricide prevention. The paper concludes by urging the development of a NATO land brigade doctrine, given doctrine’s critical importance to effective brigade command and control interoperability and the missions a land brigade can expect in future NATO operating environments as part of the NATO Very High Readiness Joint Task Force (VJTF).

  9. Launching an EarthCube Interoperability Workbench for Constructing Workflows and Employing Service Interfaces

    Science.gov (United States)

    Fulker, D. W.; Pearlman, F.; Pearlman, J.; Arctur, D. K.; Signell, R. P.

    2016-12-01

    A major challenge for geoscientists, and a key motivation for the National Science Foundation's EarthCube initiative, is to integrate data across disciplines, as is necessary for complex Earth-system studies such as climate change. The attendant technical and social complexities have led EarthCube participants to devise a system-of-systems architectural concept. Its centerpiece is a (virtual) interoperability workbench, around which a learning community can coalesce, supported in their evolving quests to join data from diverse sources, to synthesize new forms of data depicting Earth phenomena, and to overcome immense obstacles that arise, for example, from mismatched nomenclatures, projections, mesh geometries and spatial-temporal scales. The full architectural concept will require significant time and resources to implement, but this presentation describes a (minimal) starter kit. With a keep-it-simple mantra this workbench starter kit can fulfill the following four objectives: 1) demonstrate the feasibility of an interoperability workbench by mid-2017; 2) showcase scientifically useful examples of cross-domain interoperability, drawn, e.g., from funded EarthCube projects; 3) highlight selected aspects of EarthCube's architectural concept, such as a system of systems (SoS) linked via service interfaces; 4) demonstrate how workflows can be designed and used in a manner that enables sharing, promotes collaboration and fosters learning. The outcome, despite its simplicity, will embody service interfaces sufficient to construct, from extant components, data-integration and data-synthesis workflows involving multiple geoscience domains. Tentatively, the starter kit will build on the Jupyter Notebook web application, augmented with libraries for interfacing current services (at data centers involved in EarthCube's Council of Data Facilities, e.g.) and services developed specifically for EarthCube and spanning most geoscience domains.

  10. INTEROPERABILITY AND STANDARDISATION IN THE DEPARTMENT OF DEFENCE: AN EXPLORATORY STUDY

    Directory of Open Access Journals (Sweden)

    J. De Waal

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The political changes in South Africa have extended its international obligations by actively involving it in the social wellbeing of troubled African states. Under the auspices of the United Nations, this role is manifested in peacekeeping operations and other standard international practices. The ability of African allied forces to train, exercise, and operate efficiently, effectively, and economically together depends on the interoperability of their operational procedures, doctrine, administration, materiel and technology. This implies that all parties must have the same interpretation of ‘interoperability’. In this study, a conceptual model that explains interoperability and standardisation in terms of a systems hierarchy and the systems engineering process is developed. The study also explores the level of understanding of interoperability in the South African Department of Defence in terms of the levels of standardisation and its relationship to the concepts of systems, systems hierarchy, and systems engineering.

    AFRIKAANSE OPSOMMING: The political changes in South Africa have resulted in further international obligations being imposed on the country. South Africa, in cooperation with fellow African countries and under the supervision of the United Nations, must become involved in unstable African countries through peacekeeping operations. The ability to participate jointly in peace training, peace exercises and peace operations in an effective, efficient and economical manner requires compatibility between the parties' operational procedures, doctrine, administration, materiel and technology. This means that all parties must agree on the concept of 'interoperability'. In this study a conceptual model explaining interoperability and standardisation in terms of the systems hierarchy and the systems engineering process was developed. This study also considered the level of understanding and

  11. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    Science.gov (United States)

    Krisnadhi, A. A.

    2016-12-01

    Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms has led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, causing many data providers to avoid them. This problem is especially acute in a domain as diverse as the geosciences, where it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of 'forest' used throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among other things, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in this architecture is achieved not by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture.
Building the solution upon semantic technologies such as Linked Data and the Web Ontology

  12. Interoperability in healthcare: major challenges in the creation of the enterprise environment

    Science.gov (United States)

    Lindsköld, L.; Wintell, M.; Lundberg, N.

    2009-02-01

    There is today a lack of interoperability in healthcare although the need for it is obvious. A new healthcare enterprise environment has been deployed for secure healthcare interoperability in the Western Region of Sweden (WRS). This paper is an empirical overview of the new enterprise environment supporting regionally shared and transparent radiology domain information in the WRS. The enterprise environment comprises 17 radiology departments serving 1.5 million inhabitants, using different RIS and PACS in a joint work-oriented network, plus additional cardiology, dentistry and clinical physiology departments. More than 160 terabytes of information are stored in the enterprise repository. Interoperability is developed according to the IHE mission, i.e. applying standards such as Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) to address specific clinical communication needs and support optimal patient care. The entire enterprise environment is implemented and used daily in the WRS. The central prerequisites in the development of the enterprise environment in the western region of Sweden were: 1) information harmonization, 2) reuse of standardized messages, e.g. HL7 v2.x and v3.x, 3) development of a holistic information domain including both text and images, and 4) creation of a continuous and dynamic update functionality. The central challenges in this project were: 1) the many different vendors acting in the region and the negotiations with them to apply communication roles/profiles such as HL7 (CDA, CCR), DICOM, and XML, 2) the question of who owns the data, and 3) incomplete technical standards. This study concludes that to create a workflow that runs within an enterprise environment, a number of central prerequisites need to be in place and a number of challenges addressed. This calls for negotiations on an international, national and regional level with standardization organizations, vendors, health management and health personnel.
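The HL7 v2.x messages reused in environments like the one above are pipe-delimited text: segments are separated by carriage returns, and the field separator is declared in the MSH header. A minimal sketch of that structure follows; the sample message content is invented for illustration, not taken from the WRS deployment.

```python
# Minimal sketch of splitting an HL7 v2.x message into segments and
# fields. Real messages carry many more segments, components (^) and
# escape sequences than this toy parser handles.

def parse_hl7(message: str) -> dict:
    """Map segment name -> list of field lists.

    Segments are separated by carriage returns; the field separator is
    the character immediately after "MSH" (MSH-1, conventionally "|").
    """
    segments = [s for s in message.split("\r") if s]
    field_sep = segments[0][3]  # character right after "MSH"
    parsed = {}
    for seg in segments:
        fields = seg.split(field_sep)
        parsed.setdefault(fields[0], []).append(fields[1:])
    return parsed

# Invented two-segment sample: a message header and a patient segment.
sample = "MSH|^~\\&|RIS|WRS|PACS|WRS|200902011230||ORM^O01|123|P|2.3\rPID|1||4711||Doe^Jane"
msg = parse_hl7(sample)
print(msg["PID"][0][4])  # patient name field -> Doe^Jane
```

This illustrates why vendor negotiations matter: the framing is trivial, but the meaning of each field position is what the communication profiles (HL7 CDA/CCR, DICOM) have to pin down.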

  13. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision-making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision-making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved in any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to solve the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and uptake.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency

  14. Matrix Metalloproteinase Enzyme Family

    Directory of Open Access Journals (Sweden)

    Ozlem Goruroglu Ozturk

    2013-04-01

    Full Text Available Matrix metalloproteinases play an important role in many biological processes such as embryogenesis, tissue remodeling, wound healing, and angiogenesis, and in some pathological conditions such as atherosclerosis, arthritis and cancer. Currently, 24 genes have been identified in humans that encode different groups of matrix metalloproteinase enzymes. This review discusses the members of the matrix metalloproteinase family and their substrate specificity, structure, function and the regulation of their enzyme activity by tissue inhibitors. [Archives Medical Review Journal 2013; 22(2): 209-220]

  15. Matrix groups for undergraduates

    CERN Document Server

    Tapp, Kristopher

    2005-01-01

    Matrix groups touch an enormous spectrum of the mathematical arena. This textbook brings them into the undergraduate curriculum. It makes an excellent one-semester course for students familiar with linear and abstract algebra and prepares them for a graduate course on Lie groups. Matrix Groups for Undergraduates is concrete and example-driven, with geometric motivation and rigorous proofs. The story begins and ends with the rotations of a globe. In between, the author combines rigor and intuition to describe basic objects of Lie theory: Lie algebras, matrix exponentiation, Lie brackets, and maximal tori.

  16. Elementary matrix theory

    CERN Document Server

    Eves, Howard

    1980-01-01

    The usefulness of matrix theory as a tool in disciplines ranging from quantum mechanics to psychometrics is widely recognized, and courses in matrix theory are increasingly a standard part of the undergraduate curriculum.This outstanding text offers an unusual introduction to matrix theory at the undergraduate level. Unlike most texts dealing with the topic, which tend to remain on an abstract level, Dr. Eves' book employs a concrete elementary approach, avoiding abstraction until the final chapter. This practical method renders the text especially accessible to students of physics, engineeri

  17. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Access and Interoperability

    Science.gov (United States)

    Fan, D.; He, B.; Xiao, J.; Li, S.; Li, C.; Cui, C.; Yu, C.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Mi, L.; Wan, W.; Wang, J.

    2015-09-01

    The data access and interoperability module connects observation proposals, data, virtual machines and software. Using the unique identifier of the PI (principal investigator), an email address or an internal ID, data can be collected from the PI's proposals or through the search interfaces, e.g. cone search. Files associated with the search results can easily be transferred to cloud storage, including the storage attached to virtual machines or commercial platforms such as Dropbox. Benefiting from the standards of the IVOA (International Virtual Observatory Alliance), VOTable-formatted search results can be sent to various kinds of VO software. A later endeavor will try to integrate more data and connect archives and other astronomical resources.
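The cone search interface mentioned above is standardized by the IVOA Simple Cone Search protocol: an HTTP GET carrying RA, DEC and SR parameters in decimal degrees, answered with a VOTable. A minimal sketch of building such a request URL; the base URL is a placeholder, not a real AstroCloud endpoint.

```python
# Sketch: composing an IVOA Simple Cone Search request URL. The service
# answers with a VOTable document; the endpoint below is hypothetical.
from urllib.parse import urlencode

def conesearch_url(base: str, ra: float, dec: float, sr: float) -> str:
    """Return a Simple Cone Search query URL for a cone of radius
    `sr` degrees centred on (`ra`, `dec`) in decimal degrees."""
    return base + "?" + urlencode({"RA": ra, "DEC": dec, "SR": sr})

url = conesearch_url("http://example.org/scs", 180.0, 2.5, 0.1)
print(url)  # http://example.org/scs?RA=180.0&DEC=2.5&SR=0.1
```

Because the parameter names and response format are fixed by the IVOA standard, any VO client can consume the same endpoint, which is what lets AstroCloud hand results off to "kinds of VO software".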

  18. The 'PEARL' Data Warehouse: Initial Challenges Faced with Semantic and Syntactic Interoperability.

    Science.gov (United States)

    Mahmoud, Samhar; Boyd, Andy; Curcin, Vasa; Bache, Richard; Ali, Asad; Miles, Simon; Taweel, Adel; Delaney, Brendan; Macleod, John

    2017-01-01

    Data about patients are available from diverse sources, including those routinely collected as individuals interact with service providers, and those provided directly by individuals through surveys. Linking these data can lead to a more complete picture about the individual, to inform either care decision making or research investigations. However, post-linkage, differences in data recording systems and formats present barriers to achieving these aims. This paper describes an approach to combine linked GP records with study observations, and reports initial challenges related to semantic and syntactic interoperability issues.

  19. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    Energy Technology Data Exchange (ETDEWEB)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.
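The report evaluates dedicated wrapping tools; for contrast, Python's standard-library ctypes illustrates the lowest-level mechanism for calling compiled code. The sketch below calls the system C math library on a POSIX system; wrapping a C++ library such as Sapphire's would additionally require extern "C" shims or one of the evaluated wrapper generators.

```python
# Sketch: calling into a compiled C library from Python via ctypes.
# This only shows the raw mechanism on the system math library; C++
# symbols (as in Sapphire) are name-mangled and need extern "C" shims
# or a generator such as SWIG to expose them like this.
import ctypes
import ctypes.util

# Locate and load the C math library (POSIX; path resolution varies).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes converts arguments correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

The appeal of higher-level tools is precisely that they generate these signature declarations (and class/template bindings) instead of requiring them to be written by hand for every function.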

  20. Bringing Health and Fitness Data Together for Connected Health Care: Mobile Apps as Enablers of Interoperability.

    Science.gov (United States)

    Gay, Valerie; Leijdekkers, Peter

    2015-11-18

    A transformation is underway regarding how we deal with our health. Mobile devices make it possible to have continuous access to personal health information. Wearable devices, such as Fitbit and Apple's smartwatch, can collect data continuously and provide insights into our health and fitness. However, lack of interoperability and the presence of data silos prevent users and health professionals from getting an integrated view of health and fitness data. To provide better health outcomes, a complete picture is needed which combines informal health and fitness data collected by the user together with official health records collected by health professionals. Mobile apps are well positioned to play an important role in the aggregation since they can tap into these official and informal health and data silos. The objective of this paper is to demonstrate that a mobile app can be used to aggregate health and fitness data and can enable interoperability. It discusses various technical interoperability challenges encountered while integrating data into one place. For 8 years, we have worked with third-party partners, including wearable device manufacturers, electronic health record providers, and app developers, to connect an Android app to their (wearable) devices, back-end servers, and systems. The result of this research is a health and fitness app called myFitnessCompanion, which enables users to aggregate their data in one place. Over 6000 users use the app worldwide to aggregate their health and fitness data. It demonstrates that mobile apps can be used to enable interoperability. Challenges encountered in the research process included the different wireless protocols and standards used to communicate with wireless devices, the diversity of security and authorization protocols used to be able to exchange data with servers, and lack of standards usage, such as Health Level Seven, for medical information exchange. 
By limiting the negative effects of health data silos

  1. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological and Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) is described as an RDF resource page, enabling enhanced discoverability and retrieval of SAMOS data through parameter-based searches; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will populate a SPARQL endpoint providing expressive querying over our data files; (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners (described more fully in a companion poster).
Making published Linked Data
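The Google Refine (now OpenRefine) reconciliation protocol mentioned above returns, for each query string, a ranked list of candidate vocabulary matches with scores. A toy sketch of such a matching core using standard-library fuzzy matching; the vocabulary below is invented for illustration, not an actual R2R or SAMOS term list.

```python
# Sketch of the matching core behind a Refine-style reconciliation
# service: for a query term, return candidate vocabulary entries with
# similarity scores in [0, 1]. The vocabulary is a toy example.
from difflib import SequenceMatcher

VOCAB = {
    "AIRT": "air temperature",
    "SST": "sea surface temperature",
    "WSPD": "wind speed",
}

def reconcile(query: str, limit: int = 3) -> list:
    """Return [(term_id, score)] candidates, best match first."""
    scored = [
        (term_id, SequenceMatcher(None, query.lower(), label).ratio())
        for term_id, label in VOCAB.items()
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:limit]

print(reconcile("sea surface temp")[0][0])  # best candidate term id
```

A real service would wrap this in the reconciliation API's JSON request/response envelope and match against controlled vocabularies (e.g. NERC Vocabulary Server terms) rather than an in-memory dict.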

  2. Model of the naval base logistic interoperability within the multinational operations

    Directory of Open Access Journals (Sweden)

    Bohdan Pac

    2011-12-01

    Full Text Available The paper concerns a model of naval base logistic interoperability within multinational operations conducted at sea by NATO or EU nations. The model includes the set of logistic requirements that NATO and the EU expect from contributing nations in the area of logistic support provided to forces operating out of their home bases. The model may reflect the scheme configuration, the set of requirements and its mathematical description for a naval base supporting multinational forces within maritime operations.

  3. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    Science.gov (United States)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services, from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observation System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g. ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; mechanisms for user involvement and collaboration; and mechanisms for supporting interdisciplinary and domain-specific applications. ESG

  4. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems serving discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and to allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that establish interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems.
The potential impact of implementing these solutions for

  5. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    Science.gov (United States)

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated.
Top balanced F-scores for gene, chemical and
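The BioC family of XML formats mentioned above represents documents as collections of passages carrying stand-off annotations. The following is a minimal sketch of building and parsing such a payload in Python; element names follow common descriptions of BioC (collection/document/passage/annotation), but the attribute details are illustrative, not a definitive rendering of CTD's pipeline format.

```python
import xml.etree.ElementTree as ET

def make_bioc_collection(doc_id, text, annotations):
    """Build a minimal BioC-style collection for one document.

    `annotations` is a list of (offset, length, text, type) tuples;
    the exact attribute names here are illustrative.
    """
    collection = ET.Element("collection")
    document = ET.SubElement(collection, "document")
    ET.SubElement(document, "id").text = doc_id
    passage = ET.SubElement(document, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    for ann_id, (start, length, ann_text, ann_type) in enumerate(annotations):
        ann = ET.SubElement(passage, "annotation", id=str(ann_id))
        infon = ET.SubElement(ann, "infon", key="type")
        infon.text = ann_type
        ET.SubElement(ann, "location", offset=str(start), length=str(length))
        ET.SubElement(ann, "text").text = ann_text
    return ET.tostring(collection, encoding="unicode")

# Hypothetical document and chemical annotation for illustration
xml_payload = make_bioc_collection(
    "PMID-0000001",
    "Cisplatin induces apoptosis.",
    [(0, 9, "Cisplatin", "Chemical")],
)
```

A remote NER service in the scheme described above would accept such a collection over HTTP and return it with its own annotations added.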

  6. tmBioC: improving interoperability of text-mining tools with BioC.

    Science.gov (United States)

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial efforts and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: Our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%.
The tmBioC toolkit

  7. Position paper: cognitive radio networking for multiple sensor network interoperability in mines

    CSIR Research Space (South Africa)

    Kagize, BM

    2008-01-01

Full Text Available These commercially available networks are purported to be self-organizing and self-correcting, though the software behind these networks is proprietary, with the caveat of inter-operability difficulties with other networks [5]. There is a non-proprietary and open...: Research challenges," - Ad Hoc Networks, 2006 - Elsevier [4] V Mhatre, C Rosenberg, "Homogeneous vs heterogeneous clustered sensor networks: a comparative study," - Communications, 2004 IEEE International Conference on, 2004 - ieeexplore.ieee.org [5...

  8. LEARNING TOOLS INTEROPERABILITY – A NEW STANDARD FOR INTEGRATION OF DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2015-06-01

Full Text Available For information technology in education there is always an issue of re-use of electronic educational resources and the possibility of transferring them from one virtual learning environment to another. Previously, standardized sets of files were used to serve this purpose, for example, SCORM packages. In this article the new standard Learning Tools Interoperability (LTI) is reviewed, which allows users from one environment to access resources in another environment. This makes it possible to integrate them into a single, shared distributed learning environment. The article gives examples of the practical use of the LTI standard in the Moodle learning management system using the External tool and LTI provider plugins.
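An LTI 1.0/1.1 launch is an OAuth 1.0a-signed form POST from the consumer (e.g. Moodle) to the tool provider. The sketch below shows only the HMAC-SHA1 signature computation over the OAuth base string; the endpoint, key, and identifiers are hypothetical, and real launches carry many more user and context parameters plus nonce/timestamp validation.

```python
import base64, hashlib, hmac
from urllib.parse import quote

def oauth1_hmac_sha1_signature(method, url, params, consumer_secret):
    """OAuth 1.0a signature: HMAC-SHA1 over METHOD&enc(url)&enc(sorted params)."""
    enc = lambda s: quote(str(s), safe="")
    norm = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base_string = "&".join([method.upper(), enc(url), enc(norm)])
    key = enc(consumer_secret) + "&"  # token secret is empty in an LTI launch
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical launch parameters; fixed values keep the example deterministic.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course42-tool1",
    "oauth_consumer_key": "demo-key",
    "oauth_nonce": "fixed-nonce-for-demo",
    "oauth_timestamp": "1700000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
signature = oauth1_hmac_sha1_signature(
    "POST", "https://provider.example.org/launch", launch, "demo-secret")
```

The provider recomputes the same signature from the received parameters and its copy of the shared secret; a match authenticates the launch.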

  9. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services.
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has

  10. Some topics in matrix iterative analysis

    International Nuclear Information System (INIS)

    Khandekar, D.C.; Menon, S.V.G.; Sahni, D.C.

    1984-01-01

This report deals with the general theory of matrix iterative analysis. The contents of the report are presented in the form of lecture notes, primarily because the report is the outcome of a series of lectures delivered in the Theoretical Reactor Physics Section of the Bhabha Atomic Research Centre, Bombay. The first six lectures are devoted to the mathematical preliminaries needed to fully understand the subject. The remaining lectures provide an introduction to various iterative methods and their intercomparison. (author)
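As an illustration of the kind of iterative method such lecture notes cover (the code is not taken from the report itself), here is a minimal Jacobi iteration for a strictly diagonally dominant system, a standard convergence case in matrix iterative analysis.

```python
def jacobi(A, b, iterations=50):
    """Solve A x = b by Jacobi iteration.

    Splitting A into its diagonal D and off-diagonal part R, iterate
    x_{k+1} = D^{-1} (b - R x_k); convergence is guaranteed when A is
    strictly diagonally dominant.
    """
    n = len(A)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

# Strictly diagonally dominant system with exact solution x = (1, 2)
A = [[4.0, 1.0],
     [2.0, 5.0]]
b = [6.0, 12.0]
x = jacobi(A, b)
```

Gauss-Seidel and successive over-relaxation follow the same splitting idea with different update orders, which is what makes their intercomparison natural.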

  11. Rolling Deck to Repository (R2R): Supporting Global Data Access Through the Ocean Data Interoperability Platform (ODIP)

    Science.gov (United States)

    Arko, R. A.; Stocks, K.; Chandler, C. L.; Smith, S. R.; Miller, S. P.; Maffei, A. R.; Glaves, H. M.; Carbotte, S. M.

    2013-12-01

The U.S. National Science Foundation supports a fleet of academic research vessels operating throughout the world's oceans. In addition to supporting the mission-specific goals of each expedition, these vessels routinely deploy a suite of underway environmental sensors, operating like mobile observatories. Recognizing that the data from these instruments have value beyond each cruise, NSF funded R2R in 2009 to ensure that these data are routinely captured, cataloged and described, and submitted to the appropriate national repository for long-term public access. In 2013, R2R joined the Ocean Data Interoperability Platform (ODIP; http://odip.org/). The goal of ODIP is to remove barriers to the effective sharing of data across scientific domains and international boundaries, by providing a forum to harmonize diverse regional systems. To advance this goal, ODIP organizes international workshops to foster the development of common standards and develop prototypes to evaluate and test potential standards and interoperability solutions. ODIP includes major organizations engaged in ocean data stewardship in the EU, US, and Australia, supported by the International Oceanographic Data and Information Exchange (IODE). Within the broad scope of ODIP, R2R focuses on contributions in 4 key areas:
    ● Implement a 'Linked Open Data' approach to disseminate data and documentation, using existing World Wide Web Consortium (W3C) specifications and machine-readable formats. Exposing content as Linked Open Data will provide a simple mechanism for ODIP collaborators to browse and compare data sets among repositories.
    ● Map key vocabularies used by R2R to their European and Australian counterparts. The existing heterogeneity among terms inhibits data discoverability, as a user searching on the term with which s/he is familiar may not find all data of interest. Mapping key terms across the different ODIP partners, relying on the backbone thesaurus provided by the NERC Vocabulary Server
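The vocabulary-mapping idea above can be sketched with plain RDF-style triples: a skos:exactMatch link lets a search expand a familiar term to its counterparts in other vocabularies. The URIs below are hypothetical placeholders, not actual R2R or NERC Vocabulary Server terms.

```python
# skos:exactMatch is a real SKOS property; the subject/object URIs are invented.
EXACT_MATCH = "http://www.w3.org/2004/02/skos/core#exactMatch"

triples = [
    ("http://example.org/r2r/device/CTD", EXACT_MATCH,
     "http://vocab.example.org/nvs/L05/130"),
    ("http://example.org/r2r/device/ADCP", EXACT_MATCH,
     "http://vocab.example.org/nvs/L05/115"),
]

def equivalent_terms(term, graph):
    """Collect a term plus everything linked to it by skos:exactMatch,
    following the links in both directions."""
    out = {term}
    for s, p, o in graph:
        if p == EXACT_MATCH:
            if s in out:
                out.add(o)
            if o in out:
                out.add(s)
    return out
```

A discovery service that expands every query term this way finds data sets described with any partner's vocabulary, which is the interoperability gain being sought.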

  12. Hacking the Matrix.

    Science.gov (United States)

    Czerwinski, Michael; Spence, Jason R

    2017-01-05

    Recently in Nature, Gjorevski et al. (2016) describe a fully defined synthetic hydrogel that mimics the extracellular matrix to support in vitro growth of intestinal stem cells and organoids. The hydrogel allows exquisite control over the chemical and physical in vitro niche and enables identification of regulatory properties of the matrix. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. The Matrix Organization Revisited

    DEFF Research Database (Denmark)

    Gattiker, Urs E.; Ulhøi, John Parm

    1999-01-01

This paper gives a short overview of matrix structure and technology management. It outlines some of the characteristics and also points out that many organizations may actually be hybrids (i.e. mix several ways of organizing to allocate resources effectively).

  14. Asymptotic behaviour of a rescattering series for nonlinear reggeons

    International Nuclear Information System (INIS)

    Akkelin, S.V.; Martynov, E.S.

    1990-01-01

Series of elastic re-scatterings (both quasi-eikonal and U-matrix) for reggeons with nonlinear trajectories are estimated asymptotically. The calculations are performed for models of supercritical and dipole pomerons. A weak dependence of the re-scattering series on the nonlinearity of the reggeon trajectory is revealed. 13 refs.; 3 figs

  15. Estimation of pure autoregressive vector models for revenue series ...

    African Journals Online (AJOL)

This paper aims at applying a multivariate extension of Box-Jenkins univariate time series modeling to three vector series. General autoregressive vector models with time-varying coefficients are estimated. The first vector is a response vector, while the others are predictor vectors. By matrix expansion each vector, whether ...
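A first-order vector autoregression relates each observation vector to the previous one, x_{t+1} = A x_t (plus noise). Below is a minimal noise-free sketch of recovering the coefficient matrix by least squares, using invented numbers rather than the paper's revenue data; real series include noise and, as the abstract notes, possibly time-varying coefficients.

```python
import numpy as np

# Hypothetical 2-variable series following x_{t+1} = A x_t exactly (no noise)
A_true = np.array([[0.6, 0.2],
                   [0.1, 0.7]])
x = np.zeros((30, 2))
x[0] = [1.0, 2.0]
for t in range(29):
    x[t + 1] = A_true @ x[t]

# Stack x_t as predictors and x_{t+1} as responses; in row form the model
# reads Y = X A^T, so lstsq returns A^T and we transpose it back.
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

With noise added, the same normal-equations estimate remains the standard VAR(1) fit; allowing A to vary with t leads to the time-varying-coefficient models the paper estimates.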

  16. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759
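The idea of a uniform tasking interface can be illustrated by assembling one JSON payload shape that works for any device, letting a single client drive heterogeneous hardware. The field names here are invented for illustration and are not the Extended SensorThings API schema.

```python
import json

def build_tasking_request(device_id, command, parameters):
    """Assemble a uniform tasking payload for any device.

    Field names ("device", "task", "command", "parameters") are
    illustrative placeholders, not a standardized schema.
    """
    return json.dumps({
        "device": device_id,
        "task": {"command": command, "parameters": parameters},
    }, sort_keys=True)

# The same shape serves a lamp, a thermostat, a camera, ...
payload = build_tasking_request("lamp-01", "setBrightness", {"level": 80})
```

A per-device adapter would then translate this neutral payload into each manufacturer's proprietary protocol, which is where a tasking capability description earns its keep.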

  17. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    Science.gov (United States)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information System (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. This framework leverages the semantic equivalence of concepts between domains through bridge ontology and facilitates the integrated use of different domain data, which has long been considered an essential strength of GIS, but is impeded by the lack of understanding about the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make the integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success, as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way for data integration across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
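The bridge-ontology idea can be sketched as a concept mapping plus a comparison rule: translate a newly observed topographic class into the land-use domain, then compare it with the recorded land-use class. The concept names below are hypothetical, not taken from the paper's ontology.

```python
# Hypothetical equivalences between a topographic-map domain and a
# land-use domain, standing in for a bridge ontology.
BRIDGE = {
    "topo:Building": "landuse:Residential",
    "topo:Cropland": "landuse:Agricultural",
    "topo:Road": "landuse:Transport",
}

def detect_change(old_landuse_class, new_topo_class):
    """Map the new topographic observation into the land-use domain
    and compare it with the recorded land-use class."""
    mapped = BRIDGE.get(new_topo_class)
    if mapped is None:
        return "unknown"  # no semantic equivalence established
    return "changed" if mapped != old_landuse_class else "unchanged"
```

A production rule base would add the change-scenario rules the paper describes (e.g. which transitions are plausible), but the semantic translation step is what the bridge ontology contributes.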

  18. Harmonising phenomics information for a better interoperability in the rare disease field.

    Science.gov (United States)

    Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana

    2018-02-07

    HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project which started in 2016 funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology), HPO (the Human Phenotype Ontology) as well as PhenoTips software for the capture and sharing of structured phenotypic data for RD patients. Our project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together, and made available through commonly used software repositories. The project workplan follows three main objectives: The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving diagnostics and delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view on human pathobiology. Achievements in Year 1. Copyright © 2018. Published by Elsevier Masson SAS.

  19. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things.

    Science.gov (United States)

    Kotsev, Alexander; Schade, Sven; Craglia, Massimo; Gerboles, Michel; Spinelle, Laurent; Signorini, Marco

    2016-03-18

    The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.

  20. Measures for interoperability of phenotypic data: minimum information requirements and formatting.

    Science.gov (United States)

    Ćwiek-Kupczyńska, Hanna; Altmann, Thomas; Arend, Daniel; Arnaud, Elizabeth; Chen, Dijun; Cornut, Guillaume; Fiorani, Fabio; Frohmberg, Wojciech; Junker, Astrid; Klukas, Christian; Lange, Matthias; Mazurek, Cezary; Nafissi, Anahita; Neveu, Pascal; van Oeveren, Jan; Pommier, Cyril; Poorter, Hendrik; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Scholz, Uwe; van Schriek, Marco; Seren, Ümit; Usadel, Björn; Weise, Stephan; Kersey, Paul; Krajewski, Paweł

    2016-01-01

Plant phenotypic data shrouds a wealth of information which, when accurately analysed and linked to other data types, brings to light the knowledge about the mechanisms of life. As phenotyping is a field of research comprising manifold, diverse and time-consuming experiments, the findings can be fostered by reusing and combining existing datasets. Their correct interpretation, and thus replicability, comparability and interoperability, is possible provided that the collected observations are equipped with an adequate set of metadata. So far there have been no common standards governing phenotypic data description, which has hampered data exchange and reuse. In this paper we propose guidelines for proper handling of the information about plant phenotyping experiments, in terms of both the recommended content of the description and its formatting. We provide a document called "Minimum Information About a Plant Phenotyping Experiment", which specifies what information about each experiment should be given, and a Phenotyping Configuration for the ISA-Tab format, which allows this information to be organised practically within a dataset. We provide examples of ISA-Tab-formatted phenotypic data, and a general description of a few systems where the recommendations have been implemented. Acceptance of the rules described in this paper by the plant phenotyping community will help to achieve findable, accessible, interoperable and reusable data.
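ISA-Tab organises experimental metadata and observations as tab-delimited tables. A small sketch of writing and reading such a table in Python follows; the column names are illustrative, not the exact fields defined by the Phenotyping Configuration.

```python
import csv
import io

# Hypothetical observation rows for a plant-height trait
rows = [
    {"Assay Name": "plant_1", "Trait": "plant height", "Value": "34.2", "Unit": "cm"},
    {"Assay Name": "plant_2", "Trait": "plant height", "Value": "29.8", "Unit": "cm"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf,
    fieldnames=["Assay Name", "Trait", "Value", "Unit"],
    delimiter="\t",  # ISA-Tab files are tab-delimited
)
writer.writeheader()
writer.writerows(rows)
tsv_text = buf.getvalue()
```

Keeping the trait name and unit as explicit columns, rather than folding them into free text, is what makes the observations machine-comparable across datasets.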

  1. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. A feature which has better discriminating ability on images derived from a certain sensor may not adapt to segmenting images derived from other sensors, which degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation features, which refers to a feature's ability to adapt to raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, comprising first-level feature evaluation based on segmentation error rate and second-level feature evaluation based on a decision tree. The proposed method is performed on a number of fingerprint databases obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and the features with good evaluation results acquire better segmentation accuracies on images originating from different sensors.
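The first-level criterion, segmentation error rate, can be computed as the fraction of image blocks whose predicted foreground/background label disagrees with the ground truth. A minimal sketch with invented labels:

```python
def segmentation_error_rate(predicted, ground_truth):
    """Fraction of blocks whose foreground/background label disagrees
    with the ground truth."""
    assert len(predicted) == len(ground_truth)
    errors = sum(p != g for p, g in zip(predicted, ground_truth))
    return errors / len(ground_truth)

# 1 = foreground (fingerprint area), 0 = background; hypothetical labels
truth = [1, 1, 1, 0, 0, 0, 1, 0]
pred  = [1, 0, 1, 0, 1, 0, 1, 0]
err = segmentation_error_rate(pred, truth)
```

Computing this rate per feature and per sensor database exposes exactly the interoperability gap the paper studies: a feature whose error rate is low on one sensor's images but high on another's does not transfer.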

  2. Next Generation Air Quality Platform: Openness and Interoperability for the Internet of Things

    Directory of Open Access Journals (Sweden)

    Alexander Kotsev

    2016-03-01

    Full Text Available The widespread diffusion of sensors, mobile devices, social media and open data are reconfiguring the way data underpinning policy and science are being produced and consumed. This in turn is creating both opportunities and challenges for policy-making and science. There can be major benefits from the deployment of the IoT in smart cities and environmental monitoring, but to realize such benefits, and reduce potential risks, there is an urgent need to address current limitations, including the interoperability of sensors, data quality, security of access and new methods for spatio-temporal analysis. Within this context, the manuscript provides an overview of the AirSensEUR project, which establishes an affordable open software/hardware multi-sensor platform, which is nonetheless able to monitor air pollution at low concentration levels. AirSensEUR is described from the perspective of interoperable data management with emphasis on possible use case scenarios, where reliable and timely air quality data would be essential.

  3. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Developing a chemistry algorithm is hard and time-consuming; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach the Tuning and Analysis Utility (TAU) has been coupled with NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  4. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

The evaluation of a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and partially a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  5. The PSML format and library for norm-conserving pseudopotential data curation and interoperability

    Science.gov (United States)

    García, Alberto; Verstraete, Matthieu J.; Pouillon, Yann; Junquera, Javier

    2018-06-01

    Norm-conserving pseudopotentials are used by a significant number of electronic-structure packages, but the practical differences among codes in the handling of the associated data hinder their interoperability and make it difficult to compare their results. At the same time, existing formats lack provenance data, which makes it difficult to track and document computational workflows. To address these problems, we first propose a file format (PSML) that maps the basic concepts of the norm-conserving pseudopotential domain in a flexible form and supports the inclusion of provenance information and other important metadata. Second, we provide a software library (libPSML) that can be used by electronic structure codes to transparently extract the information in the file and adapt it to their own data structures, or to create converters for other formats. Support for the new file format has been already implemented in several pseudopotential generator programs (including ATOM and ONCVPSP), and the library has been linked with SIESTA and ABINIT, allowing them to work with the same pseudopotential operator (with the same local part and fully non-local projectors) thus easing the comparison of their results for the structural and electronic properties, as shown for several example systems. This methodology can be easily transferred to any other package that uses norm-conserving pseudopotentials, and offers a proof-of-concept for a general approach to interoperability.

  6. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Directory of Open Access Journals (Sweden)

    Brígida Teixeira

    2018-02-01

Full Text Available Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm that arises from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of the distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems that are capable of dealing with the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that allows interoperability between heterogeneous energy and power simulation systems through the use of ontologies, allowing the simulation of scenarios with a high degree of complexity, through the cooperation of the individual capacities of each system. A case study based on real data is presented in order to demonstrate the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid of a real university campus, from the perspective of the network manager and also of its consumers/producers, in a projection for a typical day of the winter of 2050.

  7. Inter-operator and inter-device agreement and reliability of the SEM Scanner.

    Science.gov (United States)

    Clendenin, Marta; Jaradeh, Kindah; Shamirian, Anasheh; Rhodes, Shannon L

    2015-02-01

The SEM Scanner is a medical device designed for use by healthcare providers as part of pressure ulcer prevention programs. The objective of this study was to evaluate the inter-rater and inter-device agreement and reliability of the SEM Scanner. Thirty-one (31) volunteers free of pressure ulcers or broken skin at the sternum, sacrum, and heels were assessed with the SEM Scanner. Each of three operators utilized each of three devices to collect readings from four anatomical sites (sternum, sacrum, left and right heels) on each subject for a total of 108 readings per subject collected over approximately 30 min. For each combination of operator-device-anatomical site, three SEM readings were collected. Inter-operator and inter-device agreement and reliability were estimated. Over the course of this study, more than 3000 SEM Scanner readings were collected. Agreement between operators was good with mean differences ranging from -0.01 to 0.11. Inter-operator and inter-device reliability exceeded 0.80 at all anatomical sites assessed. The results of this study demonstrate the high reliability and good agreement of the SEM Scanner across different operators and different devices. Given the limitations of current methods to prevent and detect pressure ulcers, the SEM Scanner shows promise as an objective, reliable tool for assessing the presence or absence of pressure-induced tissue damage such as pressure ulcers. Copyright © 2015 Bruin Biometrics, LLC. Published by Elsevier Ltd. All rights reserved.

  8. The development of clinical document standards for semantic interoperability in china.

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping

    2011-12-01

    This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study.

  9. Electronic Toll Collection Systems and their Interoperability: The State of Art

    Energy Technology Data Exchange (ETDEWEB)

    Heras Molina, J. de la; Gomez Sanchez, J.; Vassallo Magro, J.M.

    2016-07-01

    The European Electronic Toll Service (EETS) was created in 2004 with the aim of ensuring interoperability among the existing electronic toll collection (ETC) systems in Europe. However, a lack of cooperation between groups of stakeholders has made it impossible to achieve this goal more than ten years later. The purpose of this research is to determine the best way to achieve interoperability among the different ETC systems in Europe. Our study reviews the six main ETC systems available worldwide: Automatic Number Plate Recognition (ANPR), Dedicated Short-Range Communications (DSRC), Radio Frequency Identification (RFID), satellite systems (GNSS), tachograph, and mobile communications tolling systems. The research also provides some insight into different emerging technologies. By focusing on the operational and strategic aspects of each technology, we identify their main strengths, weaknesses, opportunities and threats, and make recommendations to improve the current framework. The research concludes that, given the diversity of advantages and inconveniences offered by each system, the selection of a certain ETC technology should also take into account its potential to overcome the weaknesses of the current ETC framework. Along this line, different policy recommendations are proposed to improve the present ETC strategy in the EU. (Author)

  10. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    Science.gov (United States)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.

  11. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to that of the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity impedes the interconnection of IoT devices and limits the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices through a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
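
    The core idea of a tasking capability description — one uniform task request translated into each vendor's proprietary payload — can be sketched as follows. The device names, parameter names and payload fields are hypothetical, not taken from the paper or from the OGC SensorThings API.

```python
# Illustrative sketch: a "capability" maps uniform parameter names onto a
# device's proprietary field names, so one task controls different devices.

def make_capability(name, parameter_map):
    """Return a translator from uniform task parameters to device payloads."""
    def translate(task):
        return {"device": name,
                "payload": {parameter_map[k]: v for k, v in task.items()}}
    return translate

# Two hypothetical devices from different vendors with different protocols.
lamp = make_capability("vendor_a_lamp", {"power": "pwr", "level": "brightness"})
fan = make_capability("vendor_b_fan", {"power": "on_off", "level": "speed"})

uniform_task = {"power": "on", "level": 80}   # one interface for both devices
print(lamp(uniform_task))
print(fan(uniform_task))
```

In the paper's actual design the translation is driven by a machine-readable description and exposed through a web service rather than in-process functions.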

  12. Interoperability at ESA Heliophysics Science Archives: IVOA, HAPI and other implementations

    Science.gov (United States)

    Martinez-Garcia, B.; Cook, J. P.; Perez, H.; Fernandez, M.; De Teodoro, P.; Osuna, P.; Arnaud, M.; Arviset, C.

    2017-12-01

    The data of ESA heliophysics science missions are preserved at the ESAC Science Data Centre (ESDC). The ESDC aims at the long-term preservation of those data, which include missions such as Ulysses, Soho, Proba-2, Cluster, Double Star and, in the future, Solar Orbiter. Scientists have access to these data through web services, command-line and graphical user interfaces for each of the corresponding science mission archives. The International Virtual Observatory Alliance (IVOA) provides technical standards that allow interoperability among the different systems that implement them. By adopting some IVOA standards, the ESA heliophysics archives are able to share their data with tools and services that are VO-compatible. Implementations of those standards can be found in the existing archives: the Ulysses Final Archive (UFA) and the Soho Science Archive (SSA) already make use of the VOTable format definition and the Simple Application Messaging Protocol (SAMP). For re-engineered or new archives, the implementation of services through the Table Access Protocol (TAP) or the Universal Worker Service (UWS) will leverage this interoperability; this will be the case for the Proba-2 Science Archive (P2SA) and the Solar Orbiter Archive (SOAR). We present the IVOA standards already used by the ESA heliophysics archives and the work currently ongoing.
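
    The interoperability gained from TAP can be made concrete: a synchronous TAP query is just a form-encoded request to a service's /sync endpoint, so any VO-compatible client can talk to any compliant archive. The sketch below builds such a request per the IVOA TAP standard; the base URL and table name are placeholders, not real ESDC endpoints.

```python
from urllib.parse import urlencode

# Build a synchronous TAP query request (no network call is made here).
base_url = "https://archive.example.org/tap"          # hypothetical service
params = {
    "REQUEST": "doQuery",
    "LANG": "ADQL",                                   # query language per TAP
    "FORMAT": "votable",                              # VOTable response format
    "QUERY": "SELECT TOP 10 * FROM observations",     # hypothetical table
}
request_url = base_url + "/sync?" + urlencode(params)
print(request_url)
```

An HTTP GET on this URL (or a POST of the same parameters) would return a VOTable, which SAMP-aware tools can then pass between applications.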

  13. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    Science.gov (United States)

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a major burden on the global health economy. T2DM care management requires a multi-disciplinary and multi-organizational approach. Because of differences in languages and terminologies, education, experience, skills, etc., such an approach poses a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). The Business Process Modeling Notation (BPMN) is used to represent the functional aspects of the system. The resulting system architecture is presented using a GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach accounts for the compositional nature of the real-world system and its functionalities, guarantees coherence, and supports correct inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In this way, intelligent, adaptive and interoperable T2DM care systems can be derived from the presented model, as shown in another publication.

  14. METHODS FOR DESCRIPTION OF EDUCATION AND SCIENTIFIC SERVICES IN INFORMATION AND EDUCATION ON THE BASIS OF INTEROPERABILITY STACK EIF

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Pavlova

    2015-01-01

    The article presents a methodology for describing scientific and educational services in information and education environments on the basis of the European Interoperability Framework (EIF) interoperability stack. It describes the factors used to characterize services at each level of the methodology, the tools used to describe the services, and the content of those descriptions. We also relate the description methodology to the life cycle of the service. The article presents an example of a service description produced according to the methodology, taking into account current educational and professional standards, ITIL recommendations, an OWL-based ontology and a WSDL description.

  15. Summation of series

    CERN Document Server

    Jolley, LB W

    2004-01-01

    Over 1,100 common series, all grouped for easy reference. Arranged by category, these series include arithmetical and geometrical progressions, powers and products of natural numbers, figurate and polygonal numbers, inverse natural numbers, exponential and logarithmic series, binomials, simple inverse products, factorials, trigonometrical and hyperbolic expansions, and additional series. 1961 edition.
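
    Closed forms of the kind tabulated in the book are easy to verify numerically. The sketch below checks two classic entries: the geometric series Σ rⁿ = 1/(1−r) for |r| < 1, and the inverse-squares series Σ 1/n² = π²/6.

```python
import math

# Geometric series: partial sums converge rapidly for |r| < 1.
r = 0.5
geometric = sum(r**n for n in range(200))

# Inverse squares: slow convergence (tail ~ 1/N), so take many terms.
inverse_squares = sum(1 / n**2 for n in range(1, 100000))

assert abs(geometric - 1 / (1 - r)) < 1e-12
assert abs(inverse_squares - math.pi**2 / 6) < 1e-4
print(geometric, inverse_squares)
```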

  16. Random Matrix Theory and Econophysics

    Science.gov (United States)

    Rosenow, Bernd

    2000-03-01

    Random Matrix Theory (RMT) [1] is used in many branches of physics as a ``zero information hypothesis''. It describes generic behavior of different classes of systems, while deviations from its universal predictions allow one to identify system-specific properties. We use methods of RMT to analyze the cross-correlation matrix C of stock price changes [2] of the largest 1000 US companies. In addition to its scientific interest, the study of correlations between the returns of different stocks is also of practical relevance in quantifying the risk of a given stock portfolio. We find [3,4] that the statistics of most of the eigenvalues of the spectrum of C agree with the predictions of RMT, while there are deviations for some of the largest eigenvalues. We interpret these deviations as a system-specific property, i.e., as containing genuine information about correlations in the stock market. We demonstrate that C shares universal properties with the Gaussian orthogonal ensemble of random matrices. Furthermore, we analyze the eigenvectors of C through their inverse participation ratio and find eigenvectors with large ratios at both edges of the eigenvalue spectrum - a situation reminiscent of localization theory results. This work was done in collaboration with V. Plerou, P. Gopikrishnan, T. Guhr, L.A.N. Amaral, and H.E. Stanley and is related to recent work of Laloux et al. 1. T. Guhr, A. Müller-Groeling, and H.A. Weidenmüller, ``Random Matrix Theories in Quantum Physics: Common Concepts'', Phys. Rep. 299, 190 (1998). 2. See, e.g. R.N. Mantegna and H.E. Stanley, Econophysics: Correlations and Complexity in Finance (Cambridge University Press, Cambridge, England, 1999). 3. V. Plerou, P. Gopikrishnan, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Universal and Nonuniversal Properties of Cross Correlations in Financial Time Series'', Phys. Rev. Lett. 83, 1471 (1999). 4. V. Plerou, P. Gopikrishnan, T. Guhr, B. Rosenow, L.A.N. Amaral, and H.E. Stanley, ``Random Matrix Theory
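
    The RMT comparison described above can be sketched on synthetic data (not real stock returns). For T observations of N uncorrelated returns, the eigenvalues of the sample correlation matrix C should fall within the Marchenko-Pastur bounds λ± = (1 ± √(N/T))², and the inverse participation ratio of each eigenvector should be of order 1/N for delocalized vectors.

```python
import numpy as np

# Synthetic "returns": T observations of N independent standardized series.
rng = np.random.default_rng(1)
N, T = 100, 1000
returns = rng.standard_normal((T, N))

# Sample cross-correlation matrix and its spectrum.
C = np.corrcoef(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)

# Marchenko-Pastur bounds for Q = T/N.
Q = T / N
lam_min = (1 - np.sqrt(1 / Q)) ** 2
lam_max = (1 + np.sqrt(1 / Q)) ** 2

# Inverse participation ratio per eigenvector (columns of eigvecs):
# ~3/N on average for extended vectors, larger for localized ones.
ipr = (eigvecs ** 4).sum(axis=0)

print(eigvals.min(), eigvals.max(), lam_min, lam_max)
print(ipr.mean())
```

On real market data, the largest eigenvalue escapes far above λ₊ (the "market mode"); that deviation is the system-specific information the abstract refers to.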

  17. Contribution to high voltage matrix switches reliability

    International Nuclear Information System (INIS)

    Lausenaz, Yvan

    2000-01-01

    Nowadays, the requirements on power electronic equipment are demanding in terms of performance, quality and reliability, while costs have to be reduced in order to satisfy market constraints. To provide low cost, reliability and performance, many standard components are mass-produced; the construction of specific products can then be approached in two different ways: on the one hand, one can produce specific components, with delays, cost overruns and possibly quality and reliability problems; on the other hand, one can use standard components in adapted topologies. The CEA at Pierrelatte has adopted the latter approach to power electronics design for the development of its high-voltage pulsed power converters. The technique consists in using standard components and associating them in series and in parallel. The matrix constitutes a high-voltage macro-switch in which the electrical stresses are distributed among the synchronized components. This study deals with the reliability of these structures. It brings out the high reliability of MOSFET matrix associations. Thanks to several homemade test facilities, we obtained extensive data on the components we use. Understanding the mechanisms of defect propagation in matrix structures allowed us to establish the necessity of a robust drive system, adapted clamping voltage protection, and careful geometrical construction. These reliability considerations for matrix associations notably enabled the construction of a new matrix structure incorporating all the solutions that ensure reliability. Reliable and robust, this product has already reached the industrial stage. (author) [fr
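
    The reliability benefit of a series/parallel matrix can be illustrated with the textbook series-parallel model (an illustration only, not the author's analysis, which also covers defect propagation and drive robustness): a macro-switch of n parallel branches, each branch being m identical devices in series, with per-device reliability r.

```python
# Textbook series-parallel reliability: a branch works only if all m series
# devices work (r**m); the macro-switch works if at least one branch works.
def matrix_reliability(r, m, n):
    branch = r ** m
    return 1 - (1 - branch) ** n

# Redundancy pays: compare a single device with a 3x3 matrix of the same parts.
single = matrix_reliability(0.99, 1, 1)
matrix = matrix_reliability(0.99, 3, 3)
print(single, matrix)
```

This simple model ignores voltage/current derating and common-mode failures, both of which the study addresses through clamping protection and synchronized drive.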

  18. An EarthCube Roadmap for Cross-Domain Interoperability in the Geosciences: Governance Aspects

    Science.gov (United States)

    Zaslavsky, I.; Couch, A.; Richard, S. M.; Valentine, D. W.; Stocks, K.; Murphy, P.; Lehnert, K. A.

    2012-12-01

    The goal of cross-domain interoperability is to enable reuse of data and models outside the original context in which these data and models are collected and used and to facilitate analysis and modeling of physical processes that are not confined to disciplinary or jurisdictional boundaries. A new research initiative of the U.S. National Science Foundation, called EarthCube, is developing a roadmap to address challenges of interoperability in the earth sciences and create a blueprint for community-guided cyberinfrastructure accessible to a broad range of geoscience researchers and students. Infrastructure readiness for cross-domain interoperability encompasses the capabilities that need to be in place for such secondary or derivative use of information to be both scientifically sound and technically feasible. In this initial assessment we consider the following four basic infrastructure components that need to be present to enable cross-domain interoperability in the geosciences: metadata catalogs (at the appropriate community-defined granularity) that provide standard discovery services over datasets, data access services, models and other resources of the domain; vocabularies that support unambiguous interpretation of domain resources and metadata; services used to access data repositories and other resources including models, visualizations and workflows; and formal information models that define the structure and semantics of the information returned on service requests. General standards for these components have been proposed; they form the backbone of large-scale integration activities in the geosciences. By utilizing these standards, EarthCube research designs can take advantage of data discovery across disciplines using the commonality in key data characteristics related to shared models of spatial features, time measurements, and observations. Data can be discovered via federated catalogs and linked nomenclatures from neighboring domains, while standard data

  19. Matrix Information Geometry

    CERN Document Server

    Bhatia, Rajendra

    2013-01-01

    This book is an outcome of the Indo-French Workshop on Matrix Information Geometries (MIG): Applications in Sensor and Cognitive Systems Engineering, which was held at Ecole Polytechnique and the Thales Research and Technology Center, Palaiseau, France, on February 23-25, 2011. The workshop was generously funded by the Indo-French Centre for the Promotion of Advanced Research (IFCPAR). During the event, 22 renowned invited French and Indian speakers gave lectures on their areas of expertise within the field of matrix analysis or processing. From these talks, a total of 17 original contributions or state-of-the-art chapters have been assembled in this volume. All articles were thoroughly peer-reviewed and improved according to the suggestions of the international referees. The 17 contributions presented are organized in three parts: (1) state-of-the-art surveys & original matrix theory work, (2) advanced matrix theory for radar processing, and (3) matrix-based signal processing applications.

  20. Matrix interdiction problem

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Feng [Los Alamos National Laboratory; Kasiviswanathan, Shiva [Los Alamos National Laboratory

    2010-01-01

    In the matrix interdiction problem, a real-valued matrix and an integer k are given. The objective is to remove k columns such that the sum over all rows of the maximum entry in each row is minimized. This combinatorial problem is closely related to the bipartite network interdiction problem, which can be applied to prioritize border checkpoints in order to minimize the probability that an adversary can successfully cross the border. After introducing the matrix interdiction problem, we prove that the problem is NP-hard, and even NP-hard to approximate within an additive n^γ factor for a fixed constant γ. We also present an algorithm for this problem that achieves an (n-k) multiplicative approximation ratio.