WorldWideScience

Sample records for heterogeneous subsystems interoperability

  1. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  2. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Sohail Jabbar

    2017-01-01

    Interoperability remains a significant burden for developers of Internet of Things systems, for two reasons: IoT devices are highly heterogeneous in their underlying communication protocols, data formats, and technologies; and, in the absence of widely accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status, and the information exchanged between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for the semantic annotation of data from heterogeneous IoT devices is proposed. The Resource Description Framework (RDF) is a Semantic Web framework that relates things using triples to make them semantically meaningful; RDF-annotated patient data thus becomes semantically interoperable, and SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used Tableau, Gruff-6.2.0, and MySQL.
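
    The triple-plus-query pattern the abstract describes can be illustrated with a toy, library-free sketch. The names (ex:patient1, ex:hasReading, etc.) are hypothetical placeholders; a real system would use an RDF library such as rdflib, proper IRIs, and actual SPARQL rather than this hand-rolled pattern matcher.

```python
# Each fact is a (subject, predicate, object) triple, as in RDF.
triples = [
    ("ex:patient1", "ex:monitoredBy", "ex:glucoseSensor"),
    ("ex:patient1", "ex:hasReading", "ex:reading42"),
    ("ex:reading42", "ex:value", "5.8"),
    ("ex:reading42", "ex:unit", "mmol/L"),
]

def query(pattern):
    """Match a triple pattern; '?var' positions are wildcards, as in SPARQL."""
    results = []
    for s, p, o in triples:
        binding = {}
        ok = True
        for part, val in zip(pattern, (s, p, o)):
            if part.startswith("?"):
                binding[part] = val      # bind the variable to this value
            elif part != val:
                ok = False               # constant position does not match
                break
        if ok:
            results.append(binding)
    return results

# "Which readings does patient1 have?" (analogous to a SPARQL SELECT)
readings = query(("ex:patient1", "ex:hasReading", "?r"))
```

    The same pattern-with-variables idea is what a SPARQL engine evaluates over an RDF graph, just with joins across multiple patterns.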

  3. Internet of Things Heterogeneous Interoperable Network Architecture Design

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art indicates that no mature IoT architecture is yet available. This thesis contributes an abstract, generic IoT system reference architecture together with specifications. The novelties of the thesis are its proposed solutions and implementations....... It is shown that reducing data at the source yields large gains in vertical scalability and, indirectly, in horizontal scalability as well. A second non-functional contribution is a heterogeneous interoperable network architecture for constrained Things. To eliminate a growing number of gateways, a Wi-Fi access point...... combined with Bluetooth and Zigbee (the new access point is called BZ-Fi) is proposed. Co-existence of the Wi-Fi, Bluetooth, and Zigbee network technologies results in interference; to reduce it, orthogonal frequency division multiplexing (OFDM) is proposed to be implemented in Bluetooth and Zigbee. The proposed...

  4. A state-of-the-art review of interoperability amongst heterogeneous software systems

    Carlos Mario Zapata Jaramillo

    2009-05-01

    Information systems are sets of interacting elements aimed at supporting entrepreneurial or business activities; they cannot therefore coexist in isolation, but require their data to be shared so as to increase productivity. Such systems’ interoperability is normally accomplished through markup standards, query languages and web services. The literature contains work related to software system interoperability; however, it reveals some difficulties, such as the need to use the same platforms and different programming languages, the use of read-only languages, and deficiencies in the formalism used for achieving interoperability. This paper presents a critical review of the advances made regarding heterogeneous software systems’ interoperability.

  5. Semantic Registration and Discovery System of Subsystems and Services within an Interoperable Coordination Platform in Smart Cities

    Gregorio Rubio

    2016-06-01

    Smart subsystems like traffic, Smart Homes, the Smart Grid, outdoor lighting, etc. are built in many urban areas, each with a set of services that are offered to citizens. These subsystems are managed by self-contained embedded systems; however, coordination and cooperation between them are scarce. An integration of these systems which truly represents a “system of systems” could introduce further benefits, such as allowing the development of new applications and collective optimization. The integration should allow maximum reusability of the available services provided by entities (e.g., sensors or Wireless Sensor Networks). Thus, it is of major importance to facilitate the discovery and registration of available services and subsystems in an integrated way. Therefore, an ontology-based and automatic system for subsystem and service registration and discovery is presented. Using this proposed system, heterogeneous subsystems and services can be registered and discovered in a dynamic manner with additional semantic annotations. In this way, users are able to build customized applications across different subsystems by using available services. The proposed system has been fully implemented and a case study is presented to show the usefulness of the proposed method.
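
    The register-then-discover-by-annotation idea can be sketched minimally as below. The annotation vocabulary (domain, provides, measures) is hypothetical; the paper's platform uses a full ontology with semantic reasoning, which this flat lookup omits.

```python
registry = []  # each entry: (service_id, annotations)

def register(service_id, **annotations):
    """Register a subsystem service with its semantic annotations."""
    registry.append((service_id, annotations))

def discover(**required):
    """Return services whose annotations include all required key/value pairs."""
    return [sid for sid, ann in registry
            if all(ann.get(k) == v for k, v in required.items())]

# Heterogeneous subsystems register their services in one place.
register("lighting/dimmer-3", domain="outdoor-lighting", provides="actuation")
register("traffic/cam-7", domain="traffic", provides="sensing", measures="vehicle-count")
register("grid/meter-12", domain="smart-grid", provides="sensing", measures="power")

# An application discovers all sensing services, regardless of subsystem.
sensors = discover(provides="sensing")
```

    With an ontology in place of the dict, discovery could also match subclasses and inferred properties, not just exact annotation values.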

  8. Interoperability

    Savin, Andrej

    be limited. Fourth, data protection “by design” would be distinguished from data protection “by default”. Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers’ and processors’ duties, on supervisory authorities and on sanctions would be introduced....... Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on the interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions...... of direct relevance for the project and Work Package 5 will be analysed here....

  9. Semantic interoperability in a heterogeneous smart lighting system

    Bhardwaj, S.; Ozcelebi, T.; Lukkien, J.J.; Verhoeven, R.

    2010-01-01

    Smart spaces typically consist of collaborating heterogeneous nodes with various resource capacities, e.g. processing, memory, storage, energy capabilities. Low capacity nodes can operate using very simple protocols, allowing them to provide and consume only simple services, as opposed to devices

  10. Assessment of conformity and suitability for use of the energy subsystem of interoperable high-speed lines; Konformitaetsbewertung und EG-Pruefverfahren fuer das Teilsystem Energie

    Behrends, D. [Steglitzer Damm, Deutschen Bahn, Berlin (Germany); Brodkorb, A.; Matthes, R. [Siemens AG, Erlangen (Germany)

    2003-05-01

    Directive 96/48/EC governs the assessment of conformity of interoperability constituents and the EC verification procedure for subsystems of the trans-European high-speed rail system. The assessment of the overhead contact line type SICAT H 1.0 and the verification of the energy subsystem of the new high-speed line from the Belgian border to Rotterdam and Hoofddop (HSL Zuid) in the Netherlands serve as examples of the application of the technical specification for the energy subsystem. (orig.) [German] Die Richtlinie 96/48/EG regelt die Bewertung der Konformitaet von Komponenten und das EG-Pruefverfahren von Teilsystemen des interoperablen transeuropaeischen Hochgeschwindigkeitsbahnsystems. Beispiele fuer diese beiden Vorgaenge sind die Bewertung der Oberleitungsbauart SICAT H 1.0 und die Pruefung der Energieversorgung der Hochgeschwindigkeitsstrecke belgische Grenze-Rotterdam-Hoofddop (HSL Zuid) in den Niederlanden. (orig.)

  11. How to ensure sustainable interoperability in heterogeneous distributed systems through architectural approach.

    Pape-Haugaard, Louise; Frank, Lars

    2011-01-01

    A major obstacle to ensuring ubiquitous information in eHealth is the use of heterogeneous systems. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without losing the characteristic features of traditional sustainable databases. The approach is first to review traditional architectures for central and homogeneous distributed database computing, then to present a possible architectural framework for achieving sustainability across disparate systems, i.e. heterogeneous databases, and to conclude with a discussion. It is shown that, by using relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency that is essential for sustainable interoperability.
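
    One common way to realize relaxed ACID properties across services is to split a global update into local commits with compensating actions, so the system converges to a consistent state rather than holding a distributed lock. The sketch below is a generic compensation pattern under that assumption, not the paper's specific architecture; the two dicts stand in for heterogeneous databases.

```python
# Two "databases" standing in for heterogeneous eHealth systems.
ehr_db = {"patient1": {"allergy": None}}
lab_db = {"patient1": {"flagged": False}}

def update_both(patient, allergy, lab_should_fail=False):
    """Cross-system update as local steps with a compensation on failure."""
    ehr_db[patient]["allergy"] = allergy          # step 1: local commit
    try:
        if lab_should_fail:
            raise RuntimeError("lab system unavailable")
        lab_db[patient]["flagged"] = True         # step 2: local commit
    except RuntimeError:
        ehr_db[patient]["allergy"] = None         # compensate step 1
        return False
    return True

ok = update_both("patient1", "penicillin", lab_should_fail=True)
# After the failed run, both systems are back in a mutually consistent state.
consistent = ehr_db["patient1"]["allergy"] is None and not lab_db["patient1"]["flagged"]
```

    Between step 1 and the compensation the global state is temporarily inconsistent, which is exactly the relaxation of isolation that the relaxed-ACID approach accepts in exchange for autonomy of the subsystems.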

  12. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Brígida Teixeira

    2018-02-01

    Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm arising from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems capable of handling the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that enables interoperability between heterogeneous energy and power simulation systems through the use of ontologies, allowing the simulation of highly complex scenarios through the cooperation of the individual capacities of each system. A case study based on real data demonstrates the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid on a real university campus, from the perspective of the network manager as well as of its consumers/producers, in a projection for a typical winter day in 2050.
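
    Ontology-mediated interoperability between simulators can be sketched as translating each tool's local vocabulary through a shared concept layer. The tool names and term mappings below are invented for illustration; TOOCC uses full ontologies rather than a flat lookup table.

```python
# Each tool's local term mapped to a shared (ontology) concept.
to_shared = {
    "marketSim": {"precoEnergia": "EnergyPrice", "carga": "Load"},
    "gridSim":   {"price_kwh": "EnergyPrice", "demand_kw": "Load"},
}
# Inverse maps: shared concept -> each tool's local term.
from_shared = {tool: {v: k for k, v in m.items()} for tool, m in to_shared.items()}

def translate(message, source, target):
    """Map a message's keys: source vocabulary -> shared concept -> target vocabulary."""
    out = {}
    for key, value in message.items():
        concept = to_shared[source][key]
        out[from_shared[target][concept]] = value
    return out

# A market simulator's output becomes a grid simulator's input.
msg = translate({"precoEnergia": 0.12, "carga": 40.0}, "marketSim", "gridSim")
```

    Routing every exchange through the shared concept layer means adding an n-th simulator needs one mapping, not n-1 pairwise translators.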

  13. Modeling and interoperability of heterogeneous genomic big data for integrative processing and querying.

    Masseroli, Marco; Kaitoua, Abdulrahman; Pinoli, Pietro; Ceri, Stefano

    2016-12-01

    While a huge amount of (epi)genomic data of multiple types is becoming available through Next Generation Sequencing (NGS) technologies, the most important emerging problem is the so-called tertiary analysis, concerned with sense making, e.g., discovering how different (epi)genomic regions and their products interact and cooperate with each other. We propose a paradigm shift in tertiary analysis, based on the use of the Genomic Data Model (GDM), a simple data model which links genomic feature data to their associated experimental, biological and clinical metadata. GDM encompasses all the data formats which have been produced for feature extraction from (epi)genomic datasets. We specifically describe the mapping to GDM of the SAM (Sequence Alignment/Map), VCF (Variant Call Format), NARROWPEAK (for called peaks produced by NGS ChIP-seq or DNase-seq methods), and BED (Browser Extensible Data) formats, but GDM also supports all the formats describing experimental datasets (e.g., including copy number variations, DNA somatic mutations, or gene expressions) and annotations (e.g., regarding transcription start sites, genes, enhancers or CpG islands). We downloaded and integrated samples of all the above-mentioned data types and formats from multiple sources. The GDM is able to homogeneously describe semantically heterogeneous data and lays the groundwork for data interoperability, e.g., achieved through the GenoMetric Query Language (GMQL), a high-level, declarative query language for genomic big data. The combined use of the data model and the query language allows comprehensive processing of multiple heterogeneous data, and supports the development of domain-specific data-driven computations and bio-molecular knowledge discovery.
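
    The core GDM idea, region-based feature data paired with free-form metadata per sample, can be sketched as below. Field names and the tiny select/overlap operations are illustrative stand-ins for GMQL's much richer region algebra.

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    regions: list                     # (chrom, start, stop, feature_value) tuples
    metadata: dict = field(default_factory=dict)

s1 = Sample(regions=[("chr1", 100, 200, 7.5), ("chr2", 50, 80, 1.2)],
            metadata={"assay": "ChIP-seq", "cell_line": "K562"})
s2 = Sample(regions=[("chr1", 150, 220, 3.3)],
            metadata={"assay": "DNase-seq", "cell_line": "K562"})

def select(samples, **meta):
    """GMQL-like SELECT: keep samples whose metadata matches all predicates."""
    return [s for s in samples if all(s.metadata.get(k) == v for k, v in meta.items())]

def overlaps(a, b):
    """True if two (chrom, start, stop, ...) regions intersect."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

# Metadata query first, then a genometric (region) condition.
chip = select([s1, s2], assay="ChIP-seq")
hits = [r for r in chip[0].regions if overlaps(r, ("chr1", 150, 220, None))]
```

    Because every format (BED, VCF, peaks, ...) is reduced to the same regions-plus-metadata shape, one query works across heterogeneous sources.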

  14. Requirements for Interoperability in Healthcare Information Systems

    Rita Noumeir

    2012-01-01

    Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). The EHR improves the quality of healthcare by enabling access to all relevant information at the moment of diagnostic decision, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by a discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. The paper also analyzes how Integrating the Healthcare Enterprise (IHE) has succeeded in enabling interoperability, and identifies some neglected aspects that need attention.
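
    Specification-plus-verification can be illustrated in miniature: express an integration profile as required message fields and check conformance before deployment. The field names are hypothetical, not an actual IHE profile or HL7 message definition.

```python
# A toy "integration profile": the fields every exchanged message must carry.
profile = {"patient_id", "order_id", "timestamp"}

def verify(message):
    """Return the set of required fields the message is missing (empty = conformant)."""
    return profile - message.keys()

# A subsystem's message that forgot the order identifier.
missing = verify({"patient_id": "P1", "timestamp": "2012-01-01T10:00"})
```

    Real verification tools check structure, vocabularies, and workflow behavior, but the principle is the same: a testable specification turns "interoperable" from a claim into a checkable property.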

  15. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    Tran, Tam

    2000-01-01

    There is a need for Commercial-off-the-shelf (COTS), Government-off-the-shelf (GOTS) and legacy components to interoperate in a secure distributed computing environment in order to facilitate the development of evolving applications...

  16. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured on all levels of the interoperability pyramid. This includes common agreement on visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the business processes and rules involved, the types and roles of the documents, a commonly understood vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  17. Towards multi-layer interoperability of heterogeneous IoT platforms : the INTER-IoT approach

    Fortino, Giancarlo; Savaglio, Claudio; Palau, Carlos E.; de Puga, Jara Suarez; Ghanza, Maria; Paprzycki, Marcin; Montesinos, Miguel; Liotta, Antonio; Llop, Miguel; Gravina, R.; Palau, C.E.; Manso, M.; Liotta, A.; Fortino, G.

    2018-01-01

    Open interoperability delivers on the promise of enabling vendors and developers to interact and interoperate, without interfering with anyone’s ability to compete by delivering a superior product and experience. In the absence of global IoT standards, the INTER-IoT voluntary approach will support

  18. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Tuominen, Janne; Rasi, Teemu; Mattila, Jouni; Siuko, Mikko; Esque, Salvador; Hamilton, David

    2013-01-01

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. a very high messaging rate with deterministic timing, or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for the Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold, especially in the military field. It lacks a centralized broker, thereby avoiding a single point of failure, and it includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming-language-independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.
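
    The data-centric publish/subscribe model with a reliability QoS knob can be mimicked in-process as below. This is only a sketch of the concept: real DDS (via an OMG-compliant vendor library) adds discovery, a wire protocol, and many more QoS policies, and its API differs from this invented one.

```python
class Topic:
    """A named data channel, loosely analogous to a DDS topic."""

    def __init__(self, name):
        self.name = name
        self.subscribers = []         # (callback, reliable) pairs

    def subscribe(self, callback, reliable=True):
        self.subscribers.append((callback, reliable))

    def publish(self, sample, drop_unreliable=False):
        """Deliver a sample; best-effort readers may be skipped under load."""
        delivered = 0
        for callback, reliable in self.subscribers:
            if drop_unreliable and not reliable:
                continue                  # BEST_EFFORT reader misses this sample
            callback(sample)              # RELIABLE reader always receives it
            delivered += 1
        return delivered

received = []
topic = Topic("manipulator/joint_state")
topic.subscribe(received.append, reliable=True)
topic.subscribe(lambda s: None, reliable=False)

# Simulate congestion: only the reliable subscriber is guaranteed delivery.
n = topic.publish({"joint": 3, "angle": 0.42}, drop_unreliable=True)
```

    The key point of the data-centric model is that publishers and subscribers share only the topic and QoS contract, never direct references to each other, which is what decouples heterogeneous subsystems.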

  20. Heterogeneous but “Standard” Coding Systems for Adverse Events: Issues in Achieving Interoperability between Apples and Oranges

    Richesson, Rachel L.; Fung, Kin Wah; Krischer, Jeffrey P.

    2008-01-01

    Monitoring adverse events (AEs) is an important part of clinical research and a crucial target for data standards. The representation of adverse events themselves requires the use of controlled vocabularies with thousands of needed clinical concepts. Several data standards for adverse events currently exist, each with a strong user base. The structure and features of these current adverse event data standards (including terminologies and classifications) are different, so comparisons and evaluations are not straightforward, nor are strategies for their harmonization. Three different data standards - the Medical Dictionary for Regulatory Activities (MedDRA) and the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) terminologies, and Common Terminology Criteria for Adverse Events (CTCAE) classification - are explored as candidate representations for AEs. This paper describes the structural features of each coding system, their content and relationship to the Unified Medical Language System (UMLS), and unsettled issues for future interoperability of these standards. PMID:18406213
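
    Interoperating across coding systems typically means pivoting through a shared concept, in the spirit of UMLS concept unique identifiers (CUIs). In the sketch below, all codes are hypothetical placeholders, not real MedDRA, SNOMED CT, or CTCAE identifiers, and a real crosswalk must also handle the granularity mismatches the paper discusses.

```python
# Shared concept -> code in each terminology (placeholder codes only).
crosswalk = {
    "CUI-0001": {"meddra": "MDR-AAA", "snomed": "SCT-111", "ctcae": "CTC-01"},
    "CUI-0002": {"meddra": "MDR-BBB", "snomed": "SCT-222", "ctcae": "CTC-02"},
}

def map_code(code, source, target):
    """Translate a code between systems via the shared concept, or None if unmapped."""
    for systems in crosswalk.values():
        if systems.get(source) == code:
            return systems.get(target)
    return None

# An adverse event coded in MedDRA, re-expressed for CTCAE-based grading.
ctcae_code = map_code("MDR-AAA", "meddra", "ctcae")
```

    When two systems carve up the clinical space differently, the concept-level pivot can only be as good as the curated mappings behind it, which is why the paper treats harmonization as an open issue rather than a lookup problem.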

  1. Grid interoperability: the interoperations cookbook

    Field, L; Schulz, M [CERN (Switzerland)], E-mail: Laurence.Field@cern.ch, E-mail: Markus.Schulz@cern.ch

    2008-07-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.
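
    Bridging middleware differences is, at its core, an adapter problem: one common submission interface, one adapter per grid. The middleware names and call shapes below are invented for illustration; they do not correspond to any actual grid middleware API.

```python
class GridAdapter:
    """Common interface a virtual organization codes against."""

    def submit(self, job):
        raise NotImplementedError

class GridA(GridAdapter):
    def submit(self, job):
        # Middleware A (hypothetical) returns a flat job-id string.
        return f"gridA-id:{job['name']}"

class GridB(GridAdapter):
    def submit(self, job):
        # Middleware B (hypothetical) returns a structured receipt.
        return {"grid": "B", "job": job["name"], "queued": True}

def submit_anywhere(job, grids):
    """Submit the same job to every grid via its adapter, hiding the differences."""
    return [g.submit(job) for g in grids]

receipts = submit_anywhere({"name": "sim42"}, [GridA(), GridB()])
```

    Standardized interfaces, which the paper argues for, would shrink each adapter toward a thin shim instead of a per-grid reimplementation.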

  3. Intercloud Architecture Framework for Interoperability and Integration

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  4. Interoperability for Enterprise Systems and Applications

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  5. Defining Inter-Cloud Architecture for Interoperability and Integration

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  7. Interoperability Strategic Vision

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  8. Intercloud architecture for interoperability and integration

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  10. On MDA - SOA based Intercloud Interoperability framework

    Tahereh Nodehi

    2013-01-01

    Cloud computing is one of the latest technologies assuring reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. Clouds operated by different service providers working in collaboration can open up many more spaces for innovative scenarios, with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various interoperability scenarios. Model-Driven Architecture (MDA) and Service-Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this paper is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in intercloud computing and the concepts and benefits of MDA and SOA are discussed. The paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications that require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.
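
    The MDA idea behind such a framework can be sketched as one platform-independent model (PIM) transformed into provider-specific deployment configs. The provider field names below are hypothetical, not any real cloud's API; in full MDA the transformations would be driven by metamodels rather than hand-written functions.

```python
# Platform-independent model of a workload.
pim = {"name": "web-tier", "vcpus": 2, "memory_gb": 4}

def to_cloud_a(model):
    """Model-to-config transformation for hypothetical provider A."""
    return {"instance_name": model["name"],
            "cpu_count": model["vcpus"],
            "ram_mib": model["memory_gb"] * 1024}

def to_cloud_b(model):
    """Model-to-config transformation for hypothetical provider B."""
    return {"vm": model["name"],
            "cores": model["vcpus"],
            "mem": f'{model["memory_gb"]}GiB'}

# Migrating a workload between heterogeneous clouds = re-running the
# transformation for the target platform; the PIM itself never changes.
config_a = to_cloud_a(pim)
config_b = to_cloud_b(pim)
```

    This is the dynamic-workload-migration use case in miniature: interoperability lives in the transformations, so adding a provider means adding one transformation, not rewriting the application model.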

  11. Toward an Interoperability Architecture

    Buddenberg, Rex

    2001-01-01

    .... The continued burgeoning of the Internet constitutes an existence proof. But a common networking base is insufficient to reach a goal of cross-system interoperability - the large information system...

  12. Interoperability for electronic ID

    Zygadlo, Zuzanna

    2009-01-01

    Electronic Business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in implementations of those solutions introduce a problem of lack of interoperability in electronic business, which have not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. It i...

  13. Buildings Interoperability Landscape

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected buildings ecosystems of products and services from various manufacturers to flourish, the ICT aspects of the equipment need to integrate and operate simply and reliably. Within the concepts of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  14. Towards technical interoperability in telemedicine.

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  15. Interoperable Communications for Hierarchical Heterogeneous Wireless Networks

    2016-04-01

    International Conference on Advanced Networks and Telecommunications Systems (ANTS), 14-DEC-13, Kattankulathur, India: Husheng Li, Qi Zeng, Lijun Qian. GPS... correlation in space is too large, which implies that the correlation is overestimated. Other methods may be more accurate, faster or less memory... limited, an intelligent mechanism is needed for the information selection and signaling design of the cross-network communication for collaborative

  16. Semantically Interoperable XML Data.

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
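The annotation-checking idea this record describes can be sketched as follows. The concept IDs, tag names, and the in-memory "ontology" below are invented stand-ins for the common data elements and ontology services the paper relies on; a minimal illustration, not the authors' system:

```python
import xml.etree.ElementTree as ET

# Toy "ontology": concept IDs that annotations are allowed to reference.
# In the paper's setting these would come from shared ontologies, not a dict.
KNOWN_CONCEPTS = {"C0024485": "MRI scan", "C0262950": "bone"}

DOC = """
<imageAnnotation>
  <modality concept="C0024485">MR</modality>
  <structure concept="C0262950">femur</structure>
</imageAnnotation>
"""

def semantic_validate(xml_text):
    """Return (ok, problems): every 'concept' attribute must resolve
    to a concept in the shared ontology."""
    problems = []
    for elem in ET.fromstring(xml_text).iter():
        cid = elem.get("concept")
        if cid is not None and cid not in KNOWN_CONCEPTS:
            problems.append((elem.tag, cid))
    return (not problems, problems)

ok, problems = semantic_validate(DOC)
print(ok)  # both annotations resolve to known concepts
```

Syntactic validation against the XML schema would run first; this step only checks that the semantic annotations resolve.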

  17. Semantically Interoperable XML Data

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

  18. Lemnos Interoperable Security Program

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are being designed and operated in the energy sector is undergoing some of the most significant changes in history due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion relating to technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPSec, a widely-used Internet Protocol suite used to define Virtual Private Networks, or 'tunnels', for communicating securely through untrusted public and private networks. The IPSec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise by the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor.
These single vendor solutions may inadvertently lock
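To make the configuration-matching burden concrete, here is a toy sketch of proposal negotiation between two endpoints: a tunnel only comes up if some proposal is supported verbatim by both sides. The parameter names and vendor proposal lists are invented for illustration, not taken from any real device:

```python
# Hypothetical IKE/IPSec proposal lists for two vendors' devices.
# Real devices expose many more knobs (lifetimes, PFS groups, modes);
# this only illustrates the negotiation problem the Lemnos project targets.
vendor_a = [
    {"encr": "aes-256", "hash": "sha256", "dh": "group14"},
    {"encr": "aes-128", "hash": "sha1",   "dh": "group5"},
]
vendor_b = [
    {"encr": "3des",    "hash": "sha1",   "dh": "group2"},
    {"encr": "aes-256", "hash": "sha256", "dh": "group14"},
]

def negotiate(initiator, responder):
    """Return the first initiator proposal the responder also supports,
    or None (no tunnel can be established)."""
    for proposal in initiator:
        if proposal in responder:
            return proposal
    return None

print(negotiate(vendor_a, vendor_b))
# -> {'encr': 'aes-256', 'hash': 'sha256', 'dh': 'group14'}
```

When the intersection is empty, the operator must reconfigure one or both ends by hand, which is exactly the labor-intensive exercise the abstract describes.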

  19. Space power subsystem sizing

    Geis, J.W.

    1992-01-01

    This paper discusses a Space Power Subsystem Sizing program which has been developed by the Aerospace Power Division of Wright Laboratory, Wright-Patterson Air Force Base, Ohio. The Space Power Subsystem Sizing program (SPSS) contains the necessary equations and algorithms to calculate photovoltaic array power performance, including end-of-life (EOL) and beginning-of-life (BOL) specific power (W/kg) and areal power density (W/m²). Additional equations and algorithms are included in the spreadsheet for determining maximum eclipse time as a function of orbital altitude and inclination. SPSS has been used to determine the performance of several candidate power subsystems for both Air Force and SDIO potential applications. Trade-offs have been made between subsystem weight and areal power density (W/m²) as influenced by orbital high-energy particle flux and time in orbit
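The eclipse-time relation the record mentions can be sketched under the standard simplifying assumptions (circular orbit, cylindrical Earth shadow, sun in the orbital plane, i.e. the worst case). This is a generic textbook formula, not SPSS's actual algorithm:

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter [km^3/s^2]
R_EARTH = 6378.137      # Earth's equatorial radius [km]

def max_eclipse_minutes(altitude_km):
    """Worst-case eclipse duration for a circular orbit, assuming a
    cylindrical Earth shadow and the sun in the orbital plane (beta = 0)."""
    r = R_EARTH + altitude_km
    period_s = 2 * math.pi * math.sqrt(r**3 / MU_EARTH)  # orbital period
    half_angle = math.asin(R_EARTH / r)  # shadow arc half-angle [rad]
    return period_s * (half_angle / math.pi) / 60.0

for h in (500, 1000, 35786):
    print(f"{h:6d} km: {max_eclipse_minutes(h):5.1f} min")
```

This reproduces the familiar trend: roughly half an hour of eclipse in LEO, growing toward roughly an hour at geostationary altitude.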

  20. FLTSATCOM interoperability applications

    Woolford, Lynn

    A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  1. Towards an enterprise interoperability framework

    Kotzé, P

    2010-06-01

    Full Text Available This paper presents relevant interoperability approaches and solutions applied to global/international networked (collaborative) enterprises or organisations and conceptualise an enhanced enterprise interoperability framework. The paper covers...

  2. A Theory of Interoperability Failures

    McBeth, Michael S

    2003-01-01

    This paper develops a theory of interoperability failures. Interoperability in this paper refers to the exchange of information and the use of information, once exchanged, between two or more systems...

  3. An Ontological Solution to Support Interoperability in the Textile Industry

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially, in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We will also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.
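The paper's central idea, an application ontology that lets heterogeneous partners agree on shared concepts, can be illustrated with a miniature triple store. The textile classes, properties, and instances below are invented, and a real implementation would use RDF/OWL with shared URIs rather than Python tuples:

```python
# A miniature "application ontology" for a textile enterprise, stored as
# subject-predicate-object triples. All names are illustrative only.
triples = {
    ("Garment",     "subClassOf",  "Product"),
    ("Fabric",      "subClassOf",  "Material"),
    ("hasMaterial", "domain",      "Garment"),
    ("hasMaterial", "range",       "Fabric"),
    ("jacket01",    "type",        "Garment"),
    ("jacket01",    "hasMaterial", "wool"),
    ("wool",        "type",        "Fabric"),
}

def instances_of(cls):
    """Direct instances plus instances of direct subclasses."""
    subs = {s for s, p, o in triples if p == "subClassOf" and o == cls}
    return {s for s, p, o in triples
            if p == "type" and (o == cls or o in subs)}

# A partner asking for "Product" still finds jacket01, even though the
# source system only ever tagged it with the local class "Garment".
print(instances_of("Product"))
```

This is the interoperability gain the paper argues for: shared class hierarchies let one organisation's local terms be resolved against another's queries.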

  4. Virtual Quantum Subsystems

    Zanardi, Paolo

    2001-01-01

    The physical resources available to access and manipulate the degrees of freedom of a quantum system define the set A of operationally relevant observables. The algebraic structure of A selects a preferred tensor product structure, i.e., a partition into subsystems. The notion of compoundness for quantum systems is accordingly relativized. Universal control over virtual subsystems can be achieved by using quantum noncommutative holonomies

  5. XML interoperability standards for seamless communication: An analysis of industry-neutral and domain-specific initiatives

    Chituc, C.M.

    2017-01-01

    Attaining seamless interoperability among heterogeneous communication systems and technologies remains a great challenge in today's networked world. Real-time information exchange among heterogeneous and geographically distributed systems is required to support the execution of complex e-business

  6. Augmenting interoperability across repositories architectural ideas

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  7. Interoperability does matter

    Manfred Goepel

    2006-04-01

    Full Text Available In companies, the historically developed IT systems are mostly application islands. They always produce good results if the system's requirements and surroundings are not changed and as long as a system interface is not needed. With the ever increasing dynamics and globalization of the market, however, these IT islands are certain to collapse. Interoperability (IO) is the bid of the hour, assuming the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA will be examined on the basis of practicability. It will be shown that especially SOA produces a surge of interoperability that could rightly be referred to as an IT evolution.

  8. An Interoperable Cartographic Database

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet.

  9. An Interoperable Cartographic Database

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet. 

  10. Inter-operability

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross border trading between operators. To that effect, the European association EASEE-gas is carrying out standards and procedures, commonly called 'inter-operability'. Set up in 2002, the Association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status on issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers 5 presentations about this topic given at the gas conference

  11. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  12. Flexible Language Interoperability

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems based...

  13. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine.

    King, H Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2010-05-07

    Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.
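The record does not spell out the wire format, so the sketch below shows only a generic pattern for such master-state messages: a fixed binary layout that any conforming endpoint can pack and unpack. The field layout, names, and format string are hypothetical, not the actual Interoperable Telerobotics Protocol:

```python
import struct

# Hypothetical fixed-layout master-state message (NOT the real ITP format):
# sequence number, timestamp, Cartesian position (x, y, z), gripper command.
# "!" = network byte order, so both endpoints agree regardless of platform.
FMT = "!Iddddd"          # uint32 + 5 x float64 = 44 bytes

def pack_state(seq, t, x, y, z, grip):
    return struct.pack(FMT, seq, t, x, y, z, grip)

def unpack_state(buf):
    seq, t, x, y, z, grip = struct.unpack(FMT, buf)
    return {"seq": seq, "t": t, "pos": (x, y, z), "grip": grip}

msg = pack_state(7, 1234.5, 0.10, -0.02, 0.35, 0.8)
print(len(msg), unpack_state(msg)["pos"])
```

A shared, byte-exact layout like this is what lets fourteen independently built master and slave systems interconnect without pairwise integration work.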

  14. Spacecraft Design Thermal Control Subsystem

    Miyake, Robert N.

    2008-01-01

    The Thermal Control Subsystem engineer's task is to maintain the temperature of all spacecraft components, subsystems, and the total flight system within specified limits for all flight modes from launch to end-of-mission. In some cases, specific stability and gradient temperature limits will be imposed on flight system elements. For "normal" flight systems, the Thermal Control Subsystem's mass, power, control, and sensing requirements are below 10% of the total flight system resources. In general the thermal control subsystem engineer is involved in all other flight subsystem designs.

  15. Environmental Control Subsystem Development

    Laidlaw, Jacob; Zelik, Jonathan

    2017-01-01

    Kennedy Space Center's Launch Pad 39B, part of Launch Complex 39, is currently undergoing construction to prepare it for NASA's Space Launch System missions. The Environmental Control Subsystem, which provides the vehicle with an air or nitrogen gas environment, required development of its local and remote display screens. The remote displays, developed by NASA contractors and previous interns, were developed without complete functionality; the remote displays were revised, adding functionality to over 90 displays. For the local displays, multiple test procedures were developed to assess the functionality of the screens, as well as verify requirements. One local display screen was also developed.

  16. Evaluation of Enterprise Architecture Interoperability

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  17. Evolution of magnetic disk subsystems

    Kaneko, Satoru

    1994-06-01

    The higher recording density of magnetic disks realized today has brought larger storage capacity per unit and smaller form factors. If the required access performance per MB is to remain constant, the performance of large subsystems has to be several times better. This article mainly describes the technology for improving the performance of magnetic disk subsystems and the prospects for their future evolution. Also considered are 'crosscall pathing', which makes the data transfer channels more effective; 'disk cache', which improves performance by coupling the subsystem with solid-state memory technology; and 'RAID', which improves the availability and integrity of disk subsystems by organizing multiple disk drives into a single subsystem. It is concluded that, since the performance of the subsystem is dominated by that of the disk cache, maximization of the performance of the disk cache subsystems is very important.
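Since the article concludes that subsystem performance is dominated by the disk cache, a toy simulation makes the point: with a workload that has locality, even a small cache absorbs most accesses. The trace shape, cache size, and the ~0.1 ms / ~15 ms access times are illustrative assumptions, not figures from the article:

```python
import random
from collections import OrderedDict

def simulate_cache(trace, capacity):
    """LRU cache of disk blocks; returns the hit ratio for a block trace."""
    cache, hits = OrderedDict(), 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)        # refresh recency
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)   # evict least recently used
            cache[block] = True
    return hits / len(trace)

# A trace with strong locality: ~90% of accesses go to 10 "hot" blocks.
random.seed(1)
trace = [random.randrange(10) if random.random() < 0.9
         else random.randrange(10, 1000) for _ in range(10_000)]

hit = simulate_cache(trace, capacity=64)
# Effective access time with illustrative figures: ~0.1 ms cache, ~15 ms disk.
print(f"hit ratio {hit:.2f}, effective access {hit*0.1 + (1-hit)*15:.2f} ms")
```

The effective access time is the hit ratio weighted between cache and disk latency, which is why cache behaviour dominates the whole subsystem's performance.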

  18. Regional transmission subsystem planning

    Costa Bortoni, Edson da [Quadrante Softwares Especializados Ltda., Itajuba, MG (Brazil); Bajay, Sergio Valdir; Barros Correia, Paulo de [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica; Santos, Afonso Henriques Moreira; Haddad, Jamil [Escola Federal de Engenharia de Itajuba, MG (Brazil)

    1994-12-31

    This work presents an approach for the planning of transmission systems by employing mixed-integer linear programming to obtain a system optimized for cost and operating characteristics. The voltage loop equations are written in a modified form so that, at the end of the analysis, the model behaves as a DC power flow, with the help of the two Kirchhoff's laws, eliminating the need for interaction with an external power flow program for analysis of line loading. The model considers the occurrence of contingencies, so that the final result is a network robust to the most severe contingencies. This whole technique is adapted to regional electric power transmission subsystems. (author) 9 refs., 4 figs.
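The DC-power-flow core that the record describes can be illustrated on a 3-bus example. The network data are invented, and the mixed-integer expansion decisions of the actual planning model are omitted, leaving only the linear power-flow calculation:

```python
# Minimal DC power flow on a 3-bus network (pure Python, no solver).
# Model: P_i = sum_j B_ij * (theta_i - theta_j), slack bus 1 has theta = 0.
lines = {(1, 2): 10.0, (1, 3): 20.0, (2, 3): 10.0}  # susceptance 1/x [p.u.]
P = {2: -1.0, 3: -0.5}                              # loads at buses 2, 3 [p.u.]

# Reduced nodal susceptance matrix for buses 2 and 3 (slack eliminated)
B22 = lines[(1, 2)] + lines[(2, 3)]
B33 = lines[(1, 3)] + lines[(2, 3)]
B23 = -lines[(2, 3)]

# Solve the 2x2 system [B22 B23; B23 B33] * [th2, th3]^T = [P2, P3]^T
det = B22 * B33 - B23 * B23
theta = {1: 0.0,
         2: (P[2] * B33 - B23 * P[3]) / det,
         3: (B22 * P[3] - B23 * P[2]) / det}

# Branch flows, positive in the direction from -> to
flows = {(i, j): b * (theta[i] - theta[j]) for (i, j), b in lines.items()}
print(flows)
```

Because line loadings fall directly out of the solved angles, no external power flow program is needed to check them, which is the point the abstract makes.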

  19. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    Dassisti , Michele; Panetto , Hervé; Tursi , Angela

    2006-01-01

    International audience; The "Babel tower effect" induced by the heterogeneity of applications available in the operation of enterprises leads to a considerable lack of "exchangeability" and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ma...

  20. Smart Grid Interoperability Maturity Model

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  1. Interoperability and HealthGRID.

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, will have the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, to policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of the combination of grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, building a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 interviews of experts, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). It is now, in the design and development phase of GRID technology in health, the right moment to act with the aim of achieving an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that existing important results of the standardization efforts in this area are not taken up simply because they are not always known.

  2. Managing interoperability and complexity in health systems.

    Bouamrane, M-M; Tao, C; Sarkar, I N

    2015-01-01

    In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis and improve the management of health organisations' resources. However, the vision of fully integrated health information eco-systems, which provide relevant information and useful knowledge at the point-of-care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches are proposed in order to address, harness and resolve some of the many remaining issues towards a greater integration of health information systems and extraction of useful or new knowledge from heterogeneous electronic data repositories.

  3. OGC and Grid Interoperability in enviroGRIDS Project

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and the extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the interoperability of OGC Web services with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give a special attention to issues such as: the relations between computational grid and

  4. Linked data for transaction based enterprise interoperability

    Folmer, E.J.A.; Krukkert, D.

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the '80s, interoperability currently relies heavily on XML-based standards. Although these have had great impact, issues still remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  5. Personalized-Detailed Clinical Model for Data Interoperability Among Clinical Standards

    Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung

    2013-01-01

    Objective: Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperabi...

  6. The interoperability force in the ERP field

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

    Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To this end, ERP systems have first been identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements have been identified from two perspectives: technological and business. The ERP field is evolving from classical ERP as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way of running business, and ERP systems are changing to adapt to the current stream of interoperability.

  7. HETEROGENEOUS INTEGRATION TECHNOLOGY

    2017-08-24

    AFRL-RY-WP-TR-2017-0168, Heterogeneous Integration Technology. Dr. Burhan Bayraktaroglu, Devices for Sensing Branch, Aerospace Components & Subsystems... Final report, September 1, 2016 – May 1, 2017 (in-house)... provide a structure for this review. The history and the current status of integration technologies in each category are examined and product examples are

  8. Spherical subsystem of galactic radiosources

    Gorshkov, A G; Popov, M V [Moskovskij Gosudarstvennyj Univ. (USSR). Gosudarstvennyj Astronomicheskij Inst. ''GAISh'']

    1975-05-01

    The concentration towards the Galactic centre of a statistically complete sample of flat-spectrum radio sources from the Ohio survey has been discovered. Quantitative calculations have shown that the sources form a spherical subsystem close in its parameters to such old Galactic populations as globular clusters and RR Lyrae type stars. The luminosity of an object of the Galactic spherical subsystem equals 10^33 erg/s, the total number of objects being 7000. The existence of such a subsystem explains the anomalously low slope of the lgN-lgS statistics in the high-frequency PKS survey (ν = 2700 MHz) and the Michigan University survey (ν = 8000 MHz), because sources of the Galactic spherical subsystem make up a considerable share of the total number of sources, especially at high frequencies (50% of sources with a flux greater than one flux unit at 8000 MHz). It is very probable that the given subsystem consists of representatives of one of the following classes of objects: (a) thermal sources, i.e. HII regions with T = 10^4 K, N_e = 10^3, l = 1 pc; (b) supermassive black holes with mass M/M☉ ≈ 10^5.

  9. Space power subsystem automation technology

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  10. UGV: security analysis of subsystem control network

    Abbott-McCune, Sam; Kobezak, Philip; Tront, Joseph; Marchany, Randy; Wicks, Al

    2013-05-01

    Unmanned ground vehicles (UGVs) are becoming prolific in the heterogeneous superset of robotic platforms. The sensors which provide odometry, localization, perception, and vehicle diagnostics are fused to give the robotic platform a sense of the environment it is traversing. The automotive industry CAN bus has dominated the industry due to its fault tolerance and a message structure that allows high-priority messages to reach the desired node in a real-time environment. UGVs are being researched and produced at an accelerated rate to perform arduous, repetitive, and dangerous missions that are associated with a military action in a protracted conflict. The technology and applications of the research will inevitably be turned into dual-use platforms to aid civil agencies in the performance of their various operations. Our motivation is the security of the holistic system; however, as subsystems are outsourced in the design, the overall security of the system may be diminished. We will focus on the CAN bus topology and the vulnerabilities introduced in UGVs and recognizable security vulnerabilities that are inherent in the communications architecture. We will show how data can be extracted from an add-on CAN bus that can be customized to monitor subsystems. The information can be altered or spoofed to force the vehicle to exhibit unwanted actions or render the UGV unusable for the designed mission. The military relies heavily on technology to maintain information dominance, and the security of the information introduced onto the network by UGVs must be safeguarded from vulnerabilities that can be exploited.
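The add-on monitoring described in this abstract ultimately comes down to decoding raw CAN frames. A minimal sketch, using the Linux SocketCAN wire layout for a classic CAN 2.0 frame (the captured frame below is hypothetical), shows how an arbitration ID and payload are recovered; the ID doubles as the bus priority, lower IDs winning arbitration:

```python
import struct

# Linux SocketCAN wire format for a classic CAN frame:
# 32-bit arbitration ID, 8-bit data length code (DLC),
# 3 pad bytes, then a fixed 8-byte data field.
CAN_FRAME = struct.Struct("<IB3x8s")

def parse_can_frame(raw: bytes):
    """Decode one raw CAN frame into (arbitration_id, payload).

    The arbitration ID determines bus priority, which is how the
    CAN bus gives high-priority messages real-time precedence.
    """
    can_id, dlc, data = CAN_FRAME.unpack(raw)
    return can_id & 0x1FFFFFFF, data[:dlc]  # mask off flag bits

# Hypothetical captured frame: ID 0x18F, 4 data bytes.
raw = CAN_FRAME.pack(0x18F, 4,
                     bytes([0xDE, 0xAD, 0xBE, 0xEF]) + b"\x00" * 4)
arb_id, payload = parse_can_frame(raw)
print(hex(arb_id), payload.hex())
```

Because frames carry no authentication, the same pack/unpack path a monitor uses is exactly what a spoofing attack uses, which is the vulnerability the paper highlights.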

  11. Standards to open and interoperable digital libraries

    Luís Fernando Sayão

    2007-12-01

    Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability as the way to accomplish data exchange and service collaboration requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework of an open and fully interoperable digital library.

  12. A Guide to Understanding Emerging Interoperability Technologies

    Bollinger, Terry

    2000-01-01

    .... Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem...

  13. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  14. RuleML-Based Learning Object Interoperability on the Semantic Web

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  15. The GEOSS solution for enabling data interoperability and integrative research.

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on such an ambitious objective. This manuscript presents and combines the studies and the experiences carried out by three relevant projects, focusing on the heavy metal domain: Global Mercury Observation System, Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work recognized a valuable interoperability service bus (i.e., a set of standards, models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observation community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  16. Block storage subsystem performance analysis

    CERN. Geneva

    2016-01-01

    You feel that your service is slow because of the storage subsystem? But there are too many abstraction layers between your software and the raw block device for you to debug the whole pile... Let's dive onto the platters and check out how the block storage sees your I/Os! We can even figure out what those patterns mean.

  17. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have discussed that there are four levels of data interoperability issues: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches addressing schematic and semantic interoperability issues of geodata have been studied extensively in the last decade. There are different types (e.g. top-level ontologies, domain ontologies and application ontologies) and display forms (e.g. glossaries, thesauri, conceptual schemas and logical theories). Many geodata providers are maintaining their identified local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies, thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Compared to the context-independent nature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location
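The semantic mediation this record describes can be reduced, in its simplest form, to an explicit mapping table from each provider's local ontology terms to a common-ontology term, applied when data crosses the source boundary. The following sketch uses entirely hypothetical provider and term names:

```python
# Hypothetical local-to-common term mappings for two geodata
# providers whose local ontologies name the same concept
# differently.
MAPPINGS = {
    "providerA": {"limestone_fm": "common:LimestoneFormation"},
    "providerB": {"calcaire": "common:LimestoneFormation"},
}

def mediate(provider: str, term: str) -> str:
    """Translate a provider's local ontology term into the common
    ontology; unmapped terms are flagged rather than guessed, since
    silent guessing is exactly what creates semantic heterogeneity."""
    mapping = MAPPINGS.get(provider, {})
    return mapping.get(term, f"unmapped:{provider}/{term}")

print(mediate("providerA", "limestone_fm"))  # common:LimestoneFormation
print(mediate("providerB", "calcaire"))      # common:LimestoneFormation
```

Real mediation systems replace the dictionary with reasoning over ontology alignments, but the boundary-translation pattern is the same.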

  18. Interoperability of Web Archives and Digital Libraries

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the interoperability challenge, there is a lack of empirical work on the overall situation of actual challenges. We conduct

  19. Model for Trans-sector Digital Interoperability

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  20. Model for Trans-sector Digital Interoperability

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  1. Model for Trans-sector Digital Interoperability

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  2. Innovation in OGC: The Interoperability Program

    George Percivall

    2015-10-01

    Full Text Available The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  3. A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components

    2005-05-01

    have access control policy This paper proposes that access control patterns (in that is defined by privacy and confidentiality legislation the form of...2003, Prentice Hall, Upper Saddle River, New Jersey 07458 [5] Dhbingra, V., "Business-to-Business Ecommerce ," http://proiects.bus.lsu.edu/independent

  4. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    Puett, Joseph

    2003-01-01

    ...; however, this research focuses on establishing a holistic approach over the entire development effort where unrealized synergies and dependencies between all of the tools' artifacts can be visualized...

  5. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed through multiple heterogeneous and autonomous systems. Access to, and sharing of, distributed information sources are a challenging task. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from Relational Databases to the Web Ontology Language. The proposed service makes use of an algorithm that makes it possible to transform several data models of different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
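A deliberately simplified sketch of the kind of transformation this record describes (not the paper's algorithm; table, column and namespace names are hypothetical): each relational table becomes an OWL class and each column a datatype property, emitted here as Turtle with string ranges only and no key or inheritance handling:

```python
def table_to_owl(table: str, columns: list[str]) -> str:
    """Map one relational table to an OWL class with one datatype
    property per column, serialized as Turtle. Simplified sketch:
    no primary/foreign keys, no inheritance rules, xsd:string only."""
    base = "http://example.org/onto#"  # hypothetical namespace
    lines = [
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
        "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .",
        f"<{base}{table}> a owl:Class .",
    ]
    for col in columns:
        lines += [
            f"<{base}{table}.{col}> a owl:DatatypeProperty ;",
            f"    rdfs:domain <{base}{table}> ;",
            "    rdfs:range xsd:string .",
        ]
    return "\n".join(lines)

print(table_to_owl("Patient", ["name", "birthDate"]))
```

The real service additionally maps relational constraints (keys, foreign keys, type information) onto OWL constructs, which is where the inheritance rules mentioned in the abstract come in.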

  6. MITS Data Acquisition Subsystem Acceptance Test procedure

    Allison, R.

    1980-01-01

    This is an acceptance procedure for the Data Acquisition Subsystem of the Machine Interface Test System (MITS). Prerequisites, requirements, and detailed step-by-step instructions are presented for inspecting and performance-testing the subsystem.

  7. The Phenix Detector magnet subsystem

    Yamamoto, R.M.; Bowers, J.M.; Harvey, A.R.

    1995-01-01

    The PHENIX [Photon Electron New Heavy Ion Experiment] Detector is one of two large detectors presently under construction for RHIC (Relativistic Heavy Ion Collider) located at Brookhaven National Laboratory. Its primary goal is to detect a new phase of matter; the quark-gluon plasma. In order to achieve this objective, the PHENIX Detector utilizes a complex magnet subsystem which is comprised of two large magnets identified as the Central Magnet (CM) and the Muon Magnet (MM). Muon Identifier steel is also included as part of this package. The entire magnet subsystem stands over 10 meters tall and weighs in excess of 1900 tons (see Fig. 1). Magnet size alone provided many technical challenges throughout the design and fabrication of the project. In addition, interaction with foreign collaborators provided the authors with new areas to address and problems to solve. Russian collaborators would fabricate a large fraction of the steel required and Japanese collaborators would supply the first coil. This paper will describe the overall design of the PHENIX magnet subsystem and discuss its present fabrication status

  8. The Phenix Detector magnet subsystem

    Yamamoto, R.M.; Bowers, J.M.; Harvey, A.R. [Lawrence Livermore National Lab., CA (United States)] [and others]

    1995-05-19

    The PHENIX [Photon Electron New Heavy Ion Experiment] Detector is one of two large detectors presently under construction for RHIC (Relativistic Heavy Ion Collider) located at Brookhaven National Laboratory. Its primary goal is to detect a new phase of matter; the quark-gluon plasma. In order to achieve this objective, the PHENIX Detector utilizes a complex magnet subsystem which is comprised of two large magnets identified as the Central Magnet (CM) and the Muon Magnet (MM). Muon Identifier steel is also included as part of this package. The entire magnet subsystem stands over 10 meters tall and weighs in excess of 1900 tons (see Fig. 1). Magnet size alone provided many technical challenges throughout the design and fabrication of the project. In addition, interaction with foreign collaborators provided the authors with new areas to address and problems to solve. Russian collaborators would fabricate a large fraction of the steel required and Japanese collaborators would supply the first coil. This paper will describe the overall design of the PHENIX magnet subsystem and discuss its present fabrication status.

  9. [Financing, organization, costs and services performance of the Argentinean health sub-systems.

    Yavich, Natalia; Báscolo, Ernesto Pablo; Haggerty, Jeannie

    2016-01-01

    To analyze the relationship between health system financing and services organization models with costs and health services performance in each of Rosario's health sub-systems. The financing and organization models were characterized using secondary data. Costs were calculated using the WHO/SHA methodology. Healthcare quality was measured by a household survey (n=822). Public subsystem: Vertically integrated funding and primary healthcare as a leading strategy to provide services produced low costs and individual-oriented healthcare, but with weak accessibility conditions and comprehensiveness. Private subsystem: Contractual integration and weak regulatory and coordination mechanisms produced effects opposed to those of the public sub-system. Social security: Contractual integration and strong regulatory and coordination mechanisms contributed to intermediate costs and overall high performance. Each subsystem's financing and services organization model had a strong and heterogeneous influence on costs and health services performance.

  10. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability among current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in a cross-cloud environment, in which cloud customers can choose different providers' services and in which resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into the same domain and assigns a trust agent to it. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for each. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in a cross-cloud environment.
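The core computation in a domain-based trust model like the one this record describes can be sketched as a convex combination of a customer's direct experience and recommendations relayed by trust agents in other domains. The weighting and update rule below are illustrative assumptions, not the paper's exact formulas:

```python
def combined_trust(direct: float, recommendations: list[float],
                   alpha: float = 0.7) -> float:
    """Blend a customer's own experience with recommended trust.

    alpha weights direct observation over hearsay; the recommendation
    component is the mean of values reported by trust agents in other
    domains. All trust values live in [0, 1]. With no recommendations
    available, direct experience stands alone.
    """
    rec = sum(recommendations) / len(recommendations) if recommendations else direct
    value = alpha * direct + (1 - alpha) * rec
    return max(0.0, min(1.0, value))

# A provider we rate 0.9 ourselves, with mixed cross-domain reports.
print(combined_trust(0.9, [0.6, 0.8]))  # 0.7*0.9 + 0.3*0.7 = 0.84
```

Treating the recommendation lookup itself as a cloud service, as the model proposes, means the `recommendations` list would be fetched from trust agents rather than passed in locally.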

  11. Connectivity, interoperability and manageability challenges in internet of things

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is about interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects like fridges, cars, utilities, buildings and cities for enhancing the lives of people through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers to embark upon. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and management in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployment. The paper finally concludes that IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be safely and efficiently deployed.

  12. Assessment of Collaboration and Interoperability in an Information Management System to Support Bioscience Research

    Myneni, Sahiti; Patel, Vimla L.

    2009-01-01

    Biomedical researchers often have to work on massive, detailed, and heterogeneous datasets that raise new challenges of information management. This study reports an investigation into the nature of the problems faced by the researchers in two bioscience test laboratories when dealing with their data management applications. Data were collected using ethnographic observations, questionnaires, and semi-structured interviews. The major problems identified in working with these systems were related to data organization, publications, and collaboration. The interoperability standards were analyzed using a C4I framework at the level of connection, communication, consolidation, and collaboration. Such an analysis was found to be useful in judging the capabilities of data management systems at different levels of technological competency. While collaboration and system interoperability are the “must have” attributes of these biomedical scientific laboratory information management applications, usability and human interoperability are the other design concerns that must also be addressed for easy use and implementation. PMID:20351900

  13. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as the ones defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web and specific community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs.
CKAN: a data management system for data distribution, particularly used by

  14. Impact of coalition interoperability on PKI

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  15. Risk Management Considerations for Interoperable Acquisition

    Meyers, B. C

    2006-01-01

    .... The state of risk management practice -- the specification of standards and the methodologies to implement them -- is addressed and examined with respect to the needs of system-of-systems interoperability...

  16. Interoperability for Entreprise Systems and Applications '12

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  17. Epimenides: Interoperability Reasoning for Digital Preservation

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

    This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can model also converters and emulators, and the adopted modelling approach enables the automatic reasoning

  18. River Basin Standards Interoperability Pilot

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There are a lot of water information resources and tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as the OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU funded ICT models, tools, protocols and policy briefs related to water and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps:
    - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice).
    - Modelling floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
    - Evaluation of the applicability of Sensor Notification Services in water emergencies.
    - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS).
The architecture
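The first step of the experiment, pulling river gauge observations over SOS, amounts to a standard key-value-pair request. As a minimal sketch (the endpoint and offering names below are hypothetical, not actual Scheldt-basin services), an SOS 2.0 GetObservation request asking for WaterML 2.0 output can be built with the standard library alone:

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint: str, offering: str,
                            observed_property: str) -> str:
    """Build an OGC SOS 2.0 KVP GetObservation request asking for
    WaterML 2.0 encoded observations from one river-gauge offering."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        # WaterML 2.0 as the response encoding, per the SOS 2.0
        # Hydrology Profile mentioned in the experiment.
        "responseFormat": "http://www.opengis.net/waterml/2.0",
    }
    return f"{endpoint}?{urlencode(params)}"

# Hypothetical gauge service and offering identifiers.
url = sos_get_observation_url("https://example.org/sos",
                              "scheldt-gauge-42", "waterLevel")
print(url)
```

The WaterML 2.0 document returned by such a request is what the WPS flood-modelling step would then consume as input.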

  19. Data Modeling Challenges of Advanced Interoperability.

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  20. Investigation of Automated Terminal Interoperability Test

    Brammer, Niklas

    2008-01-01

    In order to develop and secure the functionality of its cellular communications systems, Ericsson deals with numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Therefore Ericsson co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability developmental testing (IoDT) laboratories in Linköping to test their developmental products and prototypes in o...

  1. Grid interoperability: joining grid information systems

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as being 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which are using slightly different middleware. Grid interoperation is trying to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group is trying to build upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.
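    The kind of translation such interoperation requires can be caricatured in a few lines: records published in one information system's schema are mapped onto a common schema via an explicit field mapping. All field names here are invented for illustration; real systems (e.g. GLUE-based ones) use richer schemas.

```python
def translate_record(record, mapping):
    """Translate a resource record from one information-system schema
    to another using an explicit field mapping; unmapped fields are dropped."""
    return {dst: record[src] for src, dst in mapping.items() if src in record}

# Mapping from a hypothetical "grid A" schema to a hypothetical common schema.
A_TO_COMMON = {
    "SiteName": "site",
    "CEHostName": "compute_endpoint",
    "FreeCPUs": "free_slots",
}

record_a = {"SiteName": "EXAMPLE-SITE", "CEHostName": "ce.example.org",
            "FreeCPUs": 128, "LocalOnlyField": "ignored"}
common = translate_record(record_a, A_TO_COMMON)
# common == {"site": "EXAMPLE-SITE", "compute_endpoint": "ce.example.org",
#            "free_slots": 128}
```

    The hard part in practice is agreeing on the common schema, which is exactly what the standardization efforts mentioned above target.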

  2. Maturity Model for Advancing Smart Grid Interoperability

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  3. Towards Interoperable Preservation Repositories: TIPR

    Priscilla Caplan

    2010-07-01

    Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  4. A web services choreography scenario for interoperating bioinformatics applications

    Cheung David W

    2004-03-01

    Background: Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: (1) the platforms on which the applications run are heterogeneous, (2) their web interface is not machine-friendly, (3) they use a non-standard format for data input and output, (4) they do not exploit standards to define application interface and message exchange, and (5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results: To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, Collaxa BPEL Server and Taverna Workbench.
The Java program functions as a web services engine and interoperates
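    The contrast with HTML output can be made concrete: a document-style message simply wraps the inputs in a machine-parseable XML payload. A minimal sketch (element names are invented for illustration, not the actual HAPI service schema):

```python
import xml.etree.ElementTree as ET

def build_spot_request(spot_ids):
    """Wrap a list of microarray spot IDs in a document-style XML payload
    that a web services engine could route to the analysis service."""
    root = ET.Element("SpotRequest")          # hypothetical element name
    for sid in spot_ids:
        ET.SubElement(root, "SpotID").text = sid
    return ET.tostring(root, encoding="unicode")

msg = build_spot_request(["H200000123", "H200000456"])
```

    Unlike an HTML page, such a document can be validated against a schema and interpreted unambiguously by the next service in the workflow.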

  5. Cassini Mission Sequence Subsystem (MSS)

    Alland, Robert

    2011-01-01

    This paper describes my work with the Cassini Mission Sequence Subsystem (MSS) team during the summer of 2011. It gives some background on the motivation for this project and describes the expected benefit to the Cassini program. It then introduces the two tasks that I worked on - an automatic system auditing tool and a series of corrections to the Cassini Sequence Generator (SEQ_GEN) - and the specific objectives these tasks were to accomplish. Next, it details the approach I took to meet these objectives and the results of this approach, followed by a discussion of how the outcome of the project compares with my initial expectations. The paper concludes with a summary of my experience working on this project, lists what the next steps are, and acknowledges the help of my Cassini colleagues.

  6. Operationally Responsive Spacecraft Subsystem, Phase I

    National Aeronautics and Space Administration — Saber Astronautics proposes spacecraft subsystem control software which can autonomously reconfigure avionics for best performance during various mission conditions....

  7. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  8. Interoperability of Heliophysics Virtual Observatories

    Thieman, J.; Roberts, A.; King, T.; King, J.; Harvey, C.

    2008-01-01

    If you'd like to find interrelated heliophysics (also known as space and solar physics) data for a research project that spans, for example, magnetic field data and charged particle data from multiple satellites located near a given place and at approximately the same time, how easy is this to do? There are probably hundreds of data sets scattered in archives around the world that might be relevant. Is there an optimal way to search these archives and find what you want? There are a number of virtual observatories (VOs) now in existence that maintain knowledge of the data available in subdisciplines of heliophysics. The data may be widely scattered among various data centers, but the VOs have knowledge of what is available and how to get to it. The problem is that research projects might require data from a number of subdisciplines. Is there a way to search multiple VOs at once and obtain what is needed quickly? To do this requires a common way of describing the data such that a search using a common term will find all data that relate to the common term. This common language is contained within a data model developed for all of heliophysics and known as the SPASE (Space Physics Archive Search and Extract) Data Model. NASA has funded the main part of the development of SPASE but other groups have put resources into it as well. How well is this working? We will review the use of SPASE and how well the goal of locating and retrieving data within the heliophysics community is being achieved. Can the VOs truly be made interoperable despite being developed by so many diverse groups?
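    The benefit of a shared data model like SPASE can be caricatured in a few lines: a single query term works across every virtual observatory only because all of them describe their holdings with the same vocabulary. Registry contents and key names below are invented, not actual SPASE terms.

```python
# Each VO registry describes its products with the same shared-model keys.
vo_registries = {
    "VO-A": [{"id": "ds1", "MeasurementType": "MagneticField"},
             {"id": "ds2", "MeasurementType": "EnergeticParticles"}],
    "VO-B": [{"id": "ds3", "MeasurementType": "MagneticField"}],
}

def search_all(registries, key, value):
    """Search every registry with one common term; this works only because
    all registries annotate their data with the same shared vocabulary."""
    return [(vo, rec["id"])
            for vo, recs in registries.items()
            for rec in recs if rec.get(key) == value]

hits = search_all(vo_registries, "MeasurementType", "MagneticField")
# hits == [("VO-A", "ds1"), ("VO-B", "ds3")]
```

    Without the common vocabulary, each registry would need its own query translation, which is precisely the situation the SPASE Data Model is meant to avoid.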

  9. National Ignition Facility subsystem design requirements target positioning subsystem SSDR 1.8.2

    Pittenger, L.

    1996-01-01

    This Subsystem Design Requirement document is a development specification that establishes the performance, design, development and test requirements for the target positioner subsystem (WBS 1.8.2) of the NIF Target Experimental System (WBS 1.8)

  10. Data and Mined-Knowledge Interoperability in eHealth Systems

    Sartipi, Kamran; Najafi, Mehran; Kazemzadeh, Reza S.

    2008-01-01

    Current healthcare infrastructures in the advanced societies can not fulfil the demands for quality public health services which are characterized by patient-centric, seamless interoperation of heterogeneous healthcare systems, and nation-wide electronic health record services. Consequently, the governments and healthcare institutions are embracing new information and communication technologies to provide the necessary infrastructures for healthcare and medical services. In this chapter, we a...

  11. A step-by-step methodology for enterprise interoperability projects

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, like business, process, human resources, technology, knowledge and semantics.

  12. IHE based interoperability - benefits and challenges.

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor; however, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should be given the possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions; an integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  13. Information Subsystem of Shadow Economy Deactivation

    Filippova, Tatyana V.

    2015-01-01

    The article presents information subsystem of shadow economy deactivation aimed at minimizing negative effects caused by its reproduction. In Russia, as well as in other countries, efficient implementation of the suggested system of shadow economy deactivation can be ensured by the developed information subsystem.

  14. Installation package for the Solaron solar subsystem

    1979-01-01

    Information that is intended to be a guide for installation, operation, and maintenance of the various solar subsystems is presented. The subsystems consist of the following: collectors, storage, transport (air handler) and controller for heat pump and peak storage. Two prototype residential systems were installed at Akron, Ohio, and Duffield, Virginia.

  15. Private quantum subsystems and quasiorthogonal operator algebras

    Levick, Jeremy; Kribs, David W; Pereira, Rajesh; Jochym-O’Connor, Tomas; Laflamme, Raymond

    2016-01-01

    We generalize a recently discovered example of a private quantum subsystem to find private subsystems for Abelian subgroups of the n-qubit Pauli group, which exist in the absence of private subspaces. In doing so, we also connect these quantum privacy investigations with the theory of quasiorthogonal operator algebras through the use of tools from group theory and operator theory. (paper)

  16. Response of subsystems on inelastic structures

    Lin, J.; Mahin, S.A.

    1984-01-01

    Preliminary analyses are performed to obtain insight into the seismic response of subsystems supported on simple structures that yield during severe earthquake ground motions. Current design recommendations for subsystems that account for yielding of the supporting structures are assessed and found to be unconservative. An amplification factor is defined to quantify the effects of inelastic deformations of the supporting structure on subsystem response. Design guidelines are formulated for predicting the amplification factor based on statistical evaluation of the results generated for ten earthquake ground motions. Using these values, design floor response spectra can be obtained from conventional linear elastic floor response spectra, accounting for yielding of the supporting structure without having to perform inelastic analysis. The effects of non-zero subsystem mass are examined. The recommended amplification factors are found to be applicable even when the mass of the subsystem approaches that of the supporting structure.
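    The shortcut described above is just a pointwise scaling of the elastic floor spectrum. A minimal numeric sketch (spectral values and amplification factors are invented for illustration, not the paper's recommended values):

```python
def design_floor_spectrum(elastic_spectrum, amplification):
    """Scale a linear-elastic floor response spectrum by period-dependent
    amplification factors to account for yielding of the supporting
    structure, avoiding a full inelastic analysis."""
    return [a * s for a, s in zip(amplification, elastic_spectrum)]

# Hypothetical spectral accelerations (in g) at a few subsystem periods,
# and purely illustrative amplification factors.
elastic = [0.8, 1.5, 1.2, 0.6]
factors = [1.4, 1.6, 1.3, 1.1]
design = design_floor_spectrum(elastic, factors)
```

    The statistical work in the paper lies in choosing the factors; applying them, as shown, is trivial.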

  17. Efficient chaotic based satellite power supply subsystem

    Ramos Turci, Luiz Felipe; Macau, Elbert E.N.; Yoneyama, Takashi

    2009-01-01

    In this work, we investigate the use of dynamical system theory to increase the efficiency of satellite power supply subsystems. The core of a satellite power subsystem is its DC/DC converter. This is a highly nonlinear system that presents a multitude of phenomena, ranging from bifurcations and quasi-periodicity to chaos and coexistence of attractors, among others. Traditional power subsystem design techniques try to avoid these nonlinear phenomena so that linear system theory can be used in small regions about the equilibrium points. Here, we show that more efficiency can be drawn from a power supply subsystem if the DC/DC converter operates in regions of high nonlinearity. In particular, if it operates in a chaotic regime, it has an intrinsic sensitivity that can be exploited to efficiently drive the power subsystem over wide ranges of power requests by using control-of-chaos techniques.

  18. Efficient chaotic based satellite power supply subsystem

    Ramos Turci, Luiz Felipe [Technological Institute of Aeronautics (ITA), Sao Jose dos Campos, SP (Brazil)], E-mail: felipeturci@yahoo.com.br; Macau, Elbert E.N. [National Institute of Space Research (Inpe), Sao Jose dos Campos, SP (Brazil)], E-mail: elbert@lac.inpe.br; Yoneyama, Takashi [Technological Institute of Aeronautics (ITA), Sao Jose dos Campos, SP (Brazil)], E-mail: takashi@ita.br

    2009-10-15

    In this work, we investigate the use of dynamical system theory to increase the efficiency of satellite power supply subsystems. The core of a satellite power subsystem is its DC/DC converter. This is a highly nonlinear system that presents a multitude of phenomena, ranging from bifurcations and quasi-periodicity to chaos and coexistence of attractors, among others. Traditional power subsystem design techniques try to avoid these nonlinear phenomena so that linear system theory can be used in small regions about the equilibrium points. Here, we show that more efficiency can be drawn from a power supply subsystem if the DC/DC converter operates in regions of high nonlinearity. In particular, if it operates in a chaotic regime, it has an intrinsic sensitivity that can be exploited to efficiently drive the power subsystem over wide ranges of power requests by using control-of-chaos techniques.

  19. ECCS Operability With One or More Subsystem(s) Inoperable

    Swantner, Stephen R.; Andrachek, James D.

    2002-01-01

    equivalent to a single Operable ECCS train exists with those components out of service. This evaluation ensures that the safety analysis assumption associated with one train of the emergency core cooling system (ECCS) is still preserved by various combinations of components in opposite trains. An ECCS train is inoperable if it is not capable of delivering design flow to the reactor coolant system (RCS). Individual components are inoperable if they are not capable of performing their design function or if support systems are not available. Due to the redundancy of trains and the diversity of subsystems, the inoperability of one component in a train does not render the ECCS incapable of performing its function. Neither does the inoperability of two different components, each in a different train, necessarily result in a loss of function for the ECCS. The intent of Condition A is to maintain a combination of components such that 100% of the ECCS flow equivalent to a single Operable ECCS train remains available. This allows increased flexibility in plant operations under circumstances when components in the required subsystem may be inoperable, but the ECCS remains capable of delivering 100% of the required flow equivalent. This paper presents a methodology for identifying the minimum set of components necessary for 100% of the ECCS flow equivalent to a single Operable ECCS train. An example of the implementation of this methodology is provided for a typical Westinghouse 3-loop ECCS design. (authors)
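    The spirit of the check, asking whether some combination of operable components across both trains still delivers 100% of the flow equivalent, can be sketched in a few lines. The component model below is entirely hypothetical (not the Westinghouse design data): assume one operable pump plus one operable valve, from either train, form a complete injection path.

```python
def full_flow_available(operable):
    """True if a complete pump+valve path can be assembled from the
    operable components, regardless of which train each belongs to."""
    has_pump = any(name.startswith("pump") for name in operable)
    has_valve = any(name.startswith("valve") for name in operable)
    return has_pump and has_valve

# One component out of service in each train, but in different roles:
cross_train = full_flow_available({"pump-B", "valve-A"})   # cross-train path works
# Both pumps out of service: no combination restores full flow.
no_pumps = full_flow_available({"valve-A", "valve-B"})
```

    A realistic implementation would enumerate the actual flow paths of the plant design rather than match component names.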

  20. ACCESS Sub-system Performance

    Kaiser, Mary Elizabeth; Morris, Matthew J.; Aldoroty, Lauren Nicole; Godon, David; Pelton, Russell; McCandliss, Stephan R.; Kurucz, Robert L.; Kruk, Jeffrey W.; Rauscher, Bernard J.; Kimble, Randy A.; Wright, Edward L.; Benford, Dominic J.; Gardner, Jonathan P.; Feldman, Paul D.; Moos, H. Warren; Riess, Adam G.; Bohlin, Ralph; Deustua, Susana E.; Dixon, William Van Dyke; Sahnow, David J.; Lampton, Michael; Perlmutter, Saul

    2016-01-01

    ACCESS: Absolute Color Calibration Experiment for Standard Stars is a series of rocket-borne sub-orbital missions and ground-based experiments designed to leverage significant technological advances in detectors, instruments, and the precision of the fundamental laboratory standards used to calibrate these instruments to enable improvements in the precision of the astrophysical flux scale through the transfer of laboratory absolute detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35 to 1.7 micron bandpass. A cross-wavelength calibration of the astrophysical flux scale to this level of precision over this broad a bandpass is relevant for the data used to probe fundamental astrophysical problems such as the SNeIa photometry-based measurements used to constrain dark energy theories. We will describe the strategy for achieving this level of precision, the payload and calibration configuration, present sub-system test data, and the status and preliminary performance of the integration and test of the spectrograph and telescope. NASA APRA sounding rocket grant NNX14AH48G supports this work.

  1. The Calipso Thermal Control Subsystem

    Gasbarre, Joseph F.; Ousley, Wes; Valentini, Marc; Thomas, Jason; Dejoie, Joel

    2007-01-01

    The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) is a joint NASA-CNES mission to study the Earth's cloud and aerosol layers. The satellite is composed of a primary payload (built by Ball Aerospace) and a spacecraft platform bus (PROTEUS, built by Alcatel Alenia Space). The thermal control subsystem (TCS) for the CALIPSO satellite is a passive design utilizing radiators, multi-layer insulation (MLI) blankets, and both operational and survival surface heaters. The most temperature-sensitive component within the satellite is the laser system. During thermal vacuum testing of the integrated satellite, the laser system's operational heaters were found to be inadequate in maintaining the laser's required set point. In response, a solution utilizing the laser system's survival heaters to augment the operational heaters was developed in collaboration between NASA, CNES, Ball Aerospace, and Alcatel-Alenia. The CALIPSO satellite launched from Vandenberg Air Force Base in California on April 26th, 2006. Evaluation of both the platform and payload thermal control systems shows they are performing as expected and maintaining the critical elements of the satellite within acceptable limits.

  2. Scientific Digital Libraries, Interoperability, and Ontologies

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.

  3. The DFG Viewer for Interoperability in Germany

    Ralf Goebel

    2010-02-01

    This article deals with the DFG Viewer for Interoperability, a free and open-source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background. It introduces the data formats and standards used, briefly illustrates how the viewer works, and includes a few examples.

  4. Benefit quantification of interoperability in coordinate metrology

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software.

  5. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    Khan, ZC

    2013-06-01

    A purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  6. Contribution of Clinical Archetypes, and the Challenges, towards Achieving Semantic Interoperability for EHRs.

    Tapuria, Archana; Kalra, Dipak; Kobayashi, Shinji

    2013-12-01

    The objective is to introduce the 'clinical archetype', which is a formal and agreed way of representing clinical information to ensure interoperability across and within Electronic Health Records (EHRs). The paper also aims at presenting the challenges of building quality-labelled clinical archetypes and the challenges towards achieving semantic interoperability between EHRs. Twenty years of international research, various European healthcare informatics projects and the pioneering work of the openEHR Foundation have led to the following results. The requirements for EHR information architectures have been consolidated within ISO 18308 and adopted within the ISO 13606 EHR interoperability standard. However, a generic EHR architecture cannot ensure that the clinical meaning of information from heterogeneous sources can be reliably interpreted by receiving systems and services. Therefore, clinical models called 'clinical archetypes' are required to formalize the representation of clinical information within the EHR. Part 2 of ISO 13606 defines how archetypes should be formally represented. The current challenge is to grow clinical communities to build a library of clinical archetypes and to identify how evidence of best practice and multi-professional clinical consensus should best be combined to define archetypes at the optimal level of granularity and specificity, and to quality-label them for wide adoption. Standardizing clinical terms within EHRs using a clinical terminology such as the Systematized Nomenclature of Medicine Clinical Terms is also a challenge. Clinical archetypes would play an important role in achieving semantic interoperability within EHRs. Attempts are being made to explore the design and adoption challenges for clinical archetypes.

  7. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Renaud Bérubé

    2010-09-01

    Background: With the deployments of Electronic Health Records (EHRs), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results: In this paper we describe software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions: The EHR is being deployed in several countries. The EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities or to resolve implementation difficulties.

  8. System model the processing of heterogeneous sensory information in robotized complex

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained to assess the relationship between subsystem performance and uneven flows. The obtained solution is studied over the range of parameter values of practical interest.
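    The paper's model is not reproduced here; as a stand-in, the standard M/M/1 mean-wait formula already illustrates why uneven flow intensity matters for such a subsystem (all rates below are invented for illustration):

```python
def mm1_mean_wait(lam, mu):
    """Mean time spent in an M/M/1 queue with arrival rate lam and
    service rate mu: W = 1 / (mu - lam), valid only for lam < mu."""
    if lam >= mu:
        raise ValueError("unstable: arrival rate must stay below service rate")
    return 1.0 / (mu - lam)

mu = 10.0                            # messages/s the subsystem can process
even = [6.0, 6.0, 6.0, 6.0]          # smooth sensor traffic per interval
waits = [mm1_mean_wait(lam, mu) for lam in even]   # 0.25 s in each interval
# A bursty profile with the same average rate, e.g. [2, 2, 2, 18],
# is not even stable in its peak interval (18 > mu), which a model
# based only on the average rate would miss entirely.
```

    This is exactly the effect a model accounting for uneven flow intensity captures and a mean-rate analysis hides.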

  9. Equipping the Enterprise Interoperability Problem Solver

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, both in the public as well as the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  10. Smart Grid Interoperability Maturity Model Beta Version

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  11. An interoperable security framework for connected healthcare

    Asim, M.; Petkovic, M.; Qu, M.; Wang, Changjie

    2011-01-01

    Connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However at the same time it introduces new risks towards security and privacy of personal health

  12. An Interoperable Security Framework for Connected Healthcare

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    Connected and interoperable healthcare system promises to reduce the cost of the healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However at the same time it introduces new risks towards security and privacy of personal health

  13. Data Transport Subsystem - The SFOC glue

    Parr, Stephen J.

    1988-01-01

    The design and operation of the Data Transport Subsystem (DTS) for the JPL Space Flight Operation Center (SFOC) are described. The SFOC is the ground data system under development to serve interplanetary space probes; in addition to the DTS, it comprises a ground interface facility, a telemetry-input subsystem, data monitor and display facilities, and a digital TV system. DTS links the other subsystems via an ISO OSI presentation layer and a LAN. Here, particular attention is given to the DTS services and service modes (virtual circuit, datagram, and broadcast), the DTS software architecture, the logical-name server, the role of the integrated AI library, and SFOC as a distributed system.

  14. Enhancing Data Interoperability with Web Services

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
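
The kind of RESTful service request described above can be sketched as follows. The host and service path are hypothetical placeholders, though the `exportImage` operation and its `bbox`/`size`/`format`/`f` parameters follow the ArcGIS REST API convention for image services.

```python
# Hedged sketch: composing a GET request against an ArcGIS-style image
# service. The endpoint below is an invented placeholder, not a real
# Climate.gov or CPC service URL.
from urllib.parse import urlencode

def export_image_url(base: str, bbox: tuple, size: tuple, fmt: str = "png") -> str:
    """Build a request URL for an ArcGIS-style 'exportImage' operation."""
    params = {
        "bbox": ",".join(str(v) for v in bbox),   # xmin,ymin,xmax,ymax
        "size": f"{size[0]},{size[1]}",           # width,height in pixels
        "format": fmt,                            # output image format
        "f": "json",                              # response wrapper format
    }
    return f"{base}/exportImage?{urlencode(params)}"

url = export_image_url(
    "https://example.org/arcgis/rest/services/cpc_outlook/ImageServer",
    bbox=(-125.0, 24.0, -66.0, 50.0), size=(800, 400))
print(url)
```

Because the request is a plain HTTP GET with documented parameters, any client (browser, script, or GIS desktop tool) can consume the same endpoint, which is the interoperability point the abstract makes.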

  15. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill organizations' specific needs. As such, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify their access. Early in the nineties, the development of interoperability of geographic information was undertaken to solve syntactic, structural, and semantic heterogeneities, as well as spatial and temporal heterogeneities, to facilitate sharing and integration of such data. Recently, we have proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.

  16. Designing learning management system interoperability in semantic web

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has set the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information, and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which currently has not been investigated deeply. Moodle is utilized to design the interoperability. Several database tables of Moodle are enhanced and some features are added. Semantic web interoperability is provided by exploiting an ontology in the content materials. The ontology is further utilized as a searching tool to match users' queries with available courses. It is concluded that LMS interoperability in the Semantic Web is feasible.
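
The ontology-as-search-tool idea above can be sketched with plain triples. This is a toy illustration: the course and topic names are invented, and a real deployment would use an RDF store queried with SPARQL rather than hand-rolled matching.

```python
# Minimal illustration (not Moodle's actual schema): course metadata as
# RDF-style (subject, predicate, object) triples, with a one-step match of
# a user's query topic against the ontology links.

triples = {
    ("course:algo101", "ont:coversTopic", "topic:sorting"),
    ("course:algo101", "ont:coversTopic", "topic:graphs"),
    ("course:web201",  "ont:coversTopic", "topic:semantic_web"),
    ("topic:semantic_web", "ont:broader", "topic:web"),  # ontology hierarchy
}

def courses_covering(topic: str) -> set:
    """Return courses directly linked to a topic (SPARQL-like triple match)."""
    return {s for (s, p, o) in triples
            if p == "ont:coversTopic" and o == topic}

print(courses_covering("topic:sorting"))  # → {'course:algo101'}
```

The `ont:broader` triple hints at why an ontology beats plain keyword search: a fuller matcher could follow hierarchy links so a query for "topic:web" also surfaces courses on its narrower topics.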

  17. Periodic subsystem density-functional theory

    Genova, Alessandro; Pavanello, Michele; Ceresoli, Davide

    2014-01-01

    By partitioning the electron density into subsystem contributions, the Frozen Density Embedding (FDE) formulation of subsystem Density Functional Theory (DFT) has recently emerged as a powerful tool for reducing the computational scaling of Kohn–Sham DFT. To date, however, FDE has been employed to molecular systems only. Periodic systems, such as metals, semiconductors, and other crystalline solids have been outside the applicability of FDE, mostly because of the lack of a periodic FDE implementation. To fill this gap, in this work we aim at extending FDE to treat subsystems of molecular and periodic character. This goal is achieved by a dual approach. On one side, the development of a theoretical framework for periodic subsystem DFT. On the other, the realization of the method into a parallel computer code. We find that periodic FDE is capable of reproducing total electron densities and (to a lesser extent) also interaction energies of molecular systems weakly interacting with metallic surfaces. In the pilot calculations considered, we find that FDE fails in those cases where there is appreciable density overlap between the subsystems. Conversely, we find FDE to be in semiquantitative agreement with Kohn–Sham DFT when the inter-subsystem density overlap is low. We also conclude that to make FDE a suitable method for describing molecular adsorption at surfaces, kinetic energy density functionals that go beyond the GGA level must be employed.

  18. Periodic subsystem density-functional theory

    Genova, Alessandro; Ceresoli, Davide; Pavanello, Michele

    2014-11-01

    By partitioning the electron density into subsystem contributions, the Frozen Density Embedding (FDE) formulation of subsystem Density Functional Theory (DFT) has recently emerged as a powerful tool for reducing the computational scaling of Kohn-Sham DFT. To date, however, FDE has been employed to molecular systems only. Periodic systems, such as metals, semiconductors, and other crystalline solids have been outside the applicability of FDE, mostly because of the lack of a periodic FDE implementation. To fill this gap, in this work we aim at extending FDE to treat subsystems of molecular and periodic character. This goal is achieved by a dual approach. On one side, the development of a theoretical framework for periodic subsystem DFT. On the other, the realization of the method into a parallel computer code. We find that periodic FDE is capable of reproducing total electron densities and (to a lesser extent) also interaction energies of molecular systems weakly interacting with metallic surfaces. In the pilot calculations considered, we find that FDE fails in those cases where there is appreciable density overlap between the subsystems. Conversely, we find FDE to be in semiquantitative agreement with Kohn-Sham DFT when the inter-subsystem density overlap is low. We also conclude that to make FDE a suitable method for describing molecular adsorption at surfaces, kinetic energy density functionals that go beyond the GGA level must be employed.
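
The density partition at the heart of FDE, and the nonadditive kinetic-energy term whose GGA-level approximation the conclusion above identifies as the weak point, can be written in standard subsystem-DFT notation (generic textbook relations, not this paper's specific implementation):

```latex
% Total density as a sum of subsystem densities
\rho_{\mathrm{tot}}(\mathbf{r}) = \sum_{I} \rho_{I}(\mathbf{r})

% Nonadditive kinetic energy, which must be evaluated with an approximate
% explicit density functional T_s[\rho]
T_{s}^{\mathrm{nad}}[\{\rho_{I}\}]
  = T_{s}[\rho_{\mathrm{tot}}] - \sum_{I} T_{s}[\rho_{I}]
```

Because the nonadditive term is computed from the densities rather than from orbitals, its error grows where subsystem densities overlap appreciably, which is consistent with the failure cases reported in the abstract.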

  19. Middleware Interoperability for Robotics: A ROS-YARP Framework

    Plinio Moreno

    2016-10-01

    Full Text Available Middlewares are fundamental tools for progress in research and applications in robotics. They enable the integration of multiple heterogeneous sensing and actuation devices, as well as providing general purpose modules for key robotics functions (kinematics, navigation, planning). However, no existing middleware yet provides a complete set of functionalities for all robotics applications, and many robots may need to rely on more than one framework. This paper focuses on the interoperability between two of the most prevalent middlewares in robotics: YARP and ROS. Interoperability between middlewares should ideally allow users to execute existing software without the necessity of: (i) changing the existing code, and (ii) writing hand-coded "bridges" for each use-case. We propose a framework enabling the communication between existing YARP modules and ROS nodes for robotics applications in an automated way. Our approach generates the "bridging gap" code from a configuration file, connecting YARP ports and ROS topics through code-generated YARP Bottles. The configuration file must describe: (i) the sender entities, (ii) the way to group and convert the information read from the sender, (iii) the structure of the output message and (iv) the receiving entity. Our choice of many inputs to one output is the most common use-case in robotics applications, where examples include filtering, decision making and visualization. We support YARP/ROS and ROS/YARP sender/receiver configurations, which are demonstrated in a humanoid-on-wheels robot that uses YARP for upper body motor control and visual perception, and ROS for mobile base control and navigation algorithms.

  20. Regulatory Barriers Blocking Standardization of Interoperability

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-01-01

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and Continua Health Alliance, are striving for this purpose. However, factors like the medical device regulation, health policy, and market reality have placed non-technical barriers over the ad...

  1. UGV Control Interoperability Profile (IOP), Version 0

    2011-12-21

    a tracked vehicle to climb stairs, traverse ditches/ruts, etc. The operator should be able to control the position of the flippers via the OCU and... UGV Control Interoperability Profile (IOP) Version 0, Robotic Systems Joint Project Office (RS JPO), SFAE-GCS-UGV MS 266, 6501 East 11 Mile Road

  2. Future Interoperability of Camp Protection Systems (FICAPS)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project has been established as a Project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, by a proper implementation in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between Camp Protection Systems and their equipment from different nations involved in multinational missions. These guidelines shall allow for: • Real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom), • Quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available, • Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in; automatic adjustment of the HMI, Human Machine Interface) without costly and time-consuming validation and test on system level (validation and test can be done on equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German demonstration system, based on existing national assets, were realized. Demonstrations showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  3. Interoperability in the e-Government Context

    2012-01-01

    e-government systems focus primarily on these technical challenges [UNDP 2007a, p. 10; CS Transform 2009, p. 3]. More recently...Thailand’s government hits its own wall. Responding agencies and non-governmental groups are unable to share information vital to the rescue effort...“Interoperability and Open Standards for e-Governance.” egov (Sep. 1, 2007): 17–19. [Secretary General, United Nations 2010] Secretary General, United

  4. The evaluation subsystem of RODOS

    Niculae, C.; Treitz, M.; Geldermann, J.

    2003-01-01

    Full text: The evaluation subsystem (ESY) of RODOS aims to rank countermeasure strategies according to their potential benefit and preference weights provided by the decision makers (DMS). In the previous version of the ESY, the structure of the decision problem (attributes, strategies, etc.) had to be largely defined by the early modules in the RODOS chain (ASY-CSY-ESY). For this reason, the ESY runs would be initiated with a list of strategies, a comprehensive attribute tree and a consequence table giving the impacts for each attribute under each strategy. The first sub-module of the ESY allows the user to select the attributes to be analyzed and then filters out the remaining attributes. For instance, the CSY module LCMT passes over 100 attributes to the ESY, from which one would expect the analyst/DMS to select maybe 10 to 15 for the evaluation. This sub-module also adds a sub-tree of subjective attributes (qualitative information) to the attribute tree provided by the CSY and allows the user to select which of these should be passed forward for further analysis. In addition, data from the economic and health modules (e.g. costs, health effects, etc.) can be grafted on as a sub-tree. The second sub-module performs the ranking of the alternative strategies and outputs a short list of best strategies. The last component of the ESY contains an explanation facility that uses a fine set of rules to reason about the ranking of the strategies. Due to the complexity of nuclear emergency management and the wide range of DMS and stakeholders involved in the decision process, it is difficult to predetermine the range of strategies they will consider. The current strategies or groups of strategies included in the system are only driven by radiological factors. Research in the field of multicriteria decision aid has shown that value-focused approaches could result in new sets of alternatives, new criteria to be considered or different decision tree structures.

  5. Environmental Models as a Service: Enabling Interoperability ...

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
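
The verb-noun dispatch described above can be sketched in miniature. This is an illustrative toy, not any specific environmental-modeling API; the resource names and fields are invented.

```python
# Toy RESTful dispatcher: the HTTP method (verb) and resource URL (noun)
# together select the action, mirroring the verb-noun pattern above.

model_runs = {}  # in-memory store of model-run resources

def handle(verb: str, noun: str, body=None):
    """Dispatch a (verb, noun) request and return (status_code, payload)."""
    if verb == "POST" and noun == "/runs":
        run_id = f"run-{len(model_runs) + 1}"
        model_runs[run_id] = {"inputs": body, "status": "queued"}
        return 201, run_id                      # 201 Created
    if verb == "GET" and noun.startswith("/runs/"):
        run_id = noun.rsplit("/", 1)[-1]
        if run_id in model_runs:
            return 200, model_runs[run_id]      # 200 OK
        return 404, None                        # 404 Not Found
    return 405, None                            # 405 Method Not Allowed

status, rid = handle("POST", "/runs", {"basin": "upper-clark"})
status2, resource = handle("GET", f"/runs/{rid}")
print(status, rid, status2, resource["status"])  # → 201 run-1 200 queued
```

A production service would sit behind a web framework and self-document its routes, but the essential contract, standard verbs acting on URL-addressed resources, is exactly what makes such a model service platform-agnostic.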

  6. Embedded Thermal Control for Spacecraft Subsystems Miniaturization

    Didion, Jeffrey R.

    2014-01-01

    Optimization of spacecraft size, weight and power (SWaP) resources is an explicit technical priority at Goddard Space Flight Center. Embedded Thermal Control Subsystems are a promising technology with many cross-cutting NASA, DoD and commercial applications: 1.) CubeSat/SmallSat spacecraft architecture, 2.) high performance computing, 3.) On-board spacecraft electronics, 4.) Power electronics and RF arrays. The Embedded Thermal Control Subsystem technology development efforts focus on component, board and enclosure level devices that will ultimately include intelligent capabilities. The presentation will discuss electric, capillary and hybrid based hardware research and development efforts at Goddard Space Flight Center. The Embedded Thermal Control Subsystem development program consists of interrelated sub-initiatives, e.g., chip component level thermal control devices, self-sensing thermal management, advanced manufactured structures. This presentation includes technical status and progress on each of these investigations. Future sub-initiatives, technical milestones and program goals will be presented.

  7. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    Basso, T.; DeBlasio, R.

    2010-04-01

    The IEEE American National Standards project P2030TM addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  8. Timing subsystem development: Network synchronization experiments

    Backe, K. R.

    1983-01-01

    This paper describes a program in which several experimental timing subsystem prototypes were designed, fabricated, and field tested using a small network of troposcatter and microwave digital communication links. This equipment was responsible for modem/radio interfacing, time interval measurement, clock adjustment and distribution, synchronization technique, and node to node information exchange. Presented are discussions of the design approach, measurement plan, and performance assessment methods. Recommendations are made based on the findings of the test program and an evaluation of the design of both the hardware and software elements of the timing subsystem prototypes.

  9. Primary electric propulsion thrust subsystem definition

    Masek, T. D.; Ward, J. W.; Kami, S.

    1975-01-01

    A review is presented of the current status of primary propulsion thrust subsystem (TSS) performance, packaging considerations, and certain operational characteristics. Thrust subsystem related work from recent studies by the Jet Propulsion Laboratory (JPL), Rockwell and Boeing is discussed. Existing performance for 30-cm thrusters, power processors and TSS is presented along with projections for future improvements. Results of analyses to determine (1) magnetic field distributions resulting from an array of thrusters, (2) thruster emitted particle flux distributions from an array of thrusters, and (3) TSS element failure rates are described to indicate the availability of analytical tools for evaluation of TSS designs.

  10. Lasing without inversion due to cooling subsystem

    Shakhmuratov, R.N.

    1997-01-01

    A new possibility of inversionless lasing is discussed. We have considered the resonant interaction of a two-level system (TLS) with photons and the adiabatic interaction with an ensemble of Bose particles. It is found that a TLS with equally populated energy levels amplifies coherent light at the Stokes-shifted frequency. This becomes possible as photon emission is accompanied by excitation of the Bose particles. The energy flow from the TLS to the photon subsystem is realized because the Bose subsystem is at finite temperature and plays the role of a cooler. The advantage of this new lasing principle is discussed. It is shown that the lasing conditions differ strongly from conventional ones.

  11. Telemedicine system interoperability architecture: concept description and architecture overview.

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  12. The role of architecture and ontology for interoperability.

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even to personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Hereby, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication protocol to an architecture-centric approach mastering ontology coordination challenges.

  13. PACS/information systems interoperability using Enterprise Communication Framework.

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  14. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    2012-07-01

    interoperability, although they are supported by some interoperability attributes. For example, stair climbing » Stair climbing is not something that... IOPs need to specify » However, the mobility & actuation related interoperable messages can be used to provide stair climbing » Also... interoperability can enable management of different poses or modes, one of which may be stair climbing

  15. Position paper: cognitive radio networking for multiple sensor network interoperability in mines

    Kagize, BM

    2008-01-01

    Full Text Available These commercially available networks are purported to be self-organizing and self-correcting, though the software behind these networks is proprietary, with the caveat of interoperability difficulties with other networks [5]. There is a non-proprietary and open...: Research challenges,” - Ad Hoc Networks, 2006 – Elsevier [4] V. Mhatre, C. Rosenberg, “Homogeneous vs heterogeneous clustered sensor networks: a comparative study,” - Communications, 2004 IEEE International Conference on, 2004 - ieeexplore.ieee.org [5...

  16. Clinical data integration model. Core interoperability ontology for research using primary care data.

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural / terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified and, thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a

  17. Experimental evaluation of the IP multimedia subsystem

    Oredope, A.; Liotta, A.; Yang, K.; Tyrode-Goilo, D.H.; Magedanz, T.; Mauro Madeira, E.R.M.; Dini, P.

    2005-01-01

    The IP Multimedia Subsystem (IMS) is the latest framework for a seamless convergence of the ordinary Internet with mobile cellular systems. As such it has the backing of all major companies since it aims to offer a unified solution to integrated mobile services, including mechanisms for security,

  18. MITS Feed and Withdrawal Subsystem: operating procedures

    Brown, W.S.

    1980-01-01

    This procedure details the steps involved in establishing closed loop flows, providing UF6 vapor to the FEED header of the Sampling Subsystem and returning it through the PRODUCT and TAILS headers via the F and W recycle valves. It is essentially a Startup Procedure.

  19. Union Listing via OCLC's Serials Control Subsystem.

    O'Malley, Terrence J.

    1984-01-01

    Describes library use of the Conversion of Serials Project's (CONSER) online national machine-readable database for serials to create online union lists of serials via OCLC's Serials Control Subsystem. Problems in the selection of appropriate, accurate, and authenticated records and prospects for the future are discussed. Twenty sources and sample records…

  20. Accelerated life testing of spacecraft subsystems

    Wiksten, D.; Swanson, J.

    1972-01-01

    The rationale and requirements for conducting accelerated life tests on electronic subsystems of spacecraft are presented. A method for applying data on the reliability and temperature sensitivity of the parts contained in a subsystem to the selection of accelerated life test parameters is described. Additional considerations affecting the formulation of test requirements are identified, and practical limitations of accelerated aging are described.

  1. Integrating the autonomous subsystems management process

    Ashworth, Barry R.

    1992-01-01

    Ways in which the ranking of the Space Station Module Power Management and Distribution testbed may be achieved and an individual subsystem's internal priorities may be managed within the complete system are examined. The application of these results in the integration and performance leveling of the autonomously managed system is discussed.

  2. MITS Feed and Withdrawal Subsystem: operating procedures

    Brown, W.S.

    1980-01-01

    This document details procedures for the operation of the MITS (Machine Interface Test System) Feed and Withdrawal Subsystem (F and W). Included are fill with UF6, establishment of recycle and thruput flows, shutdown, UF6 makeup, dump to supply container, Cascade dump to F and W, and light-gas cold trap dump, all normal procedures, plus an alternate procedure for trapping light gases.

  3. Analog subsystem for the plutonium protection system

    Arlowe, H.D.

    1978-12-01

    An analog subsystem is described which monitors certain functions in the Plutonium Protection System. Rotary and linear potentiometer output signals are digitized, as are the outputs from thermistors and container "bulge" sensors. This work was sponsored by the Department of Energy/Office of Safeguards and Security (DOE/OSS) as part of the overall Sandia Fixed Facility Physical Protection Program.

  4. MITS Feed and Withdrawal Subsystem: operating procedures

    Brown, W.S.

    1980-01-01

    This procedure details the steps involved in filling two of the four MITS (Machine Interface Test System) Feed and Withdrawal subsystem main traps and the Sample/Inventory Make-up Pipette with uranium hexafluoride from the "AS RECEIVED" UF6 supply.

  5. Presence in the IP multimedia subsystem

    Lin, L.; Liotta, A.

    2007-01-01

    With an ever increasing penetration of Internet Protocol (IP) technologies, the wireless industry is evolving the mobile core network towards an all-IP network. The IP Multimedia Subsystem (IMS) is a standardised Next Generation Network (NGN) architectural framework defined by the 3rd Generation

  6. Electronic Subsystems For Laser Communication System

    Long, Catherine; Maruschak, John; Patschke, Robert; Powers, Michael

    1992-01-01

    Electronic subsystems of a free-space laser communication system carry digital signals at 650 Mb/s over long distances. Applicable to general optical communications involving the transfer of large quantities of data and the transmission and reception of high-definition video images.

  7. Interlibrary Loan Communications Subsystem: Users Manual.

    OCLC Online Computer Library Center, Inc., Dublin, OH.

    The OCLC Interlibrary Loan (ILL) Communications Subsystem provides participating libraries with on-line control of ILL transactions. This user manual includes a glossary of terms related to the procedures in using the system. Sections describe computer entry, searching, loan request form, loan response form, ILL procedures, the special message…

  18. National Ignition Facility subsystem design requirements optics subsystems SSDR 1.6

    English, R.E.

    1996-01-01

    This Subsystems Design Requirement (SSDR) document specifies the functions to be performed and the subsystems design requirements for the major optical components. These optical components comprise those custom designed and fabricated for amplification and transport of the full aperture NIF beam and do not include those off-the-shelf components that may be part of other optical sub-systems (i.e. alignment or diagnostic systems). This document also describes the optical component processing requirements and the QA/damage testing necessary to ensure that the optical components meet or exceed the requirements.

  9. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  10. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    2012-04-02

    ... efforts and/or through modifications to the Commission's technical rules or other regulatory measures. The... regulatory measures. The Commission has a longstanding interest in promoting the interoperability of... standards for Long-Term Evolution (LTE) wireless broadband technology are developed by the 3rd Generation...

  11. Open Source Interoperability: It's More than Technology

    Dominic Sartorio

    2008-01-01

    The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  12. RFID in libraries a step toward interoperability

    Ayre, Lori Bowen

    2012-01-01

    The approval by The National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard.

  13. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals broadly vary there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach, which is able to generate delineation of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data inherits a high degree of uncertainty, which is discussed in this study.
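The record above mentions kernel-based spatial reclassification as the generalisation step. A generic sketch of that idea, using a 3x3 majority kernel over a classified grid, is shown below. This is an illustrative stand-in under assumed class labels ("heath", "grass"), not the authors' actual implementation.

```python
# Illustrative kernel-based reclassification: each cell takes the majority
# class in its 3x3 neighbourhood, smoothing a per-pixel classification into
# more generalised patches (edges handled by clipping the window).
from collections import Counter

def majority_filter(grid):
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            neigh = [grid[rr][cc]
                     for rr in range(max(0, r - 1), min(rows, r + 2))
                     for cc in range(max(0, c - 1), min(cols, c + 2))]
            # most_common breaks ties by first-encountered order
            out[r][c] = Counter(neigh).most_common(1)[0][0]
    return out

grid = [
    ["heath", "heath", "grass"],
    ["heath", "grass", "grass"],
    ["heath", "heath", "grass"],
]
print(majority_filter(grid))
```

Running the filter removes the isolated "grass" pixel in the centre, which is the kind of generalisation effect the study relies on when deriving comparable delineations from per-pixel classifications.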

  14. Socially Aware Heterogeneous Wireless Networks.

    Kosmides, Pavlos; Adamopoulou, Evgenia; Demestichas, Konstantinos; Theologou, Michael; Anagnostou, Miltiades; Rouskas, Angelos

    2015-06-11

    The development of smart cities has been the epicentre of many researchers' efforts during the past decade. One of the key requirements for smart city networks is mobility, and this is the reason stable, reliable and high-quality wireless communications are needed in order to connect people and devices. Most research efforts so far have used different kinds of wireless and sensor networks, making interoperability rather difficult to accomplish in smart cities. One common solution proposed in the recent literature is the use of software defined networks (SDNs), in order to enhance interoperability among the various heterogeneous wireless networks. In addition, SDNs can take advantage of the data retrieved from available sensors and use them as part of the intelligent decision making process conducted during the resource allocation procedure. In this paper, we propose an architecture combining heterogeneous wireless networks with social networks using SDNs. Specifically, we exploit the information retrieved from location based social networks regarding users' locations and we attempt to predict areas that will be crowded by using specially-designed machine learning techniques. By recognizing possible crowded areas, we can provide mobile operators with recommendations about areas requiring datacell activation or deactivation.

  15. Food product tracing technology capabilities and interoperability.

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

    Despite the best efforts of food safety and food defense professionals, contaminated food continues to enter the food supply. It is imperative that contaminated food be removed from the supply chain as quickly as possible to protect public health and stabilize markets. To solve this problem, scores of technology companies purport to have the most effective, economical product tracing system. This study sought to compare and contrast the effectiveness of these systems at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. It also determined if these systems can work together to better secure the food supply (their interoperability). Institute of Food Technologists (IFT) hypothesized that when technology providers are given a full set of supply-chain data, even for a multi-ingredient product, their systems will generally be able to trace a contaminated product forward and backward through the supply chain. However, when provided with only a portion of supply-chain data, even for a product with a straightforward supply chain, it was expected that interoperability of the systems will be lacking and that there will be difficulty collaborating to identify sources and/or recipients of potentially contaminated product. IFT provided supply-chain data for one complex product to 9 product tracing technology providers, and then compared and contrasted their effectiveness at analyzing product tracing information to identify the contaminated ingredient and likely source, as well as distribution of the product. A vertically integrated foodservice restaurant agreed to work with IFT to secure data from its supply chain for both a multi-ingredient and a simpler product. Potential multi-ingredient products considered included canned tuna, supreme pizza, and beef tacos. IFT ensured that all supply-chain data collected did not include any proprietary information or information that would otherwise

  16. Towards E-Society Policy Interoperability

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  17. Open Health Tools: Tooling for Interoperable Healthcare

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  18. Reference architecture for interoperability testing of Electric Vehicle charging

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  19. Interoperability of Demand Response Resources Demonstration in NY

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  20. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    2014-12-01

    multijurisdictional, interoperability is a key factor for success. Responses to 9/11, the Oso mudslides in Washington, the Boston Marathon bombing...Continuum 2. Functional Interoperability As demonstrated by the 9/11 attacks, the Oso mudslide in Washington, the Boston Marathon bombing, and other large

  1. On the applicability of schema integration techniques to database interoperation

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  2. Interactive test tool for interoperable C-ITS development

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  3. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    First in a series of studies that focus on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  4. Establishing Interoperability of a Blog Archive through Linked Open Data

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data.

  5. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Florian CIOCAN

    2011-01-01

    Interoperability is not a new area of effort at NATO level. In fact, interoperability and more specifically standardization has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in a continuous change, the need to face the new threats and the current involvement in challenging operations in Afghanistan and elsewhere, alongside the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance’s full spectrum of missions.

  6. Maintenance and operations cost model for DSN subsystems

    Burt, R. W.; Lesh, J. R.

    1977-01-01

    A procedure is described which partitions the recurring costs of the Deep Space Network (DSN) over the individual DSN subsystems. The procedure results in a table showing the maintenance, operations, sustaining engineering and supportive costs for each subsystem.

  7. Partitioning a macroscopic system into independent subsystems

    Delle Site, Luigi; Ciccotti, Giovanni; Hartmann, Carsten

    2017-08-01

    We discuss the problem of partitioning a macroscopic system into a collection of independent subsystems. The partitioning of a system into replica-like subsystems is nowadays a subject of major interest in several fields of theoretical and applied physics. The thermodynamic approach currently favoured by practitioners is based on a phenomenological definition of an interface energy associated with the partition, due to a lack of easily computable expressions for a microscopic (i.e. particle-based) interface energy. In this article, we outline a general approach to derive sharp and computable bounds for the interface free energy in terms of microscopic statistical quantities. We discuss potential applications in nanothermodynamics and outline possible future directions.

  8. RF subsystem design for microwave communication receivers

    Bickford, W. J.; Brodsky, W. G.

    A system review of the RF subsystems of (IFF) transponders, troposcatter receivers and SATCOM receivers is presented. The quantity potential for S-band and X-band IFF transponders establishes a baseline requirement. From this, the feasibility of a common design for these and other receivers is evaluated. Goals are established for a GaAs MMIC (monolithic microwave integrated circuit) device and related local oscillator, preselector and self-test components.

  9. Optical Subsystems for Next Generation Access Networks

    Lazaro, J.A; Polo, V.; Schrenk, B.

    2011-01-01

    Recent optical technologies are providing higher flexibility to next generation access networks: on the one hand, enabling progressive FTTx and specifically FTTH deployment, progressively shortening the copper access network; on the other hand, also opening fixed-mobile convergence solutions ... in next generation PON architectures. An overview is provided of the optical subsystems developed for the implementation of the proposed NG-Access Networks.

  10. T Plant removal of PWR Chiller Subsystem

    Dana, C.M.

    1994-01-01

    The PWR Pool Chiller System is no longer required for support of the Shippingport Blanket Fuel Assemblies Storage. The Engineering Work Plan will provide the overall coordination of the documentation and physical changes to deactivate the unneeded subsystem. The physical removal of all energy sources for the Chiller equipment will be covered under a one-time work plan. The documentation changes will be covered using approved Engineering Change Notices and Procedure Change Authorizations as needed.

  11. The charged particle accelerators subsystems modeling

    Averyanov, G P; Kobylyatskiy, A V

    2017-01-01

    A web-based resource is presented for information support of engineering, science and education in electrophysics, containing web-based tools for simulating the subsystems of charged particle accelerators. The motivation for developing a web environment for virtual electrophysical laboratories is formulated. Trends in the design of dynamic web environments to support scientific research and e-learning are analysed within the framework of the Open Education concept. (paper)

  12. Stepping-Motion Motor-Control Subsystem For Testing Bearings

    Powers, Charles E.

    1992-01-01

    Control subsystem is a closed-loop angular-position-control system causing the motor and bearing under test to undergo any of a variety of continuous or stepping motions. Also used to test bearing-and-motor assemblies, motors, angular-position sensors including rotating shafts, and the like. Monitoring subsystem gathers data used to evaluate performance of bearing or other article under test. Monitoring subsystem described in article, "Monitoring Subsystem For Testing Bearings" (GSC-13432).

  13. Automated searching for quantum subsystem codes

    Crosswhite, Gregory M.; Bacon, Dave

    2011-01-01

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.
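The search described above operates on measurement operators that are tensor products of Pauli operators. A standard primitive behind such computational searches is the binary symplectic representation of Pauli strings, under which commutation reduces to an inner product mod 2. The sketch below shows only this primitive, not the authors' full code-search algorithm; the example operators are illustrative.

```python
# Represent n-qubit Pauli strings as binary symplectic vectors (x|z) and test
# commutation: two Paulis commute iff their symplectic inner product is 0 mod 2.

def pauli_to_symplectic(p):
    """Map a Pauli string like 'XZIY' to (x, z) bit lists."""
    x = [1 if c in "XY" else 0 for c in p]  # X-part (X and Y act with X)
    z = [1 if c in "ZY" else 0 for c in p]  # Z-part (Z and Y act with Z)
    return x, z

def commute(p, q):
    (x1, z1), (x2, z2) = pauli_to_symplectic(p), pauli_to_symplectic(q)
    s = sum(a * b for a, b in zip(x1, z2)) + sum(a * b for a, b in zip(z1, x2))
    return s % 2 == 0

print(commute("XI", "ZI"))  # X and Z on the same qubit anticommute → False
print(commute("XX", "ZZ"))  # the two anticommutations cancel pairwise → True
```

Enumerating commutation relations over candidate generator sets in this representation is what makes an automated, exhaustive search over two-body operators on lattice tilings computationally tractable.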

  14. Secure Interoperable Open Smart Grid Demonstration Project

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  15. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Tomas, Robert; Lutz, Michael

    2013-04-01

    The well-known heterogeneity and fragmentation of data models, formats and controlled vocabularies of environmental data limit potential data users from utilising the wealth of environmental information available today across Europe. The main aim of INSPIRE is to improve this situation and give users the possibility to access, use and correctly interpret environmental data. Over the past years a number of INSPIRE technical guidelines (TG) and implementing rules (IR) for interoperability have been developed, involving hundreds of domain experts from across Europe. The data interoperability specifications, which have been developed for all 34 INSPIRE spatial data themes, are the central component of the TG and IR. Several of these themes are related to the earth sciences, e.g. geology (including hydrogeology, geophysics and geomorphology), mineral and energy resources, soil science, natural hazards, meteorology, oceanography, hydrology and land cover. The following main pillars for data interoperability and harmonisation have been identified during the development of the specifications: Conceptual data models describe the spatial objects and their properties and relationships for the different spatial data themes. To achieve cross-domain harmonization, the data models for all themes are based on a common modelling framework (the INSPIRE Generic Conceptual Model) and managed in a common UML repository. Harmonised vocabularies (or code lists) are to be used in data exchange in order to overcome interoperability issues caused by heterogeneous free-text and/or multi-lingual content. Since a mapping to a harmonized vocabulary could be difficult, the INSPIRE data models typically allow the provision of more specific terms from local vocabularies in addition to the harmonized terms, utilizing either the extensibility options or additional terminological attributes. Encoding.
    Currently, specific XML profiles of the Geography Markup Language (GML) are promoted as the standard
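The harmonised-vocabulary pillar described above can be sketched as a simple mapping step: local terms are resolved to a harmonised code-list value where a mapping exists, while the original local term is retained alongside. The code-list URIs and terms below are invented for illustration and are not taken from the actual INSPIRE registry.

```python
# Illustrative vocabulary-harmonisation pattern: map local terms to a
# harmonised code list, keeping the local term for traceability.
# The mapping table and URIs are hypothetical examples.

HARMONISED = {
    "sandstone": "http://example.org/codelist/LithologyValue/sandstone",
    "gres": "http://example.org/codelist/LithologyValue/sandstone",  # local French term
}

def harmonise(local_term):
    uri = HARMONISED.get(local_term.lower())  # None when no mapping exists
    # The local term is always retained, even when a harmonised value is found.
    return {"localTerm": local_term, "harmonised": uri}

print(harmonise("Gres"))
print(harmonise("mudstone-ish"))  # unmapped: harmonised stays None
```

This mirrors the extensibility option mentioned in the abstract: data providers can always ship their specific local term, and consumers fall back to it when no harmonised value is available.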

  16. DIMP: an interoperable solution for software integration and product data exchange

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business that has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement in the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integrating solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  17. XML and Graphs for Modeling, Integration and Interoperability:a CMS Perspective

    van Lingen, Frank

    2004-01-01

    This thesis reports on a designer's Ph.D. project called “XML and Graphs for Modeling, Integration and Interoperability: a CMS perspective”. The project has been performed at CERN, the European laboratory for particle physics, in collaboration with the Eindhoven University of Technology and the University of the West of England in Bristol. CMS (Compact Muon Solenoid) is a next-generation high energy physics experiment at CERN, which will start running in 2007. The complexity of the detector used in the experiment and the autonomous groups that are part of the CMS experiment result in disparate data sources (different in format, type and structure). Users need to access and exchange data located in multiple heterogeneous sources in a domain-specific manner and may want to access a simple unit of information without having to understand details of the underlying schema. Users want to access the same information from several different heterogeneous sources. It is neither desirable nor fea...

  18. Enabling interoperability in Geoscience with GI-suite

    Boldrini, Enrico; Papeschi, Fabrizio; Santoro, Mattia; Nativi, Stefano

    2015-04-01

    GI-suite is a brokering framework targeting interoperability of heterogeneous systems in the Geoscience domain. The framework is composed of different brokers, each focusing on a specific functionality: discovery, access and semantics (i.e. GI-cat, GI-axe, GI-sem). The brokering takes place between a set of heterogeneous publishing services and a set of heterogeneous consumer applications: the brokering target is represented by resources (e.g. coverages, features, or metadata information) that must flow seamlessly from the providers to the consumers. Different international and community standards are now supported by GI-suite, enabling its successful deployment in many international projects and initiatives (such as GEOSS, NSF BCube and several EU funded projects). On the publisher side, more than 40 standards and implementations are supported (e.g. Dublin Core, OAI-PMH, OGC W*S, Geonetwork, THREDDS Data Server, Hyrax Server, etc.); support for each individual standard is provided by specific GI-suite components called accessors. On the consumer side, more than 15 standards and implementations are supported (e.g. ESRI ArcGIS, Openlayers, OGC W*S, OAI-PMH clients, etc.); support for each individual standard is provided by specific profiler components. The GI-suite can be used in different scenarios by different actors:
    - A data provider with a pre-existing data repository can deploy and configure GI-suite to broker it, thus making its data resources available through different protocols to many different users (e.g. for data discovery and/or data access).
    - A data consumer can use GI-suite to discover and/or access resources from a variety of publishing services that already publish data according to well-known standards.
    - A community can deploy and configure GI-suite to build a community (or project-specific) broker: GI-suite can broker a set of community related repositories and
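    The accessor/profiler brokering pattern described above can be sketched in a few lines. This is an illustrative shape only; the class names, record fields and output format are assumptions, not GI-suite APIs.

```python
# Accessors normalise heterogeneous publisher records into one internal
# resource model; profilers re-expose that model to consumer protocols.

class DublinCoreAccessor:
    def harvest(self, record):            # record: dict in DC-like terms
        return {"title": record["dc:title"], "id": record["dc:identifier"]}

class ThreddsAccessor:
    def harvest(self, record):            # record: dict in THREDDS-like terms
        return {"title": record["name"], "id": record["urlPath"]}

class OpenSearchProfiler:
    def render(self, resource):           # one consumer-facing view
        return f"<entry><title>{resource['title']}</title></entry>"

class Broker:
    def __init__(self, accessors, profiler):
        self.accessors, self.profiler = accessors, profiler

    def discover(self, sources):
        """Harvest each (kind, record) pair and render it for the consumer."""
        return [self.profiler.render(self.accessors[kind].harvest(record))
                for kind, record in sources]

broker = Broker({"dc": DublinCoreAccessor(), "thredds": ThreddsAccessor()},
                OpenSearchProfiler())
entries = broker.discover([
    ("dc", {"dc:title": "SST grid", "dc:identifier": "sst-1"}),
    ("thredds", {"name": "Wave model", "urlPath": "wave/run1"}),
])
```

    Adding support for a new publisher standard means adding one accessor; existing profilers and consumers are untouched.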

  19. Interoperability science cases with the CDPP tools

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, cross-compare observations and modelled data, and finally perform in-depth analysis. Over the years these protocols, including SAMP from IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  20. Interoperability of Standards for Robotics in CIME

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. The first main goal of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for the standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP. The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model...

  1. Interoperability between phenotype and anatomy ontologies.

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
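    The kind of formalisation this abstract argues for, making the semantics of a phenotypic trait explicit so it can interoperate with anatomy ontologies, is often expressed as an entity-quality decomposition. A minimal sketch follows; the identifiers below are hypothetical placeholders, not real ontology terms.

```python
from collections import namedtuple

# A phenotype made explicit as an anatomical entity plus a quality.
EQ = namedtuple("EQ", ["entity", "quality"])

phenotypes = {
    "pheno:enlarged-heart": EQ("anat:heart", "quality:increased-size"),
    "pheno:small-eye":      EQ("anat:eye",  "quality:decreased-size"),
}

def by_entity(entity):
    """Infer which phenotypes affect a given anatomical entity,
    a query the unconnected free-text representations cannot answer."""
    return sorted(p for p, eq in phenotypes.items() if eq.entity == entity)
```

    Because the entity slot refers into an anatomy ontology, queries and reasoning can cross the phenotype/anatomy boundary.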

  2. Flexible solution for interoperable cloud healthcare systems.

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
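    The configuration idea behind such a control, mapping local database tables and fields onto standard document locations, can be sketched as a simple lookup table. The table and field names below are an invented hospital schema, and the element paths only follow the general shape of an HL7 CDA document; both are illustrative.

```python
# Hypothetical mapping from (table, field) in a local database to the
# CDA element path that should carry the value during exchange.
cda_mapping = {
    ("patients", "family_name"): "recordTarget/patientRole/patient/name/family",
    ("patients", "birth_date"):  "recordTarget/patientRole/patient/birthTime",
    ("visits",   "diagnosis"):   "component/structuredBody/section/entry/observation",
}

def to_cda_paths(fields):
    """Translate configured (table, field) pairs into CDA element paths."""
    return [cda_mapping[k] for k in fields if k in cda_mapping]

paths = to_cda_paths([("patients", "family_name"), ("visits", "diagnosis")])
```

    Administrators then edit only the mapping table, not the exchange code, to prepare a local system for standardised communication.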

  3. Space reactor system and subsystem investigations: assessment of technology issues for the reactor and shield subsystem. SP-100 Program

    Atkins, D.F.; Lillie, A.F.

    1983-01-01

    As part of Rockwell's effort on the SP-100 Program, a preliminary assessment has been completed of current nuclear technology as it relates to candidate reactor/shield subsystems for the SP-100 Program. The scope of the assessment was confined to the nuclear package (the reactor and shield subsystems). The nine generic reactor subsystems presented in Rockwell's Subsystem Technology Assessment Report, ESG-DOE-13398, were addressed for the assessment

  4. Biodiversity information platforms: From standards to interoperability

    Walter Berendsohn

    2011-11-01

    One of the most serious bottlenecks in the scientific workflows of biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has had a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  5. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Kenley Russell

    2016-01-01

    Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or “open” BIM is to facilitate information exchange between different digital systems, models and tools. There has been effort towards data interoperability, with the development of open-source standards and object-oriented models such as industry foundation classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, and a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits utilisation of integrated BIM for horizontal infrastructure rail projects.

  6. Managing Interoperability for GEOSS - A Report from the SIF

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. 
In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  7. Space-reactor electric systems: subsystem technology assessment

    Anderson, R.V.; Bost, D.; Determan, W.R.

    1983-01-01

    This report documents the subsystem technology assessment. For the purpose of this report, five subsystems were defined for a space reactor electric system, and the report is organized around these subsystems: reactor; shielding; primary heat transport; power conversion and processing; and heat rejection. The purpose of the assessment was to determine the current technology status and the technology potentials for different types of the five subsystems. The cost and schedule needed to develop these potentials were estimated, and sets of development-compatible subsystems were identified

  8. IoT interoperability : a hub-based approach

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...

  9. Cloud portability and interoperability issues and current trends

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  10. A logical approach to semantic interoperability in healthcare.

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.
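    The harmonisation the LIM performs, binding a structural slot in the information model to a terminology code, can be illustrated with a small sketch. The field names only follow the general shape of an ISO 21090 coded datatype (CD) and are simplified; the SNOMED CT code and code-system OID are included for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CD:
    """Simplified ISO 21090-style coded datatype."""
    code: str
    code_system: str
    display_name: str

SNOMED_CT = "2.16.840.1.113883.6.96"  # SNOMED CT code-system OID

@dataclass
class Element:
    """A slot in the logical information model bound to a coded value."""
    name: str
    value: CD

diagnosis = Element("diagnosis",
                    CD("38341003", SNOMED_CT, "Hypertensive disorder"))
```

    Because structure and terminology are harmonised in one model, a receiving system can interpret both where the value sits and what it means.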

  11. National Ignition Facility subsystem design requirements target area auxiliary subsystem SSDR 1.8.6

    Reitz, T.

    1996-01-01

    This Subsystem Design Requirement (SSDR) establishes the performance, design, development, and test requirements for the Target Area Auxiliary Subsystems (WBS 1.8.6), which is part of the NIF Target Experimental System (WBS 1.8). This document responds directly to the requirements detailed in NIF Target Experimental System SDR 003 document. Key elements of the Target Area Auxiliary Subsystems include: WBS 1.8.6.1 Local Utility Services; WBS 1.8.6.2 Cable Trays; WBS 1.8.6.3 Personnel, Safety, and Occupational Access; WBS 1.8.6.4 Assembly, Installation, and Maintenance Equipment; WBS 1.8.6.4.1 Target Chamber Service System; WBS 1.8.6.4.2 Target Bay Service Systems

  12. The JPL telerobotic Manipulator Control and Mechanization (MCM) subsystem

    Hayati, Samad; Lee, Thomas S.; Tso, Kam; Backes, Paul; Kan, Edwin; Lloyd, J.

    1989-01-01

    The Manipulator Control and Mechanization (MCM) subsystem of the telerobot system provides the real-time control of the robot manipulators in autonomous and teleoperated modes, and real-time input/output for a variety of sensors and actuators. This subsystem includes substantial hardware and software and interfaces with the other subsystems in the telerobot system hierarchy. The other subsystems are: run time control, task planning and reasoning, sensing and perception, and operator control. The architecture of the MCM subsystem, its capabilities, and details of various hardware and software elements are described. Important improvements in the MCM subsystem over the first version are: dual arm coordinated trajectory generation and control, addition of integrated teleoperation, shared control capability, replacement of the ultimate controllers with motor controllers, and a substantial increase in real-time processing capability.

  13. Improving the interoperability of biomedical ontologies with compound alignments.

    Oliveira, Daniela; Pesquita, Catia

    2018-01-09

    Ontologies are commonly used to annotate and help process life sciences data. Although their original goal is to facilitate integration and interoperability among heterogeneous data sources, when these sources are annotated with distinct ontologies, bridging this gap can be challenging. In the last decade, ontology matching systems have been evolving and are now capable of producing high-quality mappings for life sciences ontologies, usually limited to the equivalence between two ontologies. However, life sciences research is becoming increasingly transdisciplinary and integrative, fostering the need to develop matching strategies that are able to handle multiple ontologies and more complex relations between their concepts. We have developed ontology matching algorithms that are able to find compound mappings between multiple biomedical ontologies, in the form of ternary mappings, finding for instance that "aortic valve stenosis" (HP:0001650) is equivalent to the intersection between "aortic valve" (FMA:7236) and "constricted" (PATO:0001847). The algorithms take advantage of search space filtering based on partial mappings between ontology pairs, to be able to handle the increased computational demands. The evaluation of the algorithms has shown that they are able to produce meaningful results, with precision in the range of 60-92% for new mappings. The algorithms were also applied to the potential extension of logical definitions of the OBO and the matching of several plant-related ontologies. This work is a first step towards finding more complex relations between multiple ontologies. The evaluation shows that the results produced are significant and that the algorithms could satisfy specific integration needs.
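    The ternary mapping from the abstract can be represented concretely. The ontology identifiers are taken from the abstract itself; the table structure and relation label are illustrative, not the authors' data model.

```python
# A compound (ternary) mapping: target term is equivalent to the
# intersection of two component terms from other ontologies.
compound_mappings = [
    # (target, (component_a, component_b), relation)
    ("HP:0001650", ("FMA:7236", "PATO:0001847"), "equivalent-to-intersection"),
]

def expand(term):
    """Rewrite a term into its component pair, if a compound mapping
    exists; otherwise return the term unchanged."""
    for target, parts, relation in compound_mappings:
        if target == term and relation == "equivalent-to-intersection":
            return set(parts)
    return {term}
```

    Such a table lets an integration system translate a phenotype annotation into anatomy-plus-quality terms that other resources already use.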

  14. Power Subsystem Approach for the Europa Mission

    Ulloa-Severino Antonio

    2017-01-01

    NASA is planning to launch a spacecraft on a mission to the Jovian moon Europa, in order to conduct a detailed reconnaissance and investigation of its habitability. The spacecraft would orbit Jupiter and perform a detailed science investigation of Europa, utilizing a number of science instruments including an ice-penetrating radar to determine the icy shell thickness and presence of subsurface oceans. The spacecraft would be exposed to harsh radiation and extreme temperature environments. To meet mission objectives, the spacecraft power subsystem is being architected and designed to operate efficiently, and with a high degree of reliability.

  15. Building the IOOS data management subsystem

    de La Beaujardière, J.; Mendelssohn, R.; Ortiz, C.; Signell, R.

    2010-01-01

    We discuss progress to date and plans for the Integrated Ocean Observing System (IOOS??) Data Management and Communications (DMAC) subsystem. We begin by presenting a conceptual architecture of IOOS DMAC. We describe work done as part of a 3-year pilot project known as the Data Integration Framework and the subsequent assessment of lessons learned. We present work that has been accomplished as part of the initial version of the IOOS Data Catalog. Finally, we discuss near-term plans for augmenting IOOS DMAC capabilities.

  16. Structure of the Galaxy and its subsystems

    Ruprecht, J.

    1979-01-01

    Current knowledge is summed up of the structure of our galaxy, consisting of more than 100 thousand million stars with an overall mass of 10^44 g, plus interstellar dust and gas. The galaxy comprises several subsystems, the oldest of which are spherical in shape while the younger ones are more-or-less oblate rotational ellipsoids. On the basis of visual and radio observations, the galaxy is considered to have a spiral structure with many arms, similar to other galaxies. The structure of the galaxy nucleus has not yet been fully explained. (Ha)

  17. Optical fiber telecommunications components and subsystems

    Kaminow, Ivan; Willner, Alan E

    2013-01-01

    Optical Fiber Telecommunications VI (A&B) is the sixth in a series that has chronicled the progress in the R&D of lightwave communications since the early 1970s. Written by active authorities from academia and industry, this edition brings a fresh look to many essential topics, including devices, subsystems, systems and networks. A central theme is the enabling of high-bandwidth communications in a cost-effective manner for the development of customer applications. These volumes are an ideal reference for R&D engineers and managers, optical systems implementers, university researchers and s

  18. CCSDS SM and C Mission Operations Interoperability Prototype

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among other space agencies. This particular prototype uses the German Space Agency (DLR) to test the ideas for interagency coordination.

  19. Interoperable Multimedia Annotation and Retrieval for the Tourism Sector

    Chatzitoulousis, Antonios; Efraimidis, Pavlos S.; Athanasiadis, I.N.

    2015-01-01

    The Atlas Metadata System (AMS) employs semantic web annotation techniques in order to create an interoperable information annotation and retrieval platform for the tourism sector. AMS adopts state-of-the-art metadata vocabularies, annotation techniques and semantic web technologies.

  20. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Vasile Irimia

    2012-06-01

    This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.
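    A common pattern across the standards surveyed above is translation through a shared model rather than pairwise converters between every format. A minimal sketch, assuming invented field names: the documents echo cXML- and UBL-style vocabularies only loosely, and the mappings are not drawn from either specification.

```python
# Each format is mapped to and from one canonical order model, so N
# formats need 2N mappings instead of N*(N-1) pairwise translators.

def from_cxml_like(doc):
    """Hypothetical cXML-style document -> canonical model."""
    return {"buyer": doc["From"], "item": doc["ItemID"], "quantity": doc["Qty"]}

def to_ubl_like(order):
    """Canonical model -> hypothetical UBL-style document."""
    return {"BuyerCustomerParty": order["buyer"],
            "LineItem": order["item"],
            "Quantity": order["quantity"]}

# cXML-like input -> canonical -> UBL-like output
order = to_ubl_like(from_cxml_like({"From": "ACME", "ItemID": "SKU-9", "Qty": 3}))
```

    The canonical model is the "basic model of interoperability" against which each concrete standard's mapping can be judged.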

  1. Radio Interoperability: There Is More to It Than Hardware

    Hutchins, Susan G; Timmons, Ronald P

    2007-01-01

    Radio Interoperability: The Problem
    * Superfluous radio transmissions contribute to auditory overload of first responders
      - Obscure development of an accurate operational picture for all involved
      - Radio spectrum is a limited commodity once...

  2. A Cultural Framework for the Interoperability of C2 Systems

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  3. ISAIA: Interoperable Systems for Archival Information Access

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more-focused communities within space science could use the same technologies and standards to provide more specialized services. 
A major challenge to searching
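    The data-location pattern ISAIA aimed at, fanning one query out to many archive resources and merging the responses into a single seamless result list, can be sketched as follows. The "archives" here are stand-in functions with invented catalog names, not real services.

```python
def archive_a(query):
    """Hypothetical archive A: returns matching dataset records."""
    return [{"archive": "A", "dataset": d} for d in ("cat1", "cat2") if query in d]

def archive_b(query):
    """Hypothetical archive B."""
    return [{"archive": "B", "dataset": d} for d in ("cat2", "img9") if query in d]

def federated_search(query, resources):
    """Collect per-archive responses and integrate them into one view."""
    results = []
    for resource in resources:
        results.extend(resource(query))
    return results

hits = federated_search("cat", [archive_a, archive_b])
```

    The user issues one query from a single interface; the per-archive protocols stay hidden behind the `resources` list.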

  4. GEOSS interoperability for Weather, Ocean and Water

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate the improvements in the accuracy of 1-day to 2 week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  5. Forcing Interoperability: An Intentionally Fractured Approach

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

    The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to provide users the ability to perform first-level analysis directly on our site. In order to accomplish this, the users should be free to modify the behavior of these tools as well as incorporate their own tools and analysis to meet their needs. Rather than building one monolithic system, we have chosen to build three semi-independent systems. One team is building a data discovery and web-based distribution system, the second is building an advanced analysis and workflow system, and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams, each with its own objectives, schedule and user interface. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability, because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to building interoperability as an afterthought, which is a tendency in monolithic systems. 
All three teams will take advantage of preexisting

  6. Visual Development Environment for Semantically Interoperable Smart Cities Applications

    Roukounaki , Aikaterini; Soldatos , John; Petrolo , Riccardo; Loscri , Valeria; Mitton , Nathalie; Serrano , Martin

    2015-01-01

    International audience; This paper presents an IoT architecture for the semantic interoperability of diverse IoT systems and applications in smart cities. The architecture virtualizes diverse IoT systems and ensures their modelling and representation according to common standards-based IoT ontologies. Furthermore, based on this architecture, the paper introduces a first-of-a-kind visual development environment which eases the development of semantically interoperable applications in smart cit...

  7. Interoperability, Enterprise Architectures, and IT Governance in Government

    Scholl , Hans ,; Kubicek , Herbert; Cimander , Ralf

    2011-01-01

    Part 4: Architecture, Security and Interoperability; International audience; Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  8. The Sentinel 4 focal plane subsystem

    Hohn, Rüdiger; Skegg, Michael P.; Hermsen, Markus; Hinger, Jürgen; Williges, Christian; Reulke, Ralf

    2017-09-01

    The Sentinel 4 instrument is an imaging spectrometer, developed by Airbus under ESA contract in the frame of the joint European Union (EU)/ESA COPERNICUS program with the objective of monitoring trace gas concentrations. Sentinel 4 will provide accurate measurements of key atmospheric constituents such as ozone, nitrogen dioxide, sulfur dioxide and formaldehyde, as well as aerosol and cloud properties. Sentinel 4 is unique in being the first geostationary UVN mission. The Sentinel 4 space segment will be integrated on EUMETSAT's Meteosat Third Generation Sounder satellite (MTG-S). Sentinel 4 will provide coverage of Europe and adjacent regions. The Sentinel 4 instrument comprises as a major element two Focal Plane Subsystems (FPS) covering the wavelength ranges 305 nm to 500 nm (UVVIS) and 750 nm to 775 nm (NIR), respectively. The paper describes the Focal Plane Subsystems, comprising the detectors, the optical bench and the control electronics. Furthermore, the design and development approach is presented, as well as first measurement results of the FPS Qualification Model.

  9. Evaluating the Organizational Interoperability Maturity Level in ICT Research Center

    Manijeh Haghighinasab

    2011-03-01

    Full Text Available Interoperability refers to the ability to provide services to, and to accept services from, other systems or devices. Collaborative enterprises face additional challenges to interoperate seamlessly within a networked organization. The major task here is to assess the maturity level of interoperating organizations. For this purpose, maturity models for the enterprise were reviewed based on vendors' reliability and advantages versus disadvantages. An interoperability maturity model was deduced from the ATHENA project, a European Integrated Project, in 2005; this model, named EIMM, was examined at the Iran Information and Communication Institute, a leading telecommunication organization. 115 questionnaires were distributed among the staff of four departments - Information Technology, Communication Technology, Security and Strategic Studies - regarding six areas of concern (Enterprise Modeling; Business Strategy Process; Organization and Competences; Products and Services; Systems and Technology; Legal Environment, Security and Trust) at five maturity levels: Performed, Modeled, Integrated, Interoperable and Optimizing. The findings showed different levels of maturity in this institute. To reach the Interoperable level, appropriate practices are proposed for promotion to the higher levels.

  10. Heterogeneous computing with OpenCL

    2013-01-01

    Heterogeneous Computing with OpenCL teaches OpenCL and parallel programming for complex systems that may include a variety of device architectures: multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units (APUs) such as AMD Fusion technology. Designed to work on multiple platforms and with wide industry support, OpenCL will help you more effectively program for a heterogeneous future. Written by leaders in the parallel computing and OpenCL communities, this book will give you hands-on OpenCL experience to address a range of fundamental parallel algorithms. The authors explore memory spaces, optimization techniques, graphics interoperability, extensions, and debugging and profiling. Intended to support a parallel programming course, Heterogeneous Computing with OpenCL includes detailed examples throughout, plus additional online exercises and other supporting materials.

  11. Regulatory barriers blocking standardization of interoperability.

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-07-12

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. The standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and Continua Health Alliance, are striving for this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers in the way of adopting technical standards throughout the industry. These barriers have significantly impaired the motivation of consumer device vendors who desire to enter the personal health market, and the overall success of the personal health industry ecosystem. In this paper, we present the effect that these barriers have on the health ecosystem. This requires immediate action from policy makers and other stakeholders. The current regulatory policy needs to be updated to reflect the reality and demands of the consumer health industry. Our hope is that this paper will draw wide consensus amongst its readers, policy makers, and other stakeholders.

  12. Interoperable Data Sharing for Diverse Scientific Disciplines

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO-level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  13. Recent ARC developments: Through modularity to interoperability

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing a modular Web Service (WS) approach in the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services were extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  14. Recent ARC developments: Through modularity to interoperability

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing a modular Web Service (WS) approach in the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services were extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  15. The advanced microgrid. Integration and interoperability

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United Staes); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology Smart Grid Interoperability Panel and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.
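    The connect/disconnect behaviour in the definition above can be caricatured as a two-state machine. The sketch below is a toy illustration only; the trigger logic and names are assumptions, not taken from the white paper:

```python
def next_mode(current_mode, grid_healthy):
    """Toy transition rule for a microgrid at the point of common coupling."""
    if not grid_healthy:
        return "island"          # disconnect and serve local loads autonomously
    return "grid-connected"      # (re)connect and act as a single controllable entity

# A fault on the utility side drives the microgrid into island mode;
# a healthy grid allows reconnection.
mode = next_mode("grid-connected", grid_healthy=False)
```

A real controller would add synchronization checks and ride-through timers before reconnecting; the two states themselves are the part the definition fixes.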

  16. AliEn - EDG Interoperability in ALICE

    Bagnasco, S; Buncic, P; Carminati, F; Cerello, P G; Saiz, P

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Storage Element and Computing Element), and act as interface nodes between the systems. An EDG Resource Broker is seen by the AliEn server as a single Computing Element, while the EDG storage is seen by AliEn as a single, large Storage Element; files produced in EDG sites are registered in both the EDG Replica Catalogue and in the AliEn Data Catalogue, thus ensuring accessibility from both worlds. In fact, both registrations are required: the AliEn one is used for the data management, the EDG one to guarantee the integrity and...

  17. Data interoperability software solution for emergency reaction in the Europe Union

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction. 
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency

  18. Principles of data integration and interoperability in the GEO Biodiversity Observation Network

    Saarenmaa, Hannu; Ó Tuama, Éamonn

    2010-05-01

    The goal of the Global Earth Observation System of Systems (GEOSS) is to link existing information systems into a global and flexible network to address nine areas of critical importance to society. One of these "societal benefit areas" is biodiversity and it will be supported by a GEOSS sub-system known as the GEO Biodiversity Observation Network (GEO BON). In planning the GEO BON, it was soon recognised that there are already a multitude of existing networks and initiatives in place worldwide. What has been lacking is a coordinated framework that allows for information sharing and exchange between the networks. Traversing the various scales of biodiversity, in particular from the individual and species levels to the ecosystem level, has long been a challenge. Furthermore, some of the major regions of the world have already taken steps to coordinate their efforts, but links between the regions have not been a priority until now. Linking biodiversity data to that of the other GEO societal benefit areas, in particular ecosystems, climate, and agriculture, to produce useful information for the UN Conventions and other policy-making bodies is another need that calls for integration of information. Integration and interoperability are therefore a major theme of GEO BON, and a "system of systems" is very much needed. There are several approaches to integration that need to be considered. Data integration requires harmonising concepts, agreeing on vocabularies, and building ontologies. Semantic mediation of data using these building blocks is still not easy to achieve. Agreements on, or mappings between, the metadata standards that will be used across the networks is a major requirement that will need to be addressed early on. With interoperable metadata, service integration will be possible through registry of registries systems such as GBIF's forthcoming GBDRS and the GEO Clearinghouse. Chaining various services that build intermediate products using workflow

  19. Double Shell Tank (DST) Process Waste Sampling Subsystem Definition Report

    RASMUSSEN, J.H.

    2000-01-01

    This report defines the Double-Shell Tank (DST) Process Waste Sampling Subsystem (PWSS). This subsystem definition report fully describes and identifies the system boundaries of the PWSS. This definition provides a basis for developing functional, performance, and test requirements (i.e., subsystem specification), as necessary, for the PWSS. The resultant PWSS specification will include the sampling requirements to support the transfer of waste from the DSTs to the Privatization Contractor during Phase 1 of Waste Feed Delivery

  20. Conditional density matrix: systems and subsystems in quantum mechanics

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    A new quantum mechanical notion - Conditional Density Matrix - is discussed and is applied to describe some physical processes. This notion is a natural generalization of von Neumann density matrix for such processes as divisions of quantum systems into subsystems and reunifications of subsystems into new joint systems. Conditional Density Matrix assigns a quantum state to a subsystem of a composite system on condition that another part of the composite system is in some pure state
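    The construction described above lends itself to a compact numerical sketch. In the example below, the function name, basis ordering and NumPy implementation are illustrative assumptions, not taken from the paper; conditioning a two-qubit Bell state on finding the second subsystem in |0⟩ leaves the first subsystem in the pure state |0⟩⟨0|:

```python
import numpy as np

def conditional_density_matrix(rho_ab, phi, dim_a, dim_b):
    """Conditional state of subsystem A, given subsystem B is found in pure state |phi>."""
    # View the joint density matrix with explicit (A, B, A', B') indices.
    rho = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    # <phi| rho_AB |phi>: contract both B indices against the conditioning state.
    m = np.einsum('j,ijkl,l->ik', phi.conj(), rho, phi)
    # Renormalize so the conditional state has unit trace.
    return m / np.trace(m)

# Bell state (|00> + |11>)/sqrt(2) in the product basis |00>, |01>, |10>, |11>.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())

# Condition on subsystem B being found in |0>.
rho_a = conditional_density_matrix(rho_ab, np.array([1.0, 0.0]), 2, 2)
```

For the unconditioned description of subsystem A one would instead take the partial trace over B (the ordinary von Neumann reduced density matrix); the conditional matrix coincides with it only in special cases.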

  1. IOOS modeling subsystem: vision and implementation strategy

    Rosenfeld, Leslie; Chao, Yi; Signell, Richard P.

    2012-01-01

    Numerical modeling is vital to achieving the U.S. IOOS® goals of predicting, understanding and adapting to change in the ocean and Great Lakes. In the next decade IOOS should cultivate a holistic approach to coastal ocean prediction, and encourage more balanced investment among the observing, modeling and information management subsystems. We believe the vision of a prediction framework driven by observations, and leveraging advanced technology and understanding of the ocean and Great Lakes, would lead to a new era for IOOS that would not only produce more powerful information, but would also capture broad community support, particularly from the general public, thus allowing IOOS to develop into the comprehensive information system that was envisioned at the outset.

  2. Response spectrum analysis for multi-supported subsystems

    Reed, J.W.

    1983-01-01

    A methodology was developed to analyze multi-supported subsystems (e.g., piping systems) for seismic or other dynamic forces using response spectrum input. Currently, subsystems which are supported at more than one location in a nuclear power plant building are analyzed either by the time-history method or by response spectrum procedures, where spectra which envelop all support locations are used. The former procedure is exceedingly expensive, while the latter procedure is inexpensive but very conservative. Improved analysis procedures, based on either coupled- or uncoupled-system approaches, are currently being developed. For the coupled-system approach, response feedback between the subsystem and building system is included. For the uncoupled-system approach, feedback is neglected; however, either time history or response spectrum methods can be used. The methodology developed for analyzing multi-supported subsystems is based on the assumption that the building response and the subsystem response are uncoupled. This is the same assumption implicitly made by analysts who design singly-supported subsystems using floor response spectrum input. This approach implies that there is no response feedback between the primary building system and the subsystem, which is generally found to be conservative. The methodology developed for multi-supported subsystems makes this same assumption and thus should produce results with the same ease and degree of accuracy as results obtained for singly-supported subsystems. (orig./HP)
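    The uncoupled, response-spectrum approach can be sketched in a few lines. Below, peak responses of a hypothetical two-mode subsystem are formed from spectral accelerations and combined by square-root-of-sum-of-squares (SRSS); all numbers, names and the SRSS combination rule are illustrative assumptions, not the paper's specific methodology:

```python
import math

def peak_modal_response(gamma, phi, sa, omega):
    """Peak modal displacement: participation factor * mode shape * Sa / omega^2."""
    return gamma * phi * sa / omega ** 2

def srss(responses):
    """Square-root-of-sum-of-squares combination of peak modal responses."""
    return math.sqrt(sum(r * r for r in responses))

# Hypothetical two-mode subsystem excited through a floor response spectrum:
# spectral accelerations sa (m/s^2) read at each modal frequency omega (rad/s).
modes = [
    peak_modal_response(gamma=1.3, phi=1.0, sa=5.0, omega=20.0),  # mode 1
    peak_modal_response(gamma=0.4, phi=0.8, sa=9.0, omega=60.0),  # mode 2
]
total = srss(modes)
```

Multi-support excitation adds a pseudo-static term from differential support motion on top of the dynamic part; the point of the sketch is only the uncoupled, spectrum-driven structure of the calculation.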

  3. Automatic control of a primary electric thrust subsystem

    Macie, T. W.; Macmedan, M. L.

    1975-01-01

    A concept for automatic control of the thrust subsystem has been developed by JPL and participating NASA Centers. This paper reports on progress in implementing the concept at JPL. Control of the Thrust Subsystem (TSS) is performed by the spacecraft computer command subsystem, and telemetry data is extracted by the spacecraft flight data subsystem. The Data and Control Interface Unit, an element of the TSS, provides the interface with the individual elements of the TSS. The control philosophy and implementation guidelines are presented. Control requirements are listed, and the control mechanism, including the serial digital data intercommunication system, is outlined. The paper summarizes progress to Fall 1974.

  4. Plant development, auxin, and the subsystem incompleteness theorem.

    Niklas, Karl J; Kutschera, Ulrich

    2012-01-01

    Plant morphogenesis (the process whereby form develops) requires signal cross-talking among all levels of organization to coordinate the operation of metabolic and genomic subsystems operating in a larger network of subsystems. Each subsystem can be rendered as a logic circuit supervising the operation of one or more signal-activated systems. This approach simplifies complex morphogenetic phenomena and allows for their aggregation into diagrams of progressively larger networks. This technique is illustrated here by rendering two logic circuits and signal-activated subsystems, one for auxin (IAA) polar/lateral intercellular transport and another for IAA-mediated cell wall loosening. For each of these phenomena, a circuit/subsystem diagram highlights missing components (either in the logic circuit or in the subsystem it supervises) that must be identified experimentally if each of these basic plant phenomena is to be fully understood. We also illustrate the "subsystem incompleteness theorem," which states that no subsystem is operationally self-sufficient. Indeed, a whole-organism perspective is required to understand even the most simple morphogenetic process, because, when isolated, every biological signal-activated subsystem is morphogenetically ineffective.

  5. MARIAN: Flexible Interoperability for Federated Digital Libraries

    Goncalves, Marcos A.; France, Robert K.; Fox, Edward A.; Hilf, Eberhard R.; Zimmermann, Kerstin; Severiens, Thomas

    2001-01-01

    Federated digital libraries are composed of distributed autonomous (heterogeneous) information services but provide users with a transparent, integrated view of collected information respecting different information sources' autonomy. In this paper we discuss a federated system for the Networked Digital Library of Theses and Dissertations (NDLTD), an international consortium of universities, libraries, and other supporting institutions focused on electronic theses and dissertations (ETDs). Th...

  6. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  7. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
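    A uniform tasking request under such a web service interface might look like the sketch below. The entity and property names (Task, TaskingCapability, taskingParameters, @iot.id) follow the OGC SensorThings API tasking model; the capability id, parameter name and endpoint are hypothetical, and the sketch is not the paper's exact protocol:

```python
import json

def build_task(capability_id, parameters):
    """Assemble a Task entity payload to POST to {base_url}/Tasks."""
    return {
        "taskingParameters": parameters,
        "TaskingCapability": {"@iot.id": capability_id},
    }

# Hypothetical example: switch on a lamp exposed as tasking capability 42.
payload = build_task(42, {"switch": "on"})
body = json.dumps(payload)
# The same payload shape would control any device, regardless of its
# manufacturer-specific protocol, e.g.:
# requests.post(base_url + "/Tasks", data=body,
#               headers={"Content-Type": "application/json"})
```

The uniformity is the point: the device-specific details are confined to the published tasking parameters, not to the request structure.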

  8. Model and Interoperability using Meta Data Annotations

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and

  9. An Open Source Tool to Test Interoperability

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time to develop new software is reduced. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open-source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. 
This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC Testing web site and
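As a rough illustration of the kind of check such a test suite applies, the sketch below (plain Python with hypothetical element names; the real TEAM Engine executes CTL/TestNG scripts, not this code) runs Schematron-style assertions against a canned capabilities response:

```python
import xml.etree.ElementTree as ET

# A canned WFS-like capabilities response; in a real test this would be
# fetched over HTTP from the service under test.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<WFS_Capabilities version="1.0.0">
  <Service>
    <Name>WFS</Name>
    <Title>Demo server</Title>
  </Service>
</WFS_Capabilities>"""

def run_assertions(xml_text):
    """Apply Schematron-style checks; return a list of failure messages."""
    failures = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return ["response is not well-formed XML: %s" % exc]
    if root.tag != "WFS_Capabilities":
        failures.append("root element must be WFS_Capabilities")
    if root.get("version") != "1.0.0":
        failures.append("version attribute must be 1.0.0")
    if root.findtext("Service/Name") != "WFS":
        failures.append("Service/Name must be 'WFS'")
    return failures

print(run_assertions(SAMPLE_RESPONSE))  # → []
```

An empty failure list means the (canned) response passes; a malformed or non-conformant response collects one message per violated assertion, much like the per-assertion reporting described above.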

  10. Heterogeneous reactors

    Moura Neto, C. de; Nair, R.P.K.

    1979-08-01

    The microscopic study of a cell is meant for the determination of the infinite multiplication factor of the cell, which is given by the four-factor formula: k(infinity) = η·ε·p·f. The analysis of a homogeneous reactor is similar to that of a heterogeneous reactor, but each factor of the four-factor formula cannot be calculated by the formulas developed in the case of a homogeneous reactor. A great number of methods has been developed for the calculation of heterogeneous reactors, and some of them are discussed. (Author) [pt
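The four-factor formula is simple enough to evaluate directly; a minimal sketch with illustrative factor values (chosen as a plausible order of magnitude, not taken from the report):

```python
def k_infinity(eta, epsilon, p, f):
    """Four-factor formula: k_inf = eta * epsilon * p * f, where
    eta     = reproduction factor,
    epsilon = fast fission factor,
    p       = resonance escape probability,
    f       = thermal utilisation factor."""
    return eta * epsilon * p * f

# Illustrative values only (hypothetical lattice, not from the paper):
k = k_infinity(eta=1.31, epsilon=1.03, p=0.88, f=0.90)
print(round(k, 3))  # → 1.069
```

A value above 1.0 indicates a supercritical infinite lattice; the heterogeneous methods the report surveys differ precisely in how each of the four factors is computed for a lattice cell.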

  11. tmBioC: improving interoperability of text-mining tools with BioC.

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

    The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial effort and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC-wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code needed for text-mining tool integration by more than 60%. 
The tmBioC toolkit
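To illustrate the idea of a shared XML interchange layer, here is a minimal round-trip sketch (a simplified element layout loosely modeled on BioC's collection/document/passage/annotation nesting; not the official BioC library or schema):

```python
import xml.etree.ElementTree as ET

def write_collection(docs):
    """Serialize {doc_id: [(annotation_type, text), ...]} into a
    simplified BioC-style XML string."""
    collection = ET.Element("collection")
    for doc_id, annotations in docs.items():
        document = ET.SubElement(collection, "document")
        ET.SubElement(document, "id").text = doc_id
        passage = ET.SubElement(document, "passage")
        for i, (ann_type, text) in enumerate(annotations):
            ann = ET.SubElement(passage, "annotation", id=str(i))
            ET.SubElement(ann, "infon", key="type").text = ann_type
            ET.SubElement(ann, "text").text = text
    return ET.tostring(collection, encoding="unicode")

def read_collection(xml_text):
    """Parse the same layout back into the dictionary form."""
    root = ET.fromstring(xml_text)
    docs = {}
    for document in root.findall("document"):
        docs[document.findtext("id")] = [
            (ann.findtext("infon"), ann.findtext("text"))
            for ann in document.findall("passage/annotation")
        ]
    return docs

sample = {"PMC1234": [("Gene", "BRCA1"), ("Disease", "breast cancer")]}
assert read_collection(write_collection(sample)) == sample
```

The point of such a common layer is exactly what the abstract reports: once every tool reads and writes the one shared format, pipeline glue code shrinks to format-agnostic calls like these.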

  12. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT/eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse the systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  13. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme’s information services’ (LABT/eLABa) and e-learning services’ (LieDM) sub-programmes. The whole education institution recommendations are designed for the maintenance and development of the LVU programme’s management services’ (LieMSIS) system. Design/methodology/approach – methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of large-scale EU-funded interoperability projects’ deliverables. System analysis and comparative analysis methods were used in order to formulate and analyse the systems’ interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarize the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries’ (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities’ management system (LieMSIS). Research limitations/implications – the paper

  14. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting the evolution of ICT, ad hoc telemedicine solutions have been proposed in the past. Integrating such ad hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open-source interoperability component can be adopted, which is ontology-driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.

  15. The dynamic information architecture system : a simulation framework to provide interoperability for process models

    Hummel, J. R.; Christiansen, J. H.

    2002-01-01

    As modeling and simulation becomes a more important part of the day-to-day activities in industry and government, organizations are being faced with the vexing problem of how to integrate a growing suite of heterogeneous models both within their own organizations and between organizations. The Argonne National Laboratory, which is operated by the University of Chicago for the United States Department of Energy, has developed the Dynamic Information Architecture System (DIAS) to address such problems. DIAS is an object-oriented, subject domain independent framework that is used to integrate legacy or custom-built models and applications. In this paper we will give an overview of the features of DIAS and give examples of how it has been used to integrate models in a number of applications. We shall also describe some of the key supporting DIAS tools that provide seamless interoperability between models and applications
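The integration idea behind such frameworks can be sketched with the classic adapter pattern: legacy models with incompatible calling conventions are wrapped behind one common interface that the framework drives (a minimal Python sketch with entirely hypothetical model classes; DIAS itself is a full object-oriented framework, not this code):

```python
class LegacyRainfallModel:
    """Stand-in for a legacy model with its own calling convention."""
    def run_step(self, mm_per_hour):
        return mm_per_hour * 0.8  # crude runoff fraction

class LegacyRiverModel:
    """Another legacy model with a different interface."""
    def advance(self, inflow):
        return {"stage_m": inflow / 10.0}

class ModelAdapter:
    """Common 'step' interface the framework drives; wraps any legacy model."""
    def __init__(self, model, call):
        self._model, self._call = model, call
    def step(self, value):
        return self._call(self._model, value)

# The framework now sees one uniform interface for both legacy codes.
pipeline = [
    ModelAdapter(LegacyRainfallModel(), lambda m, v: m.run_step(v)),
    ModelAdapter(LegacyRiverModel(), lambda m, v: m.advance(v)["stage_m"]),
]

value = 25.0  # mm/h of rain
for adapter in pipeline:
    value = adapter.step(value)
print(value)  # 25.0 mm/h -> 20.0 inflow -> 2.0 m stage
```

Wrapping rather than rewriting is what lets a subject-domain-independent framework chain custom-built and legacy models without modifying their internals.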

  16. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained some momentum in various engineering domains to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse on an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  17. Double Shell Tank (DST) Monitor and Control Subsystem Definition Report

    BAFUS, R.R.

    2000-01-01

    The system description of the Double-Shell Tank (DST) Monitor and Control Subsystem establishes the system boundaries and describes the interface of the DST Monitor and Control Subsystem with new and existing systems that are required to accomplish the Waste Feed Delivery (WFD) mission

  18. Subsystem cost data for the tritium systems test assembly

    Bartlit, J.R.; Anderson, J.L.; Rexroth, V.G.

    1983-01-01

    Details of subsystem costs are among the questions most frequently asked about the $14.4 million Tritium Systems Test Assembly (TSTA) at Los Alamos National Laboratory. This paper presents a breakdown of cost components for each of the 20 major subsystems of TSTA. Also included are details to aid in adjusting the costs to other years, contracting conditions, or system sizes

  19. Does Normal Processing Provide Evidence of Specialised Semantic Subsystems?

    Shapiro, Laura R.; Olson, Andrew C.

    2005-01-01

    Category-specific disorders are frequently explained by suggesting that living and non-living things are processed in separate subsystems (e.g. Caramazza & Shelton, 1998). If subsystems exist, there should be benefits for normal processing, beyond the influence of structural similarity. However, no previous study has separated the relative…

  20. Interoperable Cloud Networking for intelligent power supply; Interoperables Cloud Networking fuer intelligente Energieversorgung

    Hardin, Dave [Invensys Operations Management, Foxboro, MA (United States)

    2010-09-15

    Intelligent power supply by a so-called Smart Grid will make it possible to control consumption by market-based pricing and signals for load reduction. This necessitates that both the energy rates and the energy information be distributed reliably and in real time to automation systems in domestic and other buildings and in industrial plants, over a wide geographic range and across the most varied grid infrastructures. Effective communication at this level of complexity necessitates computer and grid resources that are normally only available in the computer centers of big industries. The cloud computing technology, which is described here in some detail, has all the features needed to provide reliability, interoperability and efficiency for large-scale smart grid applications, at lower cost than traditional computer centers. (orig.)

  1. Landscape of the EU-US Research Infrastructures and actors: Moving towards international interoperability of earth system data

    Asmi, Ari; Powers, Lindsay

    2015-04-01

    Research Infrastructures (RIs) are major long-term investments supporting innovative, bottom-up research activities. In environmental research, they range from high-atmosphere radars to field observation networks and coordinated laboratory facilities. The Earth system is highly interactive, and each part of the system is interconnected across spatial and disciplinary borders. However, due to practical and historical reasons, the RIs are built from disciplinary points of view and separately in different parts of the world, with differing standards, policies, methods and research cultures. This heterogeneity provides the necessary diversity to study the complex Earth system, but makes cross-disciplinary and/or global interoperability a challenge. Global actions towards better interoperability are surfacing, especially between the EU and US. For example, recent mandates within the US government prioritize open data for federal agencies and federally funded science, and encourage collaboration among agencies to reduce duplication of efforts and increase efficient use of resources. There are several existing initiatives working toward these goals (e.g., COOPEUS, EarthCube, RDA, ICSU-WDS, DataONE, ESIP, USGEO, GEO). However, there is no cohesive framework to coordinate efforts among these and other entities. COOPEUS and EarthCube have now begun to map the landscape of interoperability efforts across earth science domains. The COOPEUS mapping effort describes the EU and US landscape of environmental research infrastructures to accomplish the following: identify gaps in services (data provision) necessary to address societal priorities; provide guidance for the development of future research infrastructures; and identify opportunities for Research Infrastructures (RIs) to collaborate on issues of common interest. The EarthCube mapping effort identifies opportunities to engage a broader community by identifying scientific domain organizations and entities. 
We present the current situation

  2. Simulating the Various Subsystems of a Coal Mine

    V. Okolnishnikov

    2016-06-01

    Full Text Available A set of simulation models of various subsystems of a coal mine was developed with the help of a new visual interactive simulation system for technological processes. This paper contains a brief description of this simulation system and its possibilities. The main possibilities provided by the simulation system are: the quick construction of models from library elements, 3D representation, and the communication of models with actual control systems. Simulation models were developed for various subsystems of a coal mine: underground conveyor network subsystems, pumping subsystems and coal face subsystems. They were developed with the goal of being used as a quality and reliability assurance tool for new process control systems in coal mining.

  3. Metadata behind the Interoperability of Wireless Sensor Networks

    Miguel Angel Manso Callejo

    2009-05-01

    Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which, in turn, leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as the sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
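A minimal sketch of how contextualising rules might map a metadata snapshot onto the four context types named in the abstract (all field names, thresholds, and status labels below are hypothetical, not taken from the paper):

```python
# Each rule inspects metadata for one context type and reports a status.
def sensing_rule(meta):
    return "degraded" if meta["sampling_rate_hz"] < 1.0 else "ok"

def node_rule(meta):
    return "degraded" if meta["battery_pct"] < 20 else "ok"

def network_rule(meta):
    return "degraded" if meta["packet_loss_pct"] > 10 else "ok"

def organisational_rule(meta):
    return "degraded" if not meta["calibration_valid"] else "ok"

RULES = {
    "sensing": sensing_rule,
    "node": node_rule,
    "network": network_rule,
    "organisational": organisational_rule,
}

def infer_contexts(metadata):
    """Apply every contextualising rule to one WSN metadata snapshot."""
    return {name: rule(metadata) for name, rule in RULES.items()}

snapshot = {"sampling_rate_hz": 5.0, "battery_pct": 12,
            "packet_loss_pct": 3, "calibration_valid": True}
print(infer_contexts(snapshot))
# here only the node context flags 'degraded' (low battery)
```

A decision-making layer would then watch the per-context statuses over time and trigger whatever adaptation keeps the network interoperable.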

  4. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
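The profiling idea can be sketched as matching a requirement profile against a software unit's capability profile (a simplified illustration; the field names below are hypothetical, not the ISO16100 profile schema):

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityProfile:
    """Simplified capability profile: a unit name plus the set of
    manufacturing functions the software unit supports."""
    unit_name: str
    functions: set = field(default_factory=set)

    def satisfies(self, required):
        """A unit matches a requirement profile if it covers every
        required manufacturing function."""
        return required.functions <= self.functions

# A unit advertising two functions, and an application requiring one:
scheduler = CapabilityProfile("SchedulerX", {"scheduling", "dispatching"})
requirement = CapabilityProfile("ShopFloorApp", {"scheduling"})

print(scheduler.satisfies(requirement))  # True
```

Expressing both sides as profiles lets an integrator select interoperable units by set comparison instead of reading vendor documentation.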

  5. An Architecture for Semantically Interoperable Electronic Health Records.

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; therefore, semantic interoperability cannot be assumed or treated as a given. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Nonetheless, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  6. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcome past communication barriers, and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them, are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.

  7. Interoperable and standard e-Health solution over Bluetooth.

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution. This includes the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.

  8. Presence in the IP Multimedia Subsystem

    Ling Lin

    2007-01-01

    Full Text Available With an ever increasing penetration of Internet Protocol (IP) technologies, the wireless industry is evolving the mobile core network towards an all-IP network. The IP Multimedia Subsystem (IMS) is a standardised Next Generation Network (NGN) architectural framework defined by the 3rd Generation Partnership Project (3GPP) to bridge the gap between circuit-switched and packet-switched networks and consolidate both sides into one single all-IP network for all services. In this paper, we provide an insight into the limitations of the presence service, one of the fundamental building blocks of the IMS. Our prototype-based study is unique of its kind and helps identify the factors which limit the scalability of the current version of the presence service (3GPP TS 23.141 version 7.2.0 Release 7 [1]), which will in turn dramatically limit the performance of advanced IMS services. We argue that the client-server paradigm behind the current IMS architecture does not suit the requirements of the IMS system, which defies the very purpose of its introduction. We finally elaborate on possible avenues for addressing this problem.

  9. The CALIPSO Integrated Thermal Control Subsystem

    Gasbarre, Joseph F.; Ousley, Wes; Valentini, Marc; Thomas, Jason; Dejoie, Joel

    2007-01-01

    The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) is a joint NASA-CNES mission to study the Earth's cloud and aerosol layers. The satellite is composed of a primary payload (built by Ball Aerospace) and a spacecraft platform bus (PROTEUS, built by Alcatel Alenia Space). The thermal control subsystem (TCS) for the CALIPSO satellite is a passive design utilizing radiators, multi-layer insulation (MLI) blankets, and both operational and survival surface heaters. The most temperature-sensitive component within the satellite is the laser system. During thermal vacuum testing of the integrated satellite, the laser system's operational heaters were found to be inadequate for maintaining the laser's required set point. In response, a solution utilizing the laser system's survival heaters to augment the operational heaters was developed in collaboration between NASA, CNES, Ball Aerospace, and Alcatel Alenia. The CALIPSO satellite launched from Vandenberg Air Force Base in California on April 26th, 2006. Evaluation of both the platform and payload thermal control systems shows they are performing as expected and maintaining the critical elements of the satellite within acceptable limits.

  10. Rosary as the ethnoreligious marker of the actional subsystem of significative field of catholicism

    V. I. Kryachko

    2015-02-01

    The socioevaluative, autointentional, identificative attempts to explicate some ethnoreligious crosscorrelations between different significative structural fields in sociospace are based on the author’s Model of the structure of the significative field of Catholicism, which consists of the following 12 basic significative structural subsystems: (1) anthropomorphic significative subsystem, which includes human-similar (manlike and personificated) symbolic constructions, monuments and architectural ensembles, as well as symbols of human body parts, their combinations and signals; (2) zoomorphic significative subsystem, which includes animal-similar significative constructions and signs of their separate body parts, as well as symbols of their life products; (3) vegetomorphic significative subsystem, which includes plant-similar significative elements and food products; (4) geomorphic significative subsystem; (5) geometric significative subsystem; (6) astral-referent significative subsystem; (7) coloristic significative subsystem; (8) topos-instalative significative subsystem; (9) objective-instrumental significative subsystem; (10) architectural exterior-interior significative subsystem; (11) abstractive significative subsystem; (12) actional significative subsystem.

  11. Modeling and simulation of a 100 kWe HT-PEMFC subsystem integrated with an absorption chiller subsystem

    Arsalis, Alexandros

    2012-01-01

    A 100 kWe liquid-cooled HT-PEMFC subsystem is integrated with an absorption chiller subsystem to provide electricity and cooling. The system is designed, modeled and simulated to investigate the potential of this technology for future novel energy system applications. Liquid-cooling can provide...

  12. Interoperable eHealth Platform for Personalized Smart Services

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  13. Interoperable Archetypes With a Three Folded Terminology Governance.

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

    The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with a three-fold benefit for interoperability.

  14. Interoperation of World-Wide Production e-Science Infrastructures

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  15. Improved semantic interoperability for content reuse through knowledge organization systems

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources increase, the lack of KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, both in their creation and in their management. The reuse of similar organizational structures is a necessary element in this context. The paper analyses experiences of KOS reuse and points out how the new standards bear on this aspect.

  16. Requirements for and barriers towards interoperable ehealth technology in primary care

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  17. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  18. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  19. Heterogeneous Gossip

    Frey, Davide; Guerraoui, Rachid; Kermarrec, Anne-Marie; Koldehofe, Boris; Mogensen, Martin; Monod, Maxime; Quéma, Vivien

    Gossip-based information dissemination protocols are considered easy to deploy, scalable and resilient to network dynamics. Load-balancing is inherent in these protocols as the dissemination work is evenly spread among all nodes. Yet, large-scale distributed systems are usually heterogeneous with respect to network capabilities such as bandwidth. In practice, a blind load-balancing strategy might significantly hamper the performance of the gossip dissemination.

  20. Special topic interoperability and EHR: Combining openEHR, SNOMED, IHE, and continua as approaches to interoperability on national ehealth

    Bestek, M.; Stanimirovi, D.

    2017-01-01

    into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. Methods: The paper represents an in-depth analysis regarding...... the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research...... could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field...

  1. Architectures for the Development of the National Interoperability Framework in Romania

    Codrin-Florentin NISIOIU

    2015-10-01

    Full Text Available The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that effective interoperability between IT products and services is needed to build a truly Digital Society. The Digital Agenda can only be effective if all of its elements and applications are interoperable and based on open standards and platforms. In this context, this article proposes a specific architecture for developing the Romanian National Interoperability Framework.

  2. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    2017-10-01

    Award Number W81XWH-09-1-0705, "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership", reporting period Sept 2016 – 20 Sept 2017. The project promotes efficiency through interoperable medical technologies and played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint...)

  3. Subsystem response analysis for the Seismic Safety Margins Research Program

    Chuang, T.Y.

    1981-01-01

    A review of the state of the art in seismic qualification methods for subsystems has been completed. This task assesses the accuracy of seismic analysis techniques in predicting dynamic response, and also identifies and quantifies sources of random and modeling uncertainty in subsystem response determination. The subsystems are classified into two categories according to the nature of their support: multiply supported subsystems (e.g., piping systems) and singly supported subsystems (e.g., pumps, turbines, electrical control panels, etc.). The multiply supported piping systems are analyzed by the multisupport input time history method, with the responses of major structures serving as the input motions. The dynamic models of the subsystems identified by the event/fault tree are created. The responses calculated by the multisupport input time history method are consistent with the fragility parameters and are also coordinated with the event/fault tree description. The subsystem responses are then evaluated against the fragility curves of components and systems and incorporated in the event/fault tree analysis. (orig./HP)
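The final step described, evaluating a computed subsystem response against a component fragility curve, can be sketched with the lognormal fragility model commonly used in seismic probabilistic risk assessment. The demand and capacity numbers below are invented for illustration and are not from the report.

```python
from math import log, sqrt, erf

def lognormal_fragility(demand, median_capacity, beta):
    """P(failure | demand) for a lognormal fragility curve with
    median capacity Am and composite log-standard-deviation beta."""
    z = log(demand / median_capacity) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF of z

# Illustrative numbers: a piping response of 0.8 g peak acceleration
# evaluated against a component with median capacity 2.0 g, beta = 0.5.
p_fail = lognormal_fragility(0.8, 2.0, 0.5)
print(f"conditional failure probability: {p_fail:.3f}")
```

By construction the model returns 0.5 when the demand equals the median capacity, and the conditional failure probabilities computed this way are what get folded into the event/fault tree analysis.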

  4. Pemanfaatan Google API Untuk Model Interoperability Web Berbasis PHP Dengan Google Drive

    Sumiari, Ni Kadek

    2015-01-01

    Achieving interoperability is very important for a website-based system. The use of databases such as MySQL, SQL Server or Oracle is very common in website-based systems. However, using such a database cannot by itself guarantee that the interoperability of the system is achieved. Apart from data security concerns, the system is also quite difficult to implement. One solution for achieving interoperability in a website-based system is...

  5. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  6. Datacube Interoperability, Encoding Independence, and Analytics

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model of statistics and OLAP, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally fall under the concept of coverages, which encompasses regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable yet format-independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as its Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both the OGC and INSPIRE Reference Implementation. In the global EarthServer initiative, rasdaman database sizes exceed 250 TB today, heading for the Petabyte frontier in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are the values?), a range set (what are these values?), and a range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled
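The domain set / range set / range type split that CIS describes can be illustrated with a minimal, non-normative sketch; the class and field names below are ours for illustration, not the CIS schema itself.

```python
from dataclasses import dataclass, field

@dataclass
class Coverage:
    """Toy coverage-style datacube: where the values are (domain set),
    what they are (range set), what they mean (range type), plus a
    metadata "bag"."""
    domain_set: list            # grid coordinates, e.g. (row, col, time) tuples
    range_set: list             # the values at those coordinates
    range_type: dict            # semantics/units of the values
    metadata: dict = field(default_factory=dict)

    def value_at(self, coord):
        return self.range_set[self.domain_set.index(coord)]

cov = Coverage(
    domain_set=[(0, 0, "2017-01"), (0, 1, "2017-01")],
    range_set=[281.2, 280.7],
    range_type={"quantity": "air_temperature", "uom": "K"},
    metadata={"source": "toy example"},
)
print(cov.value_at((0, 1, "2017-01")))
```

Separating "where", "what", and "what it means" is precisely what lets the same abstract cube be encoded in different formats while remaining interoperable.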

  7. Shuttle Orbiter Active Thermal Control Subsystem design and flight experience

    Bond, Timothy A.; Metcalf, Jordan L.; Asuncion, Carmelo

    1991-01-01

    The paper examines the design of the Space Shuttle Orbiter Active Thermal Control Subsystem (ATCS), which provides vehicle and payload cooling during all phases of a mission and during ground turnaround operations. The operation of the Shuttle ATCS and some of the problems encountered during the first 39 flights of the Shuttle program are described, with special attention given to the major problems encountered with the degradation of the Freon flow rate on the Orbiter Columbia, the Flash Evaporator Subsystem mission anomalies which occurred on STS-26 and STS-34, and problems encountered with the Ammonia Boiler Subsystem. The causes and resolutions of these problems are discussed.

  8. Subsystem response review. Seismic safety margins research program

    Kennedy, R.P.; Campbell, R.D.; Wesley, D.A.; Kamil, H.; Gantayat, A.; Vasudevan, R.

    1981-07-01

    A study was conducted to document the state of the art in seismic qualification of nuclear power plant components and subsystems by analysis and testing and to identify the sources and magnitude of the uncertainties associated with analysis and testing methods. The uncertainties are defined in probabilistic terms for use in probabilistic seismic risk studies. Recommendations are made for the most appropriate subsystem response analysis methods to minimize response uncertainties. Additional studies, to further quantify testing uncertainties, are identified. Although the general effect of non-linearities on subsystem response is discussed, recommendations and conclusions are based principally on linear elastic analysis and testing models. (author)

  9. Waveform Diversity and Design for Interoperating Radar Systems

    2013-01-01

    Università di Pisa, Dipartimento di Ingegneria dell'Informazione: Elettronica, Informatica, Telecomunicazioni, Via Girolamo Caruso 16, 56122 Pisa, Italy.

  10. Managing Uncertainty: The Road Towards Better Data Interoperability

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone´s or any organization´s everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  11. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  12. Look who's talking. A guide to interoperability groups and resources.

    2011-06-01

    There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.

  13. The Role of Markup for Enabling Interoperability in Health Informatics

    Steve eMckeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years have seen the rise of very cheap digital storage, both on and off site. With the advent of the 'Internet of Things', people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  14. A development framework for semantically interoperable health information systems.

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  15. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  16. Information and documentation - Thesauri and interoperability with other vocabularies

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations for the est...

  17. Design of large-scale enterprise interoperable value webs

    Hofman, W.J.

    2011-01-01

    Still a lot of enterprises are faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  18. Ontologies for interaction : enabling serendipitous interoperability in smart environments

    Niezen, G.

    2012-01-01

    The thesis describes the design and development of an ontology and software framework to support user interaction in ubiquitous computing scenarios. The key goal of ubiquitous computing is "serendipitous interoperability", where devices that were not necessarily designed to work together should be

  19. The next generation of interoperability agents in healthcare.

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  20. Enterprise interoperability with SOA: a survey of service composition approaches

    Mantovaneli Pessoa, Rodrigo; Goncalves da Silva, Eduardo; van Sinderen, Marten J.; Quartel, Dick; Ferreira Pires, Luis

    Service-oriented architecture (SOA) claims to facilitate the construction of flexible and loosely coupled business applications, and therefore is seen as an enabling factor for enterprise interoperability. The concept of service, which is central to SOA, is very convenient to address the matching of

  1. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  2. The role of markup for enabling interoperability in health informatics.

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  3. The MADE reference information model for interoperable pervasive telemedicine systems

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  4. The Next Generation of Interoperability Agents in Healthcare

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  5. 47 CFR 0.192 - Emergency Response Interoperability Center.

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Emergency Response Interoperability Center. 0.192 Section 0.192 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMISSION ORGANIZATION..., industry representatives, and service providers. [75 FR 28207, May 20, 2010] ...

  6. ngVLA Cryogenic Subsystem Concept

    Wootten, Al; Urbain, Denis; Grammer, Wes; Durand, S.

    2018-01-01

    The VLA’s success over 35 years of operations stems in part from components that have been dramatically upgraded over the years. The time has come to build a new array to lead radio astronomical science into its next 40 years. To accomplish that, a next generation VLA (ngVLA) is envisioned to have 214 antennas with diameters of 18 m. The core of the array will be centered at the current VLA location, but the arms will extend out to 1000 km. The VLA cryogenic subsystem equipment and technology have remained virtually unchanged since the early 1980s. While adequate for a 27-antenna array, scaling the current system to an array of 214 antennas would be prohibitively expensive in terms of operating cost and maintenance. The overall goal is to limit operating cost to within three times the current level, despite having 8 times the number of antennas. To help realize this goal, broadband receivers and compact feeds will be utilized to reduce both the size and number of cryostats required. The current baseline front-end concept calls for just two moderately sized cryostats for the entire 1.2-116 GHz frequency range, as opposed to 8 in the VLA. For the ngVLA cryogenics, our objective is a well-optimized and efficient system that uses state-of-the-art technology to minimize per-antenna power consumption and maximize reliability. Application of modern technologies, such as variable-speed operation of the scroll compressors and cryocooler motor drives, allows the cooling capacity of the system to be dynamically matched to the thermal loading in each cryostat. Significantly, power savings may be realized while the maintenance interval of the cryocoolers is also extended. Finally, a receiver designed to minimize thermal loading can produce savings that directly translate to lower operating cost when variable-speed drives are used. Multi-layer insulation (MLI) on radiation shields and improved IR filters on feed windows can significantly reduce heat loading. Measurements done on existing cryogenic

  7. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    Tran, Tam

    2000-01-01

    .... This thesis researches existing open standards solutions to the distributed component integration problem and proposes an application framework that supports application wrappers and a uniform...

  8. IF-MANET: Interoperable framework for heterogeneous mobile ad hoc networks

    Hassan, H.

    2015-01-01

    The advances in low power micro-processors, wireless networks and embedded systems have raised the need to utilize the significant resources of mobile devices. These devices for example, smart phones, tablets, laptops, wearables, and sensors are gaining enormous processing power, storage capacity and wireless bandwidth. In addition, the advancement in wireless mobile technology has created a new communication paradigm via which a wireless network can be created without any priori infrastructu...

  9. Development status of a preprototype water electrolysis subsystem

    Martin, R. B.; Erickson, A. C.

    1981-01-01

    A preprototype water electrolysis subsystem was designed and fabricated for NASA's advanced regenerative life support program. A solid polymer is used for the cell electrolyte. The electrolysis module has 12 cells that can generate 5.5 kg/day of oxygen for the metabolic requirements of three crewmembers, for cabin leakage, and for the oxygen and hydrogen required for carbon dioxide collection and reduction processes. The subsystem can be operated at a pressure between 276 and 2760 kN/sq m and in a continuous constant-current, cyclic, or standby mode. A microprocessor is used to aid in operating the subsystem. Sensors and controls provide fault detection and automatic shutdown. The results of development, demonstration, and parametric testing are presented. Modifications to enhance operation in an integrated and manned test are described. Prospective improvements for the electrolysis subsystem are discussed.
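As a rough plausibility check on the figures quoted above, Faraday's law gives the stack current needed for 5.5 kg/day of oxygen from 12 cells in series. The calculation assumes 100% current efficiency and series operation, which are idealizations, not details from the report.

```python
# Faraday's-law sizing check for the 12-cell electrolysis module.
F = 96485.0          # C/mol, Faraday constant
M_O2 = 32.0          # g/mol, molar mass of O2
Z_O2 = 4             # electrons transferred per O2 molecule

mass_per_day_g = 5500.0              # 5.5 kg/day of oxygen
mol_O2 = mass_per_day_g / M_O2       # mol of O2 per day
charge = mol_O2 * Z_O2 * F           # coulombs per day if one cell did it all
current_one_cell = charge / 86400    # amperes for a single cell
current_stack = current_one_cell / 12  # 12 series cells share the production
print(f"required stack current: about {current_stack:.0f} A")
```

The result, on the order of 64 A, is within the range a solid-polymer-electrolyte stack of this size can plausibly carry, which is why this kind of back-of-envelope check is a useful sanity test of subsystem specifications.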

  10. PREVAIL-EPL alpha tool electron optics subsystem

    Pfeiffer, Hans C.; Dhaliwal, Rajinder S.; Golladay, Steven D.; Doran, Samuel K.; Gordon, Michael S.; Kendall, Rodney A.; Lieberman, Jon E.; Pinckney, David J.; Quickle, Robert J.; Robinson, Christopher F.; Rockrohr, James D.; Stickel, Werner; Tressler, Eileen V.

    2001-08-01

    The IBM/Nikon alliance is continuing pursuit of an EPL stepper alpha tool based on the PREVAIL technology. This paper provides a status report of the alliance activity, with particular focus on the Electron Optical Subsystem developed at IBM. We have previously reported on design features of the PREVAIL alpha system. The new state-of-the-art e-beam lithography concepts have since been reduced to practice and turned into functional building blocks of a production-level lithography tool. The electron optical alpha tool subsystem has been designed, built, assembled and tested at IBM's Semiconductor Research and Development Center (SRDC) in East Fishkill, New York. After demonstrating subsystem functionality, the electron optical column and all associated control electronics hardware and software were shipped in January 2001 to Nikon's facility in Kumagaya, Japan, for integration into the Nikon commercial e-beam stepper alpha tool. Early pre-shipment results obtained with this electron optical subsystem are presented.

  11. Automated Subsystem Control for Life Support System (ASCLSS)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates placing real-time process control and accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), moving the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  12. Effector-Triggered Self-Replication in Coupled Subsystems.

    Komáromy, Dávid; Tezcan, Meniz; Schaeffer, Gaël; Marić, Ivana; Otto, Sijbren

    2017-11-13

    In living systems, processes like genome duplication and cell division are carefully synchronized through subsystem coupling. If we are to create life de novo, similar control over essential processes such as self-replication needs to be developed. Here we report that coupling two dynamic combinatorial subsystems, featuring two separate building blocks, enables effector-mediated control over self-replication. The subsystem based on the first building block shows only self-replication, whereas that based on the second one is solely responsive toward a specific external effector molecule. Mixing the subsystems arrests replication until the effector molecule is added, resulting in the formation of a host-effector complex and the liberation of the building block that subsequently engages in self-replication. The onset, rate and extent of self-replication are controlled by the amount of effector present. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
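The qualitative behavior described, replication arrested until an effector releases the building block, can be caricatured with a toy rate model. Every species, rate constant, and concentration below is invented for illustration and is not from the paper.

```python
# Toy Euler integration: replicator R grows autocatalytically from building
# block B, but B is locked in a host complex (HB) until effector E frees it.
def simulate(effector, steps=20000, dt=0.01, k_rel=1.0, k_rep=0.5):
    HB, B, R, E = 1.0, 0.0, 0.01, effector   # initial concentrations (arbitrary units)
    for _ in range(steps):
        release = k_rel * HB * E             # effector displaces B from the host
        replicate = k_rep * B * R            # autocatalytic self-replication
        HB -= release * dt
        E  -= release * dt
        B  += (release - replicate) * dt
        R  += replicate * dt
    return R

print(f"replicator with effector:    {simulate(1.0):.3f}")
print(f"replicator without effector: {simulate(0.0):.3f}")
```

With no effector the building block stays sequestered and the replicator never grows, while adding effector switches replication on, mirroring the coupled-subsystem control the abstract reports.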

  13. Design and study of geosciences data share platform :platform framework, data interoperability, share approach

    Lu, H.; Yi, D.

    2010-12-01

    Deep exploration is one of the important approaches to geoscience research. We started it in the 1980s and have acquired a large amount of data since. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and then analyze and interpret on the basis of the integrated data. The different exploration approaches result in heterogeneous data, so data access has always been an important issue that confounds researchers. The problem of data sharing and interaction has to be solved during the development of the SinoProbe research project. Through a study of well-known domestic and overseas exploration projects and geoscience data platforms, this work explores a solution for data sharing and interaction. Based on SOA, we present a deep-exploration data sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides data services for geophysics, geochemistry, etc. by means of Web services, and supports various application combinations through GIS middleware and the Eclipse RCP; the interaction level gives professional and non-professional users access to data of different accuracy. The framework adopts the GeoSciML data interaction approach. GeoSciML is a geosciences information markup language, an application of the OpenGIS Consortium's (OGC) Geography Markup Language (GML). It transfers heterogeneous data into one earth frame and implements interoperation. In this article we discuss how to integrate heterogeneous data and share the data in the SinoProbe project.
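The idea of transferring heterogeneous survey data into one common earth frame can be sketched as a normalization step. The record layouts, field names, and unit table below are invented for illustration; they are not the GeoSciML schema.

```python
# Toy normalization of heterogeneous exploration records into one common
# frame (shared coordinate fields, SI depth units).
FEET_TO_M = 0.3048

def normalize(record):
    """Map a source-specific record into the common frame."""
    if record["source"] == "geophysics":
        return {"lat": record["lat"], "lon": record["lon"],
                "depth_m": record["depth_m"], "value": record["value"]}
    if record["source"] == "geochemistry":
        return {"lat": record["y"], "lon": record["x"],
                "depth_m": record["depth_ft"] * FEET_TO_M,  # convert to metres
                "value": record["concentration"]}
    raise ValueError(f"unknown source: {record['source']}")

records = [
    {"source": "geophysics", "lat": 40.1, "lon": 116.3, "depth_m": 1200.0, "value": 5.6},
    {"source": "geochemistry", "y": 40.2, "x": 116.4, "depth_ft": 3937.0, "concentration": 0.8},
]
unified = [normalize(r) for r in records]
print(unified[1]["depth_m"])
```

Once every record is in the same frame, downstream services at the middle and interaction levels can operate on the data without knowing which instrument produced it.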

  14. Measurement system as a subsystem of the quality management system

    Ľubica Floreková; Ján Terpák; Marcela Čarnogurská

    2006-01-01

    Each measurement system and control principle must be based on certain facts about the system's behaviour (what), operation (how) and structure (why). Each system is divided into subsystems that provide an input for the next subsystem. For each system the starting point is important, that is, the system characteristics, the collection of data, its hierarchy and the distribution of processes. A measurement system (based on chapter 8 of the standard ISO 9001:2000 Quality management system, requirem...

  15. Opto-mechanical subsystem with temperature compensation through isothemal design

    Goodwin, F. E. (Inventor)

    1977-01-01

    An opto-mechanical subsystem for supporting a laser structure which minimizes changes in the alignment of the laser optics in response to temperature variations is described. Both optical and mechanical structural components of the system are formed of the same material, preferably beryllium, which is selected for high mechanical strength and good thermal conducting qualities. All mechanical and optical components are mounted and assembled to provide thorough thermal coupling throughout the subsystem to prevent the development of temperature gradients.

  16. An Algorithm for Integrated Subsystem Embodiment and System Synthesis

    Lewis, Kemper

    1997-01-01

    Consider the statement,'A system has two coupled subsystems, one of which dominates the design process. Each subsystem consists of discrete and continuous variables, and is solved using sequential analysis and solution.' To address this type of statement in the design of complex systems, three steps are required, namely, the embodiment of the statement in terms of entities on a computer, the mathematical formulation of subsystem models, and the resulting solution and system synthesis. In complex system decomposition, the subsystems are not isolated, self-supporting entities. Information such as constraints, goals, and design variables may be shared between entities. But many times in engineering problems, full communication and cooperation does not exist, information is incomplete, or one subsystem may dominate the design. Additionally, these engineering problems give rise to mathematical models involving nonlinear functions of both discrete and continuous design variables. In this dissertation an algorithm is developed to handle these types of scenarios for the domain-independent integration of subsystem embodiment, coordination, and system synthesis using constructs from Decision-Based Design, Game Theory, and Multidisciplinary Design Optimization. Implementation of the concept in this dissertation involves testing of the hypotheses using example problems and a motivating case study involving the design of a subsonic passenger aircraft.
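The sequential analysis-and-solution scenario quoted in the statement can be sketched as a Gauss-Seidel-style best-response iteration between two coupled subsystems, a standard construct in multidisciplinary design optimization and game-theoretic design. The quadratic objectives below are invented stand-ins for real subsystem models.

```python
# Toy sequential solution of two coupled subsystems: each minimizes its own
# quadratic objective given the other's current design variable, iterating
# until the shared information stops changing. All numbers are invented.
def solve_subsystem_1(y):
    # argmin_x (x - 2)^2 + 0.5*(x - y)^2  ->  closed form: x = (4 + y) / 3
    return (4.0 + y) / 3.0

def solve_subsystem_2(x):
    # argmin_y (y - 1)^2 + 0.5*(y - x)^2  ->  closed form: y = (2 + x) / 3
    return (2.0 + x) / 3.0

x, y = 0.0, 0.0
for i in range(100):
    x_new = solve_subsystem_1(y)      # subsystem 1 responds to y
    y_new = solve_subsystem_2(x_new)  # subsystem 2 then responds to x
    if abs(x_new - x) < 1e-10 and abs(y_new - y) < 1e-10:
        break
    x, y = x_new, y_new
print(f"converged after {i} iterations: x = {x:.4f}, y = {y:.4f}")
```

Because each subsystem only sees the other's latest variable, this mimics the partial-communication setting the dissertation describes; here the iteration contracts to the fixed point x = 1.75, y = 1.25.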

  17. An analytical model for an input/output-subsystem

    Roemgens, J.

    1983-05-01

    An input/output-subsystem of one or several computers is formed by the external memory units and the peripheral units of a computer system. For these subsystems mathematical models are established, taking into account the special properties of the I/O-subsystems, in order to avoid planning errors and to allow for predictions of the capacity of such systems. Here an analytical model is presented for the magnetic discs of an I/O-subsystem, using analytical methods for the individual waiting queues or waiting queue networks. Only I/O-subsystems of IBM-computer configurations are considered, which can be controlled by the MVS operating system. After a description of the hardware and software components of these I/O-systems, possible solutions from the literature are presented and discussed with respect to their applicability in IBM-I/O-subsystems. Based on these models a special scheme is developed which combines the advantages of the literature models and avoids the disadvantages in part. (orig./RW) [de
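
    The individual-waiting-queue approach mentioned above can be illustrated (our own minimal sketch, not the paper's model) by treating a single magnetic disc as an M/M/1 queue: requests arrive at rate lam, the disc serves them at rate mu, and the standard closed-form results give utilization, queue length, and mean response time.

```python
# Minimal M/M/1 illustration of modeling one disc of an I/O-subsystem:
# arrivals at rate lam, service at rate mu, standard closed-form metrics.

def mm1_metrics(lam, mu):
    """Return (utilization, mean number in system, mean response time)."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu                 # utilization of the disc
    L = rho / (1 - rho)           # mean number of requests in the system
    T = 1 / (mu - lam)            # mean response time (waiting + service)
    return rho, L, T

# Example: 40 I/O requests/s against a disc that completes 50 requests/s.
rho, L, T = mm1_metrics(lam=40.0, mu=50.0)
print(rho, L, T)  # 0.8, 4.0, 0.1 s
```

    A full I/O-subsystem model would connect several such queues (channels, control units, discs) into a queueing network, which is where the paper's combined scheme comes in.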

  18. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Glaves, Helen; Schaap, Dick

    2017-04-01

    In recent years there has been a paradigm shift in marine research moving from the traditional discipline based methodology employed at the national level by one or more organizations, to a multidisciplinary, ecosystem level approach conducted on an international scale. This increasingly holistic approach to marine research is in part being driven by policy and legislation. For example, the European Commission's Blue Growth strategy promotes sustainable growth in the marine environment including the development of sea-basin strategies (European Commission 2014). As well as this policy driven shift to ecosystem level marine research there are also scientific and economic drivers for a basin level approach. Marine monitoring is essential for assessing the health of an ecosystem and determining the impacts of specific factors and activities on it. The availability of large volumes of good quality data is fundamental to this increasingly holistic approach to ocean research but there are significant barriers to its re-use. These are due to the heterogeneity of the data resulting from having been collected by many organizations around the globe using a variety of sensors mounted on a range of different platforms. The data is then delivered and archived in a range of formats, using various spatial coordinate systems and aligned with different standards. This heterogeneity coupled with organizational and national policies on data sharing make access and re-use of marine data problematic. In response to the need for greater sharing of marine data a number of e-infrastructures have been developed but these have different levels of granularity with the majority having been developed at the regional level to address specific requirements for data e.g. SeaDataNet in Europe, the Australian Ocean Data Network (AODN). These data infrastructures are also frequently aligned with the priorities of the local funding agencies and have been created in isolation from those developed

  19. Meeting People’s Needs in a Fully Interoperable Domotic Environment

    Vittorio Miori

    2012-05-01

    Full Text Available The key idea underlying many Ambient Intelligence (AmI projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users’ quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today’s incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.
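
    The middleware idea in this abstract can be sketched as protocol-specific adapters that normalize heterogeneous domotic technologies into one event format for the Ambient Intelligence layer. The protocol names, raw encodings, and event schema below are illustrative placeholders, not the project's actual architecture.

```python
# Sketch of a domotic middleware: adapters abstract heterogeneous bus
# technologies behind one interface, so the AmI layer can read any sensor
# without knowing its native protocol. All formats here are invented.

class KonnexAdapter:
    """Pretend adapter for a KNX-like bus reporting temperature in 0.1 degC units."""
    def read(self, raw):
        return {"type": "temperature", "value": raw / 10.0, "unit": "C"}

class X10Adapter:
    """Pretend adapter for an X10-like module reporting an on/off state."""
    def read(self, raw):
        return {"type": "switch", "value": bool(raw), "unit": None}

class Middleware:
    def __init__(self):
        self._adapters = {}
    def register(self, protocol, adapter):
        self._adapters[protocol] = adapter
    def read(self, protocol, raw):
        # The AmI layer sees one uniform event format regardless of protocol.
        return self._adapters[protocol].read(raw)

mw = Middleware()
mw.register("knx", KonnexAdapter())
mw.register("x10", X10Adapter())
print(mw.read("knx", 215))  # {'type': 'temperature', 'value': 21.5, 'unit': 'C'}
print(mw.read("x10", 1))    # {'type': 'switch', 'value': True, 'unit': None}
```

    The point of the design is that the underlying technologies keep their differences; only their observable behavior is abstracted at the middleware boundary.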

  20. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
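
    Two of the paper's benchmark metrics, response latency and response message size, can be measured with a small harness like the sketch below (ours, not the paper's). The stand-in payload is invented; a real run would issue requests against a device's actual endpoint, e.g. an OGC SensorThings API resource.

```python
# Sketch of benchmarking response latency and response message size for an
# IoT service. `request_fn` stands in for a real request (e.g. via
# urllib.request.urlopen against a SensorThings URL on the device).
import time

def benchmark(request_fn, runs=5):
    """Time `request_fn` (returning the response payload as bytes) over
    several runs; report mean latency in seconds and payload size in bytes."""
    latencies, size = [], 0
    for _ in range(runs):
        start = time.perf_counter()
        body = request_fn()
        latencies.append(time.perf_counter() - start)
        size = len(body)
    return sum(latencies) / len(latencies), size

# Stand-in response payload for the example; real payloads come off the wire.
fake_response = lambda: b'{"value": [{"name": "thermometer-1"}]}'
mean_latency, size = benchmark(fake_response)
print(mean_latency, size)
```

    Memory consumption, the paper's remaining metric, has to be measured on the constrained device itself rather than from the client side.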

  1. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.

    Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan

    2015-09-22

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision.

  2. Extending the Scope of the Resource Admission Control Subsystem (RACS) in IP multimedia subsystem using cognitive radios

    Muwonge, BK

    2008-04-01

    Full Text Available is greatly increased, and resource reservation and QoS management by the RACS is also greatly increased. Index Terms—Traffic Engineering; Cross Layer; Cognitive Radio, IP Multimedia Subsystem (IMS) I. INTRODUCTION The IP Multimedia Subsystem (IMS...) is seen as the answer to the much talked-about convergence of data and telecommunication services. The original IMS design was by the 3rd Generation Partnership Project (3GPP) for delivering IP Multimedia services to end users, using telecommunication...

  3. Implementing standards for the interoperability among healthcare providers in the public regionalized Healthcare Information System of the Lombardy Region.

    Barbarito, Fulvio; Pinciroli, Francesco; Mason, John; Marceglia, Sara; Mazzola, Luca; Bonacina, Stefano

    2012-08-01

    Information technologies (ITs) have now entered the everyday workflow in a variety of healthcare providers with a certain degree of independence. This independence may be the cause of difficulty in interoperability between information systems and it can be overcome through the implementation and adoption of standards. Here we present the case of the Lombardy Region, in Italy, that has been able, in the last 10 years, to set up the Regional Social and Healthcare Information System, connecting all the healthcare providers within the region, and providing full access to clinical and health-related documents independently from the healthcare organization that generated the document itself. This goal, in a region with almost 10 millions citizens, was achieved through a twofold approach: first, the political and operative push towards the adoption of the Health Level 7 (HL7) standard within single hospitals and, second, providing a technological infrastructure for data sharing based on interoperability specifications recognized at the regional level for messages transmitted from healthcare providers to the central domain. The adoption of such regional interoperability specifications enabled the communication among heterogeneous systems placed in different hospitals in Lombardy. Integrating the Healthcare Enterprise (IHE) integration profiles which refer to HL7 standards are adopted within hospitals for message exchange and for the definition of integration scenarios. The IHE patient administration management (PAM) profile with its different workflows is adopted for patient management, whereas the Scheduled Workflow (SWF), the Laboratory Testing Workflow (LTW), and the Ambulatory Testing Workflow (ATW) are adopted for order management. At present, the system manages 4,700,000 pharmacological e-prescriptions, and 1,700,000 e-prescriptions for laboratory exams per month. It produces, monthly, 490,000 laboratory medical reports, 180,000 radiology medical reports, 180
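
    The HL7 messages exchanged in such an infrastructure are pipe-delimited segment strings. The sketch below (ours, heavily simplified) builds a minimal HL7 v2 ADT message of the kind used in patient administration workflows; all application, facility, and patient values are made up for the example.

```python
# Illustrative construction of a simplified HL7 v2 pipe-delimited ADT message:
# segments separated by carriage returns, fields by '|'. Values are invented.

def hl7_segment(*fields):
    return "|".join(fields)

# MSH header: encoding characters, sending/receiving apps, timestamp,
# message type ADT^A01 (patient admit), control id, processing id, version.
msh = hl7_segment("MSH", "^~\\&", "HIS_APP", "HOSP_A", "REG_APP", "REGION",
                  "20120801120000", "", "ADT^A01", "MSG00001", "P", "2.5")
# PID segment: a fictional patient identifier, name, birth date, and sex.
pid = hl7_segment("PID", "1", "", "PAT12345", "", "Rossi^Mario", "", "19700101", "M")
message = "\r".join([msh, pid])

print(message.replace("\r", "\n"))
```

    Real deployments like the one described would wrap such messages in the IHE integration profiles (PAM, SWF, LTW, ATW) rather than exchanging bare segments.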

  4. ICD-11 (JLMMS) and SCT Inter-Operation.

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between the ICD-11 (International Classification of Diseases-11th revision Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification, characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10 on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed by the compositional grammar of SCT, together with queries on axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.

  5. Interoperability And Value Added To Earth Observation Data

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
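
    The interoperability benefit described here is that any OGC-compliant client can request a map from any WMS server using the same standardized GetMap parameters. The sketch below (ours) assembles such a request URL; the server URL and layer name are placeholders.

```python
# Build an OGC WMS 1.1.1 GetMap request URL from standard parameters.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Return a GetMap URL; bbox is (minx, miny, maxx, maxy) in EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "BBOX": ",".join(str(c) for c in bbox),
        "SRS": "EPSG:4326",
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Placeholder endpoint and layer name, e.g. a flood-extent product.
url = wms_getmap_url("http://example.org/wms", "eo:flood_extent",
                     bbox=(-5.0, 40.0, 5.0, 50.0))
print(url)
```

    Because every WMS server interprets these parameters identically, a client such as mapshup can overlay layers from many providers without per-server code.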

  6. Interoperable mesh and geometry tools for advanced petascale simulations

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications
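
    The "data-structure neutral interfaces" idea can be sketched as follows (our own toy abstraction, not the actual ITAPS API): services are written against a small abstract mesh interface, so any mesh implementation providing those methods can be swapped in without changing the service code.

```python
# Toy data-structure neutral mesh interface: a service (bounding_box) depends
# only on the abstract interface, not on how the mesh stores its data.
from abc import ABC, abstractmethod

class MeshInterface(ABC):
    @abstractmethod
    def vertices(self):
        """Iterable of vertex ids."""
    @abstractmethod
    def coords(self, v):
        """Coordinates of vertex v as a tuple."""

class ListMesh(MeshInterface):
    """One concrete implementation backed by a plain list of coordinates."""
    def __init__(self, pts):
        self._pts = list(pts)
    def vertices(self):
        return range(len(self._pts))
    def coords(self, v):
        return self._pts[v]

def bounding_box(mesh):
    """A 'service' that works on any implementation of MeshInterface."""
    xs = [mesh.coords(v)[0] for v in mesh.vertices()]
    ys = [mesh.coords(v)[1] for v in mesh.vertices()]
    return (min(xs), min(ys)), (max(xs), max(ys))

box = bounding_box(ListMesh([(0.0, 0.0), (2.0, 1.0), (1.0, 3.0)]))
print(box)  # ((0.0, 0.0), (2.0, 3.0))
```

    In ITAPS the same principle lets mesh generation, adaptive refinement, and solution transfer services interoperate across different mesh databases with minimal intrusion into application codes.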

  7. Building Future Transatlantic Interoperability Around a Robust NATO Response Force

    2012-10-01

    than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming...and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force–East at Kogalniceanu...operations. Increase U.S. and Allied Exchange Students at National and NATO military schools. Austerity measures may eventually affect the investment

  8. Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh; hide

    2014-01-01

    The TDI project (TDI) investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a certain suite of related industry standards for capabilities of individual benefits and interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.

  9. The challenge of networked enterprises for cloud computing interoperability

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

    Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important up-to-date computing concept for NE, as it offers significant financial and technical advantages besides high-level collaboration possibilities. As cloud computing is a new concept the solutions for handling interoperability, portability, security, privacy and standardization c...

  10. Interoperability between Fingerprint Biometric Systems: An Empirical Study

    Gashi, I.; Mason, S.; Lugini, L.; Marasco, E.; Cukic, B.

    2014-01-01

    Fingerprints are likely the most widely used biometric in commercial as well as law enforcement applications. With the expected rapid growth of fingerprint authentication in mobile devices their importance justifies increased demands for dependability. An increasing number of new sensors, applications and a diverse user population also intensify concerns about the interoperability in fingerprint authentication. In most applications, fingerprints captured for user enrollment with one device may...

  11. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  12. Smart hospitality—Interconnectivity and interoperability towards an ecosystem

    Buhalis, Dimitrios; Leung, Rosanna

    2018-01-01

    The Internet and cloud computing changed the way businesses operate. Standardised web-based applications simplify data interchange, which allows internal applications and business partners' systems to become interconnected and interoperable. This study conceptualises the smart and agile hospitality enterprises of the future, and proposes a smart hospitality ecosystem that adds value to all stakeholders. Internal data from applications among all stakeholders, consolidated with external environment ...

  13. Secure and interoperable communication infrastructures for PPDR organisations

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to grow into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond timely and in an adequate manner to such events, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses interoperability of these technologies. In this contribution the design of a next generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on Enterprise Architecture of PPDR organisations, a next generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability of inter-system, interagency and cross-border operations with emphasis on interoperability between users in PMR and LTE.

  14. Enabling IoT ecosystems through platform interoperability

    Bröring, Arne; Schmid, Stefan; Schindhelm, Corina-Kim; Khelil, Abdelmajid; Kabisch, Sebastian; Kramer, Denis; Le Phuoc, Danh; Mitic, Jelena; Anicic, Darko; Teniente López, Ernest

    2017-01-01

    Today, the Internet of Things (IoT) comprises vertically oriented platforms for things. Developers who want to use them need to negotiate access individually and adapt to the platform-specific API and information models. Having to perform these actions for each platform often outweighs the possible gains from adapting applications to multiple platforms. This fragmentation of the IoT and the missing interoperability result in high entry barriers for developers and prevent the emergence of broa...

  15. The Internet of Things: New Interoperability, Management and Security Challenges

    Elkhodr, Mahmoud; Shahrestani, Seyed; Cheung, Hon

    2016-01-01

    The Internet of Things (IoT) brings connectivity to almost every object found in the physical space. It extends connectivity to everyday objects. From connected fridges, cars and cities, the IoT creates opportunities in numerous domains. However, this increase in connectivity creates many prominent challenges. This paper provides a survey of some of the major issues challenging the widespread adoption of the IoT. Particularly, it focuses on the interoperability, management, securi...

  16. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
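
    The feature-level smoothing the paper proposes can be illustrated (our own minimal sketch, not the authors' method) with a moving-average filter: smoothing two feature vectors of the same hand captured by different sensors reduces their distance, i.e. the inter-device variability.

```python
# Feature-level smoothing to reduce inter-device variability: a simple
# moving-average filter applied to 1-D feature vectors before matching.

def smooth_features(features, window=3):
    """Moving-average smoothing; edges use a shrunken window so the
    output has the same length as the input."""
    half = window // 2
    out = []
    for i in range(len(features)):
        lo, hi = max(0, i - half), min(len(features), i + half + 1)
        out.append(sum(features[lo:hi]) / (hi - lo))
    return out

# Two hypothetical palm-print feature vectors of the same hand captured by
# different sensors; smoothing brings them closer in Euclidean distance.
a = [1.0, 5.0, 1.0, 5.0, 1.0]
b = [5.0, 1.0, 5.0, 1.0, 5.0]
dist = lambda u, v: sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
print(dist(a, b), dist(smooth_features(a), smooth_features(b)))
```

    In the paper the same idea is also applied at the image level, and the matching itself uses the four feature extraction methods evaluated across the six acquisition schemes.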

  17. On the feasibility of interoperable schemes in hand biometrics.

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  18. Interoperable and accessible census and survey data from IPUMS.

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  19. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Shola Usha Rani

    2015-03-01

    Full Text Available An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage their heart disease, diabetes, etc. by sending health parameters such as blood pressure, heart rate, glucose levels, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here different medical devices are connected to the patient for monitoring, and each kind of device is manufactured by a different vendor. Each device's information and communication requires a different installation and network design, which causes design complexities and network overheads when moving patients for diagnostic examinations. This problem is solved by interoperability among devices. ISO/IEEE 11073 is an international standard that provides an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.

  20. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed through engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reason for basing the architecture on a peer-to-peer networked database concept versus more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry that the proposed architecture uses is discussed; and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.

  1. Interoperability of CAD Standards and Robotics in CIME

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems and provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering: · The development of a STEP based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems for the extension of the inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work: · The definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies. · The elaboration of a general function model of a generic robot motion controller in IDEF0 for interface...

  2. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  3. Double-Shell Tank (DST) Monitor and Control Subsystem Specification

    BAFUS, R.R.

    2000-01-01

    This subsystem specification establishes the interface and performance requirements and provides references to the requisite codes and standards to be applied during design of the Double-Shell Tank (DST) Monitor and Control Subsystem that supports the first phase of Waste Feed Delivery (WFD). The DST Monitor and Control Subsystem consists of the new and existing equipment that will be used to provide tank farm operators with integrated local monitoring and control of the DST systems to support WFD. New equipment will provide automatic control and safety interlocks where required and provide operators with visibility into the status of DST subsystem operations (e.g., DST mixer pump operation and DST waste transfers) and the ability to manually control specified DST functions as necessary. This specification is intended to be the basis for new projects/installations (W-521, etc.). It is not intended to retroactively affect previously established project design criteria without specific direction by the program
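
    The safety-interlock requirement can be sketched as a permissive check: equipment such as a mixer pump is only allowed to run while every monitored parameter stays within limits. The parameter names and limit values below are invented for illustration and are not from the specification.

```python
# Toy sketch of a safety interlock: the pump is permitted to run only while
# all monitored tank parameters are within their limits; otherwise it trips.

def mixer_pump_permitted(readings, limits):
    """Return True only if every monitored parameter is within its limit band."""
    return all(limits[k][0] <= readings[k] <= limits[k][1] for k in limits)

# Hypothetical limit bands (low, high) for two monitored parameters.
limits = {"waste_level_m": (0.5, 9.0), "pump_bearing_temp_C": (0.0, 80.0)}

ok = mixer_pump_permitted({"waste_level_m": 4.2, "pump_bearing_temp_C": 55.0}, limits)
trip = mixer_pump_permitted({"waste_level_m": 4.2, "pump_bearing_temp_C": 95.0}, limits)
print(ok, trip)  # True False
```

    In the actual subsystem such interlocks run in qualified control hardware, with the operator interface providing the visibility and manual-control functions the specification describes.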

  4. Preprototype vapor compression distillation subsystem. [recovering potable water from wastewater

    Ellis, G. S.; Wynveen, R. A.; Schubert, F. H.

    1979-01-01

    A three-person capacity preprototype vapor compression distillation subsystem for recovering potable water from wastewater aboard spacecraft was designed, assembled, and tested. The major components of the subsystem are: (1) a distillation unit which includes a compressor, centrifuge, central shaft, and outer shell; (2) a purge pump; (3) a liquids pump; (4) a post-treat cartridge; (5) a recycle/filter tank; (6) an evaporator high liquid level sensor; and (7) the product water conductivity monitor. A computer-based control/monitor instrumentation system carries out operating-mode change sequences, monitors and displays subsystem parameters, maintains intramode controls, and stores and displays fault detection information. The mechanical hardware occupies 0.467 m3, requires 171 W of electrical power, and has a dry weight of 143 kg. The subsystem recovers potable water at a rate of 1.59 kg/hr, which is equivalent to a duty cycle of approximately 30% for a crew of three. The product water has no foul taste or odor. Continued development of the subsystem is recommended for reclaiming water for human consumption as well as for flash evaporator heat rejection, urinal flushing, washing, and other on-board water requirements.

  5. The Main Subsystems Involved in Defining the Quality Management System in a Hospital

    Dobrea Valentina Alina

    2010-06-01

    Full Text Available The hospital is the most important organization in the health field, so hospitals have to improve quality in all the activities they deploy. A very suitable way to show a hospital's commitment to the quality of health services is a quality management system certificate according to ISO 9001:2000. To understand the architecture of the hospital quality management system, it is necessary to decompose this system into subsystems and analyze each separately: the managerial subsystem, the human subsystem, the social subsystem, the technical subsystem, and the informative subsystem. The relationship between these subsystems leads to the continuous improvement of quality in health services.

  6. Latest developments for the IAGOS database: Interoperability and metadata

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is under continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardized format, including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardizing the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for

  7. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i. e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. The objectives are: to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors, and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i. e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE

  8. Mathematical modeling of control subsystems for CELSS: Application to diet

    Waleh, Ahmad; Nguyen, Thoi K.; Kanevsky, Valery

    1991-01-01

    The dynamic control of a Closed Ecological Life Support System (CELSS) in a closed space habitat is of critical importance. The development of a practical method of control is also a necessary step for the selection and design of realistic subsystems and processors for a CELSS. Diet is one of the dynamic factors that strongly influences, and is influenced by, the operational states of all major CELSS subsystems. Solutions to the problems of designing and maintaining a stable diet must be obtained from well-characterized expert subsystems. The general description of a mathematical model that forms the basis of an expert control program for a CELSS is given. The formulation is expressed in terms of a complete set of time-dependent canonical variables. The system representation is dynamic and includes time-dependent storage buffers. The details of the algorithm are described. The steady-state results of applying the method to representative diets made from wheat, potato, and soybean are presented.

  9. Double Shell Tank (DST) Transfer Piping Subsystem Specification

    GRAVES, C.E.

    2000-01-01

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied during design of the Double-Shell Tank (DST) Transfer Piping Subsystem, which supports the first phase of Waste Feed Delivery. This subsystem transfers waste between transfer-associated structures (pits) and to the River Protection Project (RPP) Privatization Contractor Facility, where it will be processed into an immobilized waste form. This specification is intended to be the basis for new projects/installations (W-521, etc.). This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program.

  10. The complete Heyting algebra of subsystems and contextuality

    Vourdas, A.

    2013-01-01

    The finite set of subsystems of a finite quantum system with variables in Z(n) is studied as a Heyting algebra. The physical meaning of the logical connectives is discussed. It is shown that disjunction of subsystems is a more general concept than superposition. Consequently, the quantum probabilities related to commuting projectors in the subsystems are incompatible with associativity of the join in the Heyting algebra, unless the variables belong to the same chain. This leads to contextuality, which in the present formalism has as contexts the chains in the Heyting algebra. Logical Bell inequalities, which contain “Heyting factors,” are discussed. The formalism is also applied to the infinite set of all finite quantum systems, which is appropriately enlarged in order to become a complete Heyting algebra.
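    In the Z(n) setting of this abstract, subsystems correspond to the divisors of n, ordered by divisibility, with meet gcd and join lcm; this divisor lattice is distributive and hence a Heyting algebra. A minimal sketch (function names are illustrative, not from the paper) computes the Heyting implication by brute force over divisors:

    ```python
    from math import gcd

    def divisors(n):
        return [d for d in range(1, n + 1) if n % d == 0]

    def implies(a, b, n):
        # Heyting implication a -> b in the divisor lattice of n:
        # the largest divisor c of n whose meet with a, gcd(a, c), divides b.
        return max(c for c in divisors(n) if b % gcd(a, c) == 0)

    def neg(a, n):
        # Pseudocomplement: a -> bottom, where the bottom element is 1.
        return implies(a, 1, n)
    ```

    For n = 12, for instance, neg(2, 12) == 3 while neg(neg(2, 12), 12) == 4 != 2: double negation fails, so the algebra is properly Heyting rather than Boolean.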

  11. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven architectures with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations of disparate information that do not at first appear intuitively obvious: analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services communicating over service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at the data level with a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information

  12. Embedded Thermal Control for Subsystems for Next Generation Spacecraft Applications

    Didion, Jeffrey R.

    2015-01-01

    Thermal Fluids and Analysis Workshop, Silver Spring MD NCTS 21070-15. NASA, the Defense Department and commercial interests are actively engaged in developing miniaturized spacecraft systems and scientific instruments to leverage smaller, cheaper spacecraft form factors such as CubeSats. This paper outlines research and development efforts among Goddard Space Flight Center personnel and its several partners to develop innovative embedded thermal control subsystems. Embedded thermal control subsystems are a cross-cutting enabling technology integrating advanced manufacturing techniques to develop multifunctional intelligent structures that reduce Size, Weight and Power (SWaP) consumption of both the thermal control subsystem and the overall spacecraft. Embedded thermal control subsystems permit heat acquisition and rejection at higher temperatures than state-of-the-art systems by employing both advanced heat transfer equipment (integrated heat exchangers) and high heat transfer phenomena. The Goddard Space Flight Center Thermal Engineering Branch has active investigations seeking to characterize advanced thermal control systems for near-term spacecraft missions. The embedded thermal control subsystem development effort consists of fundamental research as well as development of breadboard and prototype hardware and spaceflight validation efforts. This paper will outline relevant fundamental investigations of micro-scale heat transfer and electrically driven liquid film boiling. The hardware development efforts focus upon silicon-based high-heat-flux applications (electronic chips, power electronics, etc.) and multifunctional structures. Flight validation efforts include variable gravity campaigns and a proposed CubeSat-based flight demonstration of a breadboard embedded thermal control system. The CubeSat investigation is a technology demonstration that will characterize, in long-term low Earth orbit, a breadboard embedded thermal subsystem and its individual components to develop

  13. Ground test facility for nuclear testing of space reactor subsystems

    Quapp, W.J.; Watts, K.D.

    1985-01-01

    Two major reactor facilities at the INEL have been identified as easily adaptable for supporting the nuclear testing of the SP-100 reactor subsystem: the Engineering Test Reactor (ETR) and the Loss of Fluid Test Reactor (LOFT). In addition, there are machine shops, analytical laboratories, hot cells, and the supporting services (fire protection, safety, security, medical, waste management, etc.) necessary for conducting a nuclear test program. This paper presents the conceptual approach for modifying these reactor facilities into the ground engineering test facility for the SP-100 nuclear subsystem. 4 figs

  14. Coexistence of uniquely ergodic subsystems of interval mapping

    Ye Xiangdong.

    1991-10-01

    The purpose of this paper is to show that uniquely ergodic subsystems of interval mappings coexist in the same way as minimal sets do. To do this, we introduce some notation in section 2. In section 3 we define the D-function of a uniquely ergodic system and establish its basic properties. We prove the coexistence of uniquely ergodic subsystems of interval mappings in section 4. Lastly, we give examples of uniquely ergodic systems with given D-functions in section 5. 27 refs

  15. Integrated flight/propulsion control - Subsystem specifications for performance

    Neighbors, W. K.; Rock, Stephen M.

    1993-01-01

    A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.
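    The procedure above rests on structured singular value analysis; its simplest unstructured analogue is the small-gain theorem, which bounds the admissible uncertainty magnitude at each frequency by the inverse magnitude of the interconnection transfer function. A sketch under that simplification, with a hypothetical second-order M(s) chosen purely for illustration:

    ```python
    import math

    def M_mag(w):
        # |M(jw)| for a hypothetical interconnection M(s) = 5 / (s^2 + 2s + 5)
        return 5.0 / math.sqrt((5.0 - w**2)**2 + (2.0 * w)**2)

    # Log-spaced frequency grid (rad/s), 0.01 to 100
    freqs = [10**(-2 + 4 * k / 199) for k in range(200)]

    # Small-gain condition: the loop stays stable for any uncertainty
    # satisfying |Delta(jw)| < 1 / |M(jw)| at every frequency.
    delta_bound = [1.0 / M_mag(w) for w in freqs]
    ```

    The tightest (smallest) bound occurs at the resonance peak of |M|; here |M| peaks at 1.25 near w = sqrt(3), so uncertainties of magnitude below 0.8 are tolerated at all frequencies. A μ-analysis replaces 1/|M| with 1/μ(M), exploiting the block structure of the subsystem uncertainties to give a less conservative bound.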

  16. Optomechanical design of TMT NFIRAOS Subsystems at INO

    Lamontagne, Frédéric; Desnoyers, Nichola; Grenier, Martin; Cottin, Pierre; Leclerc, Mélanie; Martin, Olivier; Buteau-Vaillancourt, Louis; Boucher, Marc-André; Nash, Reston; Lardière, Olivier; Andersen, David; Atwood, Jenny; Hill, Alexis; Byrnes, Peter W. G.; Herriot, Glen; Fitzsimmons, Joeleff; Véran, Jean-Pierre

    2017-08-01

    The adaptive optics system for the Thirty Meter Telescope (TMT) is the Narrow-Field InfraRed Adaptive Optics System (NFIRAOS). Recently, INO has been involved in the optomechanical design of several subsystems of NFIRAOS, including the Instrument Selection Mirror (ISM), the NFIRAOS Beamsplitters (NBS), and the NFIRAOS Source Simulator system (NSS) comprising the Focal Plane Mask (FPM), the Laser Guide Star (LGS) sources, and the Natural Guide Star (NGS) sources. This paper presents an overview of these subsystems and the optomechanical design approaches used to meet the optical performance requirements under environmental constraints.

  17. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug-and-play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  18. Novel Design Aspects of the Space Technology 5 Mechanical Subsystem

    Rossoni, Peter; McGill, William

    2003-01-01

    This paper describes several novel design elements of the Space Technology 5 (ST5) spacecraft mechanical subsystem. The spacecraft structure itself takes a significant step in integrating electronics into the primary structure. The deployment system restrains the spacecraft during launch and imparts a predetermined spin rate upon release from its secondary payload accommodations. The deployable instrument boom incorporates both traditional and new techniques to achieve low weight and high stiffness. Analysis and test techniques used to validate these technologies are described. Numerous design choices were necessitated by the compact spacecraft size and strict mechanical subsystem requirements.

  19. Definition of an arcjet propulsion sub-system

    Price, T.W.

    1989-01-01

    An engineering flight demonstration of a 100 kWe Space Reactor Power System is planned for the mid to late 1990s. An arcjet-based propulsion subsystem will be included on the flight demonstration as a secondary experiment. Two studies, sponsored by the Key Technologies Directorate of the SDI Organization and managed by the Jet Propulsion Laboratory, are currently under way to define that propulsion subsystem. The principal tasks of those contracts and the plans for two later phases, an experimental verification of the concept and a flight qualification/delivery of a flight unit, are described. 9 refs

  20. System and methods of resource usage using an interoperable management framework

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    Generic rights expression language allowing interoperability across different computing environments including resource usage of different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework are standardized, such as the operational semantics, including areas free of standards that necessitate choice and innovation to achieve a balance of flexibility and usability for interoperability in usage management systems.
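    As an illustration of the general idea (a sketch of the concept, not the patented framework), the evaluation step can be standardized while each license carries its own constraint predicate; interoperability comes from every computing environment agreeing on the same evaluation semantics:

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict, Set

    @dataclass
    class License:
        resource: str
        actions: Set[str]
        # Environment-specific policy, plugged into standardized semantics
        constraint: Callable[[Dict[str, str]], bool]

    def permitted(lic: License, resource: str, action: str, context: Dict[str, str]) -> bool:
        """Standardized evaluation: an action on a resource is allowed iff
        the license names both and the usage context satisfies its constraint."""
        return (lic.resource == resource
                and action in lic.actions
                and lic.constraint(context))
    ```

    A hypothetical license restricting viewing and printing of a document to one region would then be `License("report.pdf", {"view", "print"}, lambda ctx: ctx.get("region") == "US")`, and any conforming environment evaluates it identically.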

  1. Seismic Safety Margins Research Program. Phase 1. Project V. Structural sub-system response: subsystem response review

    Fogelquist, J.; Kaul, M.K.; Koppe, R.; Tagart, S.W. Jr.; Thailer, H.; Uffer, R.

    1980-03-01

    This project is directed toward a portion of the Seismic Safety Margins Research Program which constitutes one link in the seismic methodology chain. The link addressed here is the structural subsystem dynamic response, which concerns those components and systems whose behavior is often determined separately (decoupled) from the major structural response. Typically, the mathematical model used for the major structural response will include only the mass effects of the subsystem, and the main model is used to produce the support motion inputs for subsystem seismic qualification. The main questions addressed in this report have to do with the seismic response uncertainty of safety-related components or equipment whose seismic qualification is performed by (a) analysis, (b) tests, or (c) combinations of analysis and tests, and where the seismic input is assumed to have no uncertainty.

  2. Cascade Distillation Subsystem Development: Progress Toward a Distillation Comparison Test

    Callahan, M. R.; Lubman, A.; Pickering, Karen D.

    2009-01-01

    Recovery of potable water from wastewater is essential for the success of long-duration manned missions to the Moon and Mars. Honeywell International and a team from NASA Johnson Space Center (JSC) are developing a wastewater processing subsystem that is based on centrifugal vacuum distillation. The wastewater processor, referred to as the Cascade Distillation Subsystem (CDS), utilizes an innovative and efficient multistage thermodynamic process to produce purified water. The rotary centrifugal design of the system also provides gas/liquid phase separation and liquid transport under microgravity conditions. A five-stage subsystem unit has been designed, built, delivered and integrated into the NASA JSC Advanced Water Recovery Systems Development Facility for performance testing. A major test objective of the project is to demonstrate the advancement of the CDS technology from the breadboard level to a subsystem-level unit. An initial round of CDS performance testing was completed in fiscal year (FY) 2008. Based on FY08 testing, the system is now in development to support an Exploration Life Support (ELS) Project distillation comparison test expected to begin in early 2009. As part of the project objectives planned for FY09, the system will be reconfigured to support the ELS comparison test. The CDS will then be challenged with a series of human-generated waste streams representative of those anticipated for a lunar outpost. This paper provides a description of the CDS technology, a status of the current project activities, and data on the system's performance to date.

  3. Mark 4A DSN receiver-exciter and transmitter subsystems

    Wick, M. R.

    1986-01-01

    The present configuration of the Mark 4A DSN Receiver-Exciter and Transmitter Subsystems is described. Functional requirements and key characteristics are given to show the differences in the capabilities required by the Networks Consolidation task for combined High Earth Orbiter and Deep Space Network tracking support.

  4. Effector-Triggered Self-Replication in Coupled Subsystems

    Komáromy, Dávid; Tezcan, Meniz; Schaeffer, Gaël; Marić, Ivana; Otto, Sijbren

    2017-01-01

    In living systems, processes like genome duplication and cell division are carefully synchronized through subsystem coupling. If we are to create life de novo, similar control over essential processes such as self-replication needs to be developed. Here we report that coupling two dynamic

  5. Computer Simulation of the Circulation Subsystem of a Library

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
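    A minimal Monte Carlo sketch of such a simulation (model and parameter names are illustrative, not the author's): requests for a title arrive as a Poisson process, each satisfied request checks out one of the copies for the loan period, and availability is the fraction of requests that find a copy on the shelf. Re-running with different loan periods simulates alternative loan policies.

    ```python
    import random

    def simulate_availability(copies, requests_per_day, loan_days, horizon_days, seed=1):
        """Fraction of requests immediately satisfied for one title with
        `copies` copies, Poisson request arrivals, and a fixed loan period."""
        rng = random.Random(seed)
        due = []                 # return dates of currently checked-out copies
        satisfied = total = 0
        t = 0.0
        while True:
            t += rng.expovariate(requests_per_day)   # next request arrival
            if t >= horizon_days:
                break
            due = [d for d in due if d > t]          # copies already returned
            total += 1
            if len(due) < copies:                    # a copy is on the shelf
                satisfied += 1
                due.append(t + loan_days)
        return satisfied / total
    ```

    With 3 copies and half a request per day, a one-day loan policy keeps nearly every request satisfied, while a very long loan period lets the first three requests monopolize the copies, so availability collapses; an unsatisfied-request queue would extend the sketch to report delays as well.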

  6. Double-Shell Tank (DST) Diluent and Flush Subsystem Specification

    GRAVES, C.E.

    2000-01-01

    The Double-Shell Tank (DST) Diluent and Flush Subsystem is intended to support Waste Feed Delivery. The DST Diluent and Flush Subsystem specification describes the relationship of this system with the DST System, describes the functions that must be performed by the system, and establishes the performance requirements to be applied to the design of the system. It also provides references for the requisite codes and standards. The DST Diluent and Flush Subsystem will treat the waste for a more favorable waste transfer. This will be accomplished by diluting the waste, dissolving the soluble portion of the waste, and flushing waste residuals from the transfer line. The Diluent and Flush Subsystem will consist of the following: the Diluent and Flush Station(s), where chemicals will be off-loaded, temporarily stored, mixed as necessary, heated, and metered to the delivery system; a piping delivery system to deliver the chemicals to the appropriate valve or pump pit; and associated support structures. This specification is intended to be the basis for new projects/installations. This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program.

  7. A shell-model calculation in terms of correlated subsystems

    Boisson, J.P.; Silvestre-Brac, B.

    1979-01-01

    A method for solving the shell-model equations in terms of a basis which includes correlated subsystems is presented. It is shown that the method allows drastic truncations of the basis to be made. The corresponding calculations are easy to perform and can be carried out rapidly

  8. Compliance with NRC subsystem requirements in the repository licensing process

    Minwalla, H.

    1994-01-01

    Section 121 of the Nuclear Waste Policy Act of 1982 requires the Nuclear Regulatory Commission (Commission) to issue technical requirements and criteria, for the use of a system of multiple barriers in the design of the repository, that are not inconsistent with any comparable standard promulgated by the Environmental Protection Agency (EPA). The Administrator of the EPA is required to promulgate generally applicable standards for protection of the general environment from offsite releases from radioactive material in repositories. The Commission's regulations pertaining to geologic repositories are provided in 10 CFR part 60. The Commission has provided in 10 CFR 60.112 the overall post-closure system performance objective which is used to demonstrate compliance with the EPA high-level waste (HLW) disposal standard. In addition, the Commission has provided, in 10 CFR 60.113, subsystem performance requirements for substantially complete containment, fractional release rate, and groundwater travel time; however, none of these subsystem performance requirements have a causal technical nexus with the EPA HLW disposal standard. This paper examines the issue of compliance with the conflicting dual regulatory role of subsystem performance requirements in the repository licensing process and recommends several approaches that would appropriately define the role of subsystem performance requirements in the repository licensing process

  9. Charactering lidar optical subsystem using four quadrants method

    Tian, Xiaomin; Liu, Dong; Xu, Jiwei; Wang, Zhenzhu; Wang, Bangxin; Wu, Decheng; Zhong, Zhiqing; Xie, Chenbo; Wang, Yingjian

    2018-02-01

    Lidar is a kind of active optical remote sensing instrument that can be used to sound the atmosphere with high spatial and temporal resolution. Many atmospheric parameters can be retrieved from the lidar backscatter signal using different inversion algorithms. The basic setup of a lidar consists of a transmitter and a receiver. To ensure the quality of the lidar signal data, the lidar must be calibrated before being used to measure atmospheric variables. Characterizing and analyzing the lidar optical subsystem is important, because a well-aligned optical subsystem contributes to high-quality lidar signal data. We focus on the telecover test to characterize and analyze the lidar optical subsystem. The telecover test, also called the four quadrants method, consists in dividing the telescope aperture into four quadrants. When a lidar optical subsystem is well configured, the normalized signals from the four quadrants agree with each other to some level. Testing our WARL-II lidar with the four quadrants method, we find the signals of the four quadrants basically consistent with each other in both the near range and the far range. In detail, however, the near-range signals show some slight differences resulting from the overlap function, and some signal differences are induced by atmospheric instability.
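    The quadrant comparison can be sketched as follows (an illustrative implementation, not the WARL-II processing chain): each quadrant's range profile is normalized over a reference range window, and the bin-wise relative spread of the normalized signals quantifies how well the quadrants agree.

    ```python
    def normalize(signal, i0, i1):
        # Normalize a quadrant's range profile by its mean over bins [i0, i1),
        # removing the unknown per-quadrant gain factor.
        m = sum(signal[i0:i1]) / (i1 - i0)
        return [s / m for s in signal]

    def max_relative_deviation(quadrants, i0, i1):
        """Telecover check: the normalized profiles of the four quadrants
        should agree; returns the worst bin-wise relative spread."""
        norm = [normalize(q, i0, i1) for q in quadrants]
        worst = 0.0
        for k in range(len(norm[0])):
            vals = [q[k] for q in norm]
            mean = sum(vals) / len(vals)
            worst = max(worst, (max(vals) - min(vals)) / mean)
        return worst
    ```

    Quadrant profiles that differ only by a gain factor give zero deviation after normalization; a near-range discrepancy in one quadrant (e.g., from the overlap function) shows up as a large spread in the affected bins.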

  10. Interoperability after deployment: persistent challenges and regional strategies in Denmark.

    Kierkegaard, Patrick

    2015-04-01

    The European Union has identified Denmark as one of the countries that have the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between their public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities concerning the future of eHealth implementations in hospitals took place, which granted the Danish regions full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  11. UMTS network planning, optimization, and inter-operation with GSM

    Rahnema, Moe

    2008-01-01

    UMTS Network Planning, Optimization, and Inter-Operation with GSM is an accessible, one-stop reference to help engineers effectively reduce the time and costs involved in UMTS deployment and optimization. Rahnema includes detailed coverage from both a theoretical and practical perspective on the planning and optimization aspects of UMTS, and a number of other new techniques to help operators get the most out of their networks. Provides an end-to-end perspective, from network design to optimization. Incorporates the hands-on experiences of numerous researchers. Single

  12. Creating XML/PHP Interface for BAN Interoperability.

    Fragkos, Vasileios; Katzis, Konstantinos; Despotou, Georgios

    2017-01-01

    Recent advances in medical and electronic technologies have introduced the use of Body Area Networks as a part of e-health, for constant and accurate monitoring of patients and the transmission as well as processing of the data to develop a holistic Electronic Health Record. The rising global population, different BAN manufacturers and a variety of medical systems pose the issue of interoperability between BANs and systems as well as the proper way to propagate medical data in an organized and efficient manner. In this paper, we describe BANs and propose the use of certain web technologies to address this issue.

  13. Characterization of the power and efficiency of Stirling engine subsystems

    García, D.; González, M.A.; Prieto, J.I.; Herrero, S.; López, S.; Mesonero, I.; Villasante, C.

    2014-01-01

    Highlights: • We review experimental data from a V160 engine developed for cogeneration. • We also investigate the V161 solar engine. • The possible margin of improvement is evaluated for each subsystem. • The procedure is based on similarity models and thermodynamic models. • The procedure may be of general interest for other prototypes. - Abstract: The development of systems based on Stirling machines is limited by the lack of data about the performance of the various subsystems that are located between the input and output power sections. The measurement of some of the variables used to characterise these internal subsystems presents difficulties, particularly in the working gas circuit and the drive mechanism, which causes experimental reports to rarely be comprehensive enough for analysing the whole performance of the machine. In this article, we review experimental data from a V160 engine developed for cogeneration to evaluate the general validity; we also investigate one of the most successful prototypes used in dish-Stirling systems, the V161 engine, for which a seemingly small mechanical efficiency value has been recently predicted. The procedure described in this article allows the possible margin of improvement to be evaluated for each subsystem. The procedure is based on similarity models, which have been previously developed through experimental data from very different prototypes. Thermodynamic models for the gas circuit are also considered. Deduced characteristic curves show that both prototypes have an advanced degree of development as evidenced by relatively high efficiencies for each subsystem. The analyses are examples that demonstrate the qualities of dimensionless numbers in representing physical phenomena with maximum generality and physical meaning
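
    The dimensionless characterization referred to above can be illustrated with two classic Stirling-engine similarity numbers, the Beale and West numbers. These are standard in the literature but are not necessarily the specific similarity models developed in this article, and the numeric inputs below are purely illustrative, not measured V160/V161 data.

```python
def beale_number(power_w, p_mean_pa, freq_hz, v_swept_m3):
    """Beale number: brake power normalized by mean pressure,
    cycle frequency and swept volume (dimensionless)."""
    return power_w / (p_mean_pa * freq_hz * v_swept_m3)

def west_number(power_w, p_mean_pa, freq_hz, v_swept_m3, t_hot_k, t_cold_k):
    """West number: Beale number corrected for the working-space
    temperatures, allowing fairer comparison across prototypes."""
    return (beale_number(power_w, p_mean_pa, freq_hz, v_swept_m3)
            * (t_hot_k + t_cold_k) / (t_hot_k - t_cold_k))

# Illustrative figures only: 9 kW brake power, 12 MPa mean pressure,
# 30 Hz cycle frequency, 160 cm^3 swept volume, 900 K / 330 K sources
bn = beale_number(9e3, 12e6, 30.0, 160e-6)
wn = west_number(9e3, 12e6, 30.0, 160e-6, 900.0, 330.0)
print(round(bn, 4), round(wn, 4))
```

Because both numbers strip out size, pressure and speed, they represent the physical phenomena with the generality the abstract attributes to dimensionless analysis.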

  14. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform carried out to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governance aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l´Ambiente dell´Emilia-Romagna) and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs this system. The presentation will introduce and discuss the technological barriers to interoperability as well as the social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  15. Design and Implement AN Interoperable Internet of Things Application Based on AN Extended Ogc Sensorthings Api Standard

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a large number of customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. As the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices.
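
    The abstract does not reproduce the schema of the Tasking Capability Description, but its role can be sketched: a JSON document advertises the parameters a device accepts, and a service validates incoming tasks against it. All field names below are illustrative assumptions, not the published profile.

```python
import json

# Hypothetical Tasking Capability Description for a smart lamp;
# the field names are illustrative, not the standardized schema.
tasking_capability = {
    "name": "SetLampState",
    "description": "Switch the lamp and set its brightness",
    "taskingParameters": [
        {"name": "power", "type": "boolean"},
        {"name": "brightness", "type": "integer", "minimum": 0, "maximum": 100},
    ],
}
document = json.dumps(tasking_capability, indent=2)  # what a service would publish

def validate_task(capability, task):
    """Check a task's parameters against the capability description."""
    allowed = {p["name"]: p for p in capability["taskingParameters"]}
    for name, value in task.items():
        spec = allowed.get(name)
        if spec is None:
            return False
        if spec["type"] == "integer":
            if not isinstance(value, int):
                return False
            if not spec.get("minimum", value) <= value <= spec.get("maximum", value):
                return False
        elif spec["type"] == "boolean" and not isinstance(value, bool):
            return False
    return True

print(validate_task(tasking_capability, {"power": True, "brightness": 80}))
print(validate_task(tasking_capability, {"brightness": 150}))
```

Because the description travels with the device, a client can discover and validate tasks without hard-coding a vendor protocol, which is the interoperability gain the paper argues for.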

  16. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) so that they can share data and control each other, improving existing processes and making people's lives better. IoT aims to connect all physical devices, like fridges, cars, utilities, buildings and cities, so that they can take advantage of the small pieces of information collected by each of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of varying vendor support, connectivity options and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other's capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability and management in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  17. Sociotechnical Challenges of Developing an Interoperable Personal Health Record

    Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.

    2011-01-01

    Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, as gathered over a 28 week period of developing and adapting a vendor-based PHR at Lucile Packard Children’s Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during the data collection process which were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between hospital and software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, the process of designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373

  18. Interoperability prototype between hospitals and general practitioners in Switzerland.

    Alves, Bruno; Müller, Henning; Schumacher, Michael; Godel, David; Abu Khaled, Omar

    2010-01-01

    Interoperability in data exchange has the potential to improve the care processes and decrease costs of the health care system. Many countries have related eHealth initiatives in preparation or already implemented. In this area, Switzerland has yet to catch up. Its health system is fragmented, because of the federated nature of cantons. It is thus more difficult to coordinate efforts between the existing healthcare actors. In the Medicoordination project a pragmatic approach was selected: integrating several partners in healthcare on a regional scale in French speaking Switzerland. In parallel with the Swiss eHealth strategy, currently being elaborated by the Swiss confederation, particularly medium-sized hospitals and general practitioners were targeted in Medicoordination to implement concrete scenarios of information exchange between hospitals and general practitioners with a high added value. In this paper we focus our attention on a prototype implementation of one chosen scenario: the discharge summary. Although simple in concept, exchanging release letters shows small, hidden difficulties due to the multi-partner nature of the project. The added value of such a prototype is potentially high and it is now important to show that interoperability can work in practice.

  19. Grid Interoperation with ARC middleware for the CMS experiment

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva; Field, Laurence; Qing, Di; Frey, Jaime; Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  20. Grid Interoperation with ARC middleware for the CMS experiment

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  1. Grid Interoperation with ARC Middleware for the CMS Experiment

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developi...

  2. Interoperability in planetary research for geospatial data analysis

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  3. PyMOOSE: interoperable scripting in Python for MOOSE

    Subhasis Ray

    2008-12-01

    Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE. MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, and Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.
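
    As an illustration of the kind of online analysis of simulator output described above, the following stdlib-only sketch detects upward threshold crossings in a membrane-potential trace. It deliberately avoids the actual MOOSE/PyMOOSE API; the trace and threshold are synthetic.

```python
import math

def spike_times(t, v, threshold=0.5):
    """Return the times of upward threshold crossings in a trace,
    the kind of post-processing a PyMOOSE script might run online."""
    return [t[i] for i in range(1, len(v))
            if v[i - 1] < threshold <= v[i]]

# Synthetic trace: a 50 Hz oscillation sampled at 10 kHz for 0.1 s
dt = 1e-4
t = [i * dt for i in range(1000)]
v = [math.sin(2 * math.pi * 50 * ti) for ti in t]
print(spike_times(t, v))
```

In a real PyMOOSE session the trace would come from the simulator's recording objects instead of a synthetic sine wave, but the analysis side stays ordinary Python.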

  4. Processing biological literature with customizable Web services supporting interoperable formats.

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
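
    As a rough illustration of one of the interchange formats discussed, the sketch below builds a minimal BioC-style XML collection with Python's standard library. The element layout follows BioC conventions (collection, document, passage, annotation) but is abbreviated and not a complete, schema-valid document.

```python
import xml.etree.ElementTree as ET

def bioc_collection(source, doc_id, text, annotations):
    """Build a minimal BioC-style collection with one document, one
    passage and a list of (id, type, offset, length) annotations."""
    coll = ET.Element("collection")
    ET.SubElement(coll, "source").text = source
    doc = ET.SubElement(coll, "document")
    ET.SubElement(doc, "id").text = doc_id
    passage = ET.SubElement(doc, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    for ann_id, ann_type, offset, length in annotations:
        ann = ET.SubElement(passage, "annotation", id=ann_id)
        infon = ET.SubElement(ann, "infon", key="type")
        infon.text = ann_type
        ET.SubElement(ann, "location", offset=str(offset), length=str(length))
        ET.SubElement(ann, "text").text = text[offset:offset + length]
    return coll

coll = bioc_collection("demo", "d1", "Glucose binds hexokinase.",
                       [("T1", "chemical", 0, 7), ("T2", "protein", 14, 10)])
print(ET.tostring(coll, encoding="unicode")[:80])
```

Stand-off offsets like these are what let a pipeline pass the same text through several analytics and merge their annotations without re-tokenizing.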

  5. Designing for Change: Interoperability in a scaling and adapting environment

    Yarmey, L.

    2015-12-01

    The Earth Science cyberinfrastructure landscape is constantly changing. Technologies advance and technical implementations are refined or replaced. Data types, volumes, packaging, and use cases evolve. Scientific requirements emerge and mature. Standards shift while systems scale and adapt. In this complex and dynamic environment, interoperability remains a critical component of successful cyberinfrastructure. Through the resource- and priority-driven iterations on systems, interfaces, and content, questions fundamental to stable and useful Earth Science cyberinfrastructure arise. For instance, how are sociotechnical changes planned, tracked, and communicated? How should operational stability balance against 'new and shiny'? How can ongoing maintenance and mitigation of technical debt be managed in an often short-term resource environment? The Arctic Data Explorer is a metadata brokering application developed to enable discovery of international, interdisciplinary Arctic data across distributed repositories. Completely dependent on interoperable third party systems, the Arctic Data Explorer publicly launched in 2013 with an original 3000+ data records from four Arctic repositories. Since then the search has scaled to 25,000+ data records from thirteen repositories at the time of writing. In the final months of original project funding, priorities shift to lean operations with a strategic eye on the future. Here we present lessons learned from four years of Arctic Data Explorer design, development, communication, and maintenance work along with remaining questions and potential directions.

  6. Language interoperability for high-performance parallel scientific components

    Elliot, N; Kohn, S; Smolinski, B

    1999-01-01

    With the increasing complexity and interdisciplinary nature of scientific applications, code reuse is becoming increasingly important in scientific computing. One method for facilitating code reuse is the use of component technologies, which have been used widely in industry. However, components have only recently worked their way into scientific computing. Language interoperability is an important underlying technology for these component architectures. In this paper, we present an approach to language interoperability for a high-performance parallel component architecture being developed by the Common Component Architecture (CCA) group. Our approach is based on Interface Definition Language (IDL) techniques. We have developed a Scientific Interface Definition Language (SIDL), as well as bindings to C and Fortran. We have also developed a SIDL compiler and run-time library support for reference counting, reflection, object management, and exception handling (Babel). Results from using Babel to call a standard numerical solver library (written in C) from C and Fortran show that the cost of using Babel is minimal, whereas the savings in development time and the benefits of object-oriented development support for C and Fortran far outweigh the costs.

  7. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Herbert Kubicek

    2008-12-01

    Full Text Available High quality and comfortable online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, but also content-related semantic aspects such as identifiers and the meaning of codes as well as organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in E-Government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance which can be used for future comparative research regarding success factors, barriers etc.

  8. Enabling Interoperable and Selective Data Sharing among Social Networking Sites

    Shin, Dongwan; Lopes, Rodrigo

    With the widespread use of social networking (SN) sites and even the introduction of a social component in non-social oriented services, there is a growing concern over user privacy in general, and over how to handle and share user profiles across SN sites in particular. Although there have been several proprietary or open source-based approaches to unifying the creation of third party applications, the availability and retrieval of user profile information are still limited to the site where the third party application is run, mostly devoid of support for data interoperability. In this paper we propose an approach to enabling interoperable and selective data sharing among SN sites. To support selective data sharing, we discuss an authenticated dictionary (ADT)-based credential which enables a user to share only a subset of her information certified by external SN sites with applications running on an SN site. For interoperable data sharing, we propose an extension to the OpenSocial API so that it can provide an open source-based framework for allowing the ADT-based credential to be used seamlessly among different SN sites.
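
    The selective-disclosure idea behind an authenticated-dictionary credential can be illustrated with a simple Merkle hash tree: the certifying site signs a single root over the whole profile, and the user later reveals individual attributes with membership proofs. This is a generic sketch of the technique, not the authors' construction.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf(key, value):
    return h(f"{key}={value}".encode())

def merkle_root(leaves):
    """Root of a binary hash tree (odd last nodes promoted unchanged)."""
    level = leaves
    while len(level) > 1:
        nxt = [h(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:
            nxt.append(level[-1])
        level = nxt
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes (with left/right flags) to recompute the root."""
    proof, level, i = [], leaves, index
    while len(level) > 1:
        sib = i ^ 1
        if sib < len(level):
            proof.append((level[sib], sib < i))
        nxt = [h(level[j] + level[j + 1]) for j in range(0, len(level) - 1, 2)]
        if len(level) % 2:
            nxt.append(level[-1])
        level, i = nxt, i // 2
    return proof

def verify(leaf_hash, proof, root):
    acc = leaf_hash
    for sibling, is_left in proof:
        acc = h(sibling + acc) if is_left else h(acc + sibling)
    return acc == root

# The certifying site commits to the full profile via one root; the
# user then discloses only "age", with a proof, to a third party.
profile = [("name", "alice"), ("age", "30"), ("email", "a@example.org")]
leaves = [leaf(k, v) for k, v in profile]
root = merkle_root(leaves)
print(verify(leaf("age", "30"), merkle_proof(leaves, 1), root))
```

The verifier learns nothing about the undisclosed attributes beyond their hashed presence, which is exactly the selective-sharing property the abstract describes.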

  9. Interoperability Assets for Patient Summary Components: A Gap Analysis.

    Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine

    2018-01-01

    The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary, while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillum-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and joint understanding that will improve quality of standards and lower costs of interoperability.

  10. Using software interoperability to achieve a virtual design environment

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited by many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.
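
    The tolerance-analysis role played by the mathematical engine in such a tool chain can be sketched generically: perturb alignment parameters within their tolerance bands and collect the resulting performance distribution. In the sketch below, `simulate_flux` is a hypothetical stand-in for a ray-trace call into the optical analysis tool, with a toy response model.

```python
import random
import statistics

def simulate_flux(tilt_deg, decenter_mm):
    """Stand-in for a ray-trace call into an optical analysis tool:
    a toy model where flux drops quadratically with misalignment."""
    return 100.0 * (1 - 0.02 * tilt_deg ** 2 - 0.05 * decenter_mm ** 2)

def tolerance_run(n=2000, tilt_tol=0.5, decenter_tol=0.2, seed=1):
    """Monte Carlo tolerance analysis: sample each parameter uniformly
    within its tolerance band and collect the flux distribution."""
    rng = random.Random(seed)
    fluxes = [simulate_flux(rng.uniform(-tilt_tol, tilt_tol),
                            rng.uniform(-decenter_tol, decenter_tol))
              for _ in range(n)]
    return statistics.mean(fluxes), statistics.pstdev(fluxes)

mean_flux, sd_flux = tolerance_run()
print(f"mean flux {mean_flux:.2f}, sd {sd_flux:.3f}")
```

In the integrated environment described in the paper, the loop body would drive the CAD model and re-run the ray trace through the tools' communication interfaces rather than evaluate a closed-form toy model.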

  11. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    R. Roller

    2015-08-01

    Full Text Available Floods represent a permanent risk to the Netherlands in general and to her power supply in particular. Data sharing is essential within this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
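
    The core Linked Data idea, interlinking data points as subject-predicate-object triples, can be sketched with the standard library alone. The URIs and vocabulary below are illustrative, not the study's actual model.

```python
# Minimal sketch of Linked Data as subject-predicate-object triples,
# serialized in N-Triples style; entities and predicates are invented.
EX = "http://example.org/"

triples = [
    (EX + "Substation12", EX + "locatedIn", EX + "DikeRingArea14"),
    (EX + "Substation12", EX + "supplies", EX + "Hospital3"),
    (EX + "DikeRingArea14", EX + "floodRisk", '"high"'),
]

def to_ntriples(triples):
    """Serialize triples; literals (already quoted) stay unbracketed."""
    lines = []
    for s, p, o in triples:
        obj = o if o.startswith('"') else f"<{o}>"
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

def objects(triples, subject, predicate):
    """Follow a link: all objects for a given subject and predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(to_ntriples(triples))
print(objects(triples, EX + "Substation12", EX + "supplies"))
```

Because every node is a web-resolvable URI, a second stakeholder can extend this graph with its own triples about `Hospital3` without any prior schema agreement, which is the interoperability gain the paper argues for.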

  12. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

    The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The Information Model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.

  13. Evaluation of Interoperability Protocols in Repositories of Electronic Theses and Dissertations

    Hakimjavadi, Hesamedin; Masrek, Mohamad Noorman

    2013-01-01

    Purpose: The purpose of this study is to evaluate the status of eight interoperability protocols within repositories of electronic theses and dissertations (ETDs) as an introduction to further studies on feasibility of deploying these protocols in upcoming areas of interoperability. Design/methodology/approach: Three surveys of 266 ETD…

  14. Enabling interoperability-as-a-service for connected IoT infrastructures and Smart Objects

    Hovstø, Asbjørn; Guan, Yajuan; Quintero, Juan Carlos Vasquez

    2018-01-01

    Lack of interoperability is considered as the most important barrier to achieve the global integration of Internet-of-Things (IoT) ecosystems across borders of different disciplines, vendors and standards. Indeed, the current IoT landscape consists of a large set of non-interoperable infrastructu...

  15. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  16. 76 FR 4102 - Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference

    2011-01-24

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM11-2-000] Smart Grid Interoperability Standards; Supplemental Notice of Technical Conference January 13, 2011. On December 21, 2010, the Federal Energy Regulatory Commission announced that a Technical Conference on Smart Grid Interoperability Standards will be held on Monday...

  17. Watershed and Economic Data InterOperability (WEDO): Facilitating Discovery, Evaluation and Integration through the Sharing of Watershed Modeling Data

    Watershed and Economic Data InterOperability (WEDO) is a system of information technologies designed to publish watershed modeling studies for reuse. WEDO facilitates three aspects of interoperability: discovery, evaluation and integration of data. This increased level of interop...

  18. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results by making it difficult to determine the specifics of the intended meanings of terms such as deforestation or carbon flux, as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might just include CO2 emissions, but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO-foundry ontologies such as ENVO and BCO; the ISO/OGC O&M; and the NSF Earthcube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own

  19. The Influence of Information Systems Interoperability on Economic Activity in Poland

    Ganczar Małgorzata

    2017-12-01

    Full Text Available In the text, I discuss the capabilities and challenges of information systems interoperability. The anticipated and expected result of interoperability is to improve the provision of public utility services to citizens and companies by means of facilitating the provision of public utility services on the basis of a "single window" principle and reducing the costs incurred by public administrations, companies, and citizens through more efficient provision of public utility services. In the article, the conceptual framework of interoperability is elaborated upon. Moreover, information systems and public registers for entrepreneurs in Poland exemplify whether interoperability may be applied and, if so, whether interoperability fulfils its targets to the extent of e-Government services for entrepreneurs.

  20. D-ATM, a working example of health care interoperability: From dirt path to gravel road.

    DeClaris, John-William

    2009-01-01

    For many years, there have been calls for interoperability within health care systems. The technology currently exists and is being used in business areas like banking and commerce, to name a few. Yet the question remains, why has interoperability not been achieved in health care? This paper examines issues encountered and success achieved with interoperability during the development of the Digital Access To Medication (D-ATM) project, sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). D-ATM is the first government funded interoperable patient management system. The goal of this paper is to provide lessons learned and propose one possible road map for health care interoperability within private industry and how government can help.

  1. National Ignition Facility subsystem design requirements optical mounts SSDR 1.4.4

    Richardson, M.

    1996-01-01

    This SSDR establishes the performance, design, development and test requirements for NIF Beam Transport Optomechanical Subsystems. Optomechanical Subsystems include the mounts for the beam transport mirrors LM1-LM8, the polarizer mount, and the spatial filter lens mounts

  2. Double Shell Tank (DST) Transfer Pump Subsystem Specification

    GRAVES, C.E.

    2001-01-01

    This specification establishes the performance requirements and provides the references to the requisite codes and standards to be applied during the design of the Double-Shell Tank (DST) Transfer Pump Subsystem that supports the first phase of waste feed delivery (WFD). The DST Transfer Pump Subsystem consists of a pump for supernatant and/or slurry transfer for the DSTs that will be retrieved during the Phase 1 WFD operations. This system is used to transfer low-activity waste (LAW) and high-level waste (HLW) to designated DST staging tanks. It also will deliver blended LAW and HLW feed from these staging tanks to the River Protection Project (RPP) Waste Treatment Plant where it will be processed into an immobilized waste form. This specification is intended to be the basis for new projects/installations (W-521, etc.). This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program

  3. Recovering Intrinsic Fragmental Vibrations Using the Generalized Subsystem Vibrational Analysis.

    Tao, Yunwen; Tian, Chuan; Verma, Niraj; Zou, Wenli; Wang, Chao; Cremer, Dieter; Kraka, Elfi

    2018-05-08

    Normal vibrational modes are generally delocalized over the molecular system, which makes it difficult to assign certain vibrations to specific fragments or functional groups. We introduce a new approach, the Generalized Subsystem Vibrational Analysis (GSVA), to extract the intrinsic fragmental vibrations of any fragment/subsystem from the whole system via the evaluation of the corresponding effective Hessian matrix. The retention of the curvature information with regard to the potential energy surface for the effective Hessian matrix endows our approach with a concrete physical basis and enables the normal vibrational modes of different molecular systems to be legitimately comparable. Furthermore, the intrinsic fragmental vibrations act as a new link between the Konkoli-Cremer local vibrational modes and the normal vibrational modes.
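The effective-Hessian idea can be sketched numerically. Assuming, as one common construction (not necessarily the exact GSVA definition), that the subsystem's effective Hessian is the Schur complement of the environment block of the full Hessian, fragment frequencies would follow from diagonalizing the mass-weighted result:

```python
import numpy as np

def effective_hessian(H, sub):
    """Schur complement of the environment block: H_ss - H_se H_ee^{-1} H_es.
    H is the full Hessian; sub lists the subsystem's coordinate indices.
    This is a hypothetical sketch of an effective-Hessian construction,
    not necessarily the exact GSVA formulation."""
    env = [i for i in range(H.shape[0]) if i not in sub]
    H_ss = H[np.ix_(sub, sub)]
    H_se = H[np.ix_(sub, env)]
    H_ee = H[np.ix_(env, env)]
    return H_ss - H_se @ np.linalg.solve(H_ee, H_se.T)

# Toy 2-coordinate system: subsystem coordinate 0 coupled to environment 1.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])
H_eff = effective_hessian(H, [0])
# Coupling to the environment softens the subsystem curvature:
# 2.0 - 0.5 * 0.5 / 1.0 = 1.75
```

The key property this captures is the one the abstract emphasizes: the reduced matrix retains curvature information about the full potential energy surface, so subsystem modes from different molecular environments remain comparable.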

  4. Measurement system as a subsystem of the quality management system

    Ľubica Floreková

    2006-12-01

    Full Text Available Each measurement system and control principle must be based on certain facts about the system's behaviour (what), operation (how) and structure (why). Each system is divided into subsystems, each providing an input for the next subsystem. For each system the starting point is important: the system characteristics, the collection of data, its hierarchy and the distribution of processes. A measurement system (based on chapter 8 of the standard ISO 9001:2000, Quality management systems: Requirements) defines measurement, analysis and improvement for each organization in order to demonstrate product conformity, to guarantee the conformity of the quality management system, and to continuously improve the effectiveness, efficiency and economy of the quality management system.

  5. Subsystem software for TSTA [Tritium Systems Test Assembly

    Mann, L.W.; Claborn, G.W.; Nielson, C.W.

    1987-01-01

    The Subsystem Control Software at the Tritium System Test Assembly (TSTA) must control sophisticated chemical processes through the physical operation of valves, motor controllers, gas sampling devices, thermocouples, pressure transducers, and similar devices. Such control software has to be capable of passing stringent quality assurance (QA) criteria to provide for the safe handling of significant amounts of tritium on a routine basis. Since many of the chemical processes and physical components are experimental, the control software has to be flexible enough to allow for a trial-and-error learning curve, but must still protect the environment and personnel from exposure to unsafe levels of radiation. The software at TSTA is implemented in several levels, as described in a preceding paper in these proceedings, on which this paper depends for understanding. The top level is the Subsystem Control level

  6. 76 FR 66040 - NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft...

    2011-10-25

    ...-01] NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Draft... draft version of the NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0... Roadmap for Smart Grid Interoperability Standards, Release 2.0 (Release 2.0) (Draft) for public review and...

  7. LightNVM: The Linux Open-Channel SSD Subsystem

    Bjørling, Matias; Gonzalez, Javier; Bonnet, Philippe

    2017-01-01

    resource utilization. We propose that SSD management trade-offs should be handled through Open-Channel SSDs, a new class of SSDs that give hosts control over their internals. We present our experience building LightNVM, the Linux Open-Channel SSD subsystem. We introduce a new Physical Page Address I...... to limit read latency variability and that it can be customized to achieve predictable I/O latencies.

  8. CAMAC subsystem and user context utilities in ngdp framework

    Isupov, A.Yu.

    2010-01-01

    Advanced topics of the ngdp framework are described. Namely, we consider work with CAMAC hardware, 'selfflow' nodes for data acquisition systems with the As-Soon-As-Possible policy, ng_mm(4) as an alternative to ng_socket(4), the control subsystem, user context utilities, event representation for the ROOT package, test and debug nodes, possible advancements for netgraph(4), etc. It is shown that ngdp is suitable for building lightweight DAQ systems to handle CAMAC

  9. Subsystem for control of isotope production with linear electron accelerator

    Karasyov, S P; Uvarov, V L

    2001-01-01

    In this report the high-current LINAC subsystem for diagnostics and monitoring of the basic technological parameters of isotope production (energy flux of Bremsstrahlung photons and absorbed dose in the target, target activity, temperature and consumption of the water cooling the converter and target) is described. The parallel printer port (LPT) of a personal computer is proposed for use as an interface with the measurement channels.

  10. Photovoltaic subsystem optimization and design tradeoff study. Final report

    Stolte, W.J.

    1982-03-01

    Tradeoffs and subsystem choices are examined in photovoltaic array subfield design, power-conditioning sizing and selection, roof- and ground-mounted structure installation, energy loss, operating voltage, power conditioning cost, and subfield size. Line- and self-commutated power conditioning options are analyzed to determine the most cost-effective technology in the megawatt power range. Methods for reducing field installation of flat panels and roof mounting of intermediate load centers are discussed, including the cost of retrofit installations.

  11. Attitude Control Subsystem for the Advanced Communications Technology Satellite

    Hewston, Alan W.; Mitchell, Kent A.; Sawicki, Jerzy T.

    1996-01-01

    This paper provides an overview of the on-orbit operation of the Attitude Control Subsystem (ACS) for the Advanced Communications Technology Satellite (ACTS). The three ACTS control axes are defined, including the means for sensing attitude and determining the pointing errors. The desired pointing requirements for various modes of control as well as the disturbance torques that oppose the control are identified. Finally, the hardware actuators and control loops utilized to reduce the attitude error are described.

  12. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    King, D.A.

    1994-01-01

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  13. FireSignal application Node for subsystem control

    Duarte, A.S.; Santos, B.; Pereira, T.; Carvalho, B.B.; Fernandes, H.; Neto, A.; Janky, F.; Cahyna, P.; Pisacka, J.; Hron, M.

    2010-01-01

    Modern fusion experiments require the presence of several subsystems, responsible for the different parameters involved in the operation of the machine. With the migration from the pre-programmed to the real-time control paradigm, their integration in Control, Data Acquisition, and Communication (CODAC) systems became an important issue, as this implies not only the connection to a main central coordination system, but also communications with related diagnostics and actuators. A subsystem for the control and operation of the vacuum, gas injection and baking was developed and installed in the COMPASS tokamak. These tasks are performed by dsPIC microcontrollers that receive commands from a hub computer and send information regarding the status of the operation. Communications are done in the serial protocol RS-232 through fibre optics. Java software, with an intuitive graphical user interface, for controlling and monitoring of the subsystem was developed and installed in a hub computer. In order to allow operators to perform these tasks remotely besides locally, this was integrated in the FireSignal system. Taking advantage of FireSignal features, it was possible to provide the users with not only the same functionality of the local application, but also a similar user interface. An independent FireSignal Java Node bridges the central server and the control application. This design makes it possible to easily reuse the Node for other subsystems or to integrate the vacuum slow control in the other CODAC systems. The complete system, with local and remote control, has been installed successfully on COMPASS and has been in operation since April this year.

  14. Frozen density embedding with non-integer subsystems' particle numbers.

    Fabiano, Eduardo; Laricchia, Savio; Della Sala, Fabio

    2014-03-21

    We extend the frozen density embedding theory to non-integer subsystems' particle numbers. Different features of this formulation are discussed, with special concern for approximate embedding calculations. In particular, we highlight the relation between the non-integer particle-number partition scheme and the resulting embedding errors. Finally, we provide a discussion of the implications of the present theory for the derivative discontinuity issue and the calculation of chemical reactivity descriptors.

  15. Stability of subsystem solutions in agent-based models

    Perc, Matjaž

    2018-01-01

    The fact that relatively simple entities, such as particles or neurons, or even ants or bees or humans, give rise to fascinatingly complex behaviour when interacting in large numbers is the hallmark of complex systems science. Agent-based models are frequently employed for modelling and obtaining a predictive understanding of complex systems. Since the sheer number of equations that describe the behaviour of an entire agent-based model often makes it impossible to solve such models exactly, Monte Carlo simulation methods must be used for the analysis. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among agents that describe systems in biology, sociology or the humanities often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. This raises the question: when can we be certain that an observed simulation outcome of an agent-based model is actually stable and valid in the large system-size limit? The latter is key for the correct determination of phase transitions between different stable solutions, and for the understanding of the underlying microscopic processes that led to these phase transitions. We show that a satisfactory answer can only be obtained by means of a complete stability analysis of subsystem solutions. A subsystem solution can be formed by any subset of all possible agent states. The winner between two subsystem solutions can be determined by the average moving direction of the invasion front that separates them, yet it is crucial that the competing subsystem solutions are characterised by a proper composition and spatiotemporal structure before the competition starts. We use the spatial public goods game with diverse tolerance as an example, but the approach has relevance for a wide variety of agent-based models.
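The invasion-front criterion can be illustrated with a deliberately simplified toy model (far simpler than the spatial public goods game the abstract uses): two "subsystem solutions" occupy the two halves of a one-dimensional lattice, sites deterministically imitate the neighbouring state with the highest payoff, and the sign of the front's average moving direction decides the winner. The payoff values are hypothetical.

```python
# Toy invasion-front competition between two subsystem solutions A and B
# on a 1D lattice. Each interior site imitates whichever of its three-site
# neighbourhood states has the highest (hypothetical) stationary payoff,
# so the front separating A and B drifts toward the lower-payoff solution.
PAYOFF = {"A": 1.0, "B": 0.6}

def step(lattice):
    new = list(lattice)
    for i in range(1, len(lattice) - 1):
        new[i] = max((lattice[i - 1], lattice[i], lattice[i + 1]),
                     key=lambda s: PAYOFF[s])
    return new

def front_position(lattice):
    return lattice.index("B")  # first site of the B domain

lattice = ["A"] * 10 + ["B"] * 10
start = front_position(lattice)
for _ in range(5):
    lattice = step(lattice)
moved = front_position(lattice) - start  # positive: A invades B
```

In real agent-based models the update is stochastic and the front velocity must be averaged over many Monte Carlo realizations, and, as the abstract stresses, the competing domains must first relax to their proper composition and spatiotemporal structure before the front is measured.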

  16. Subsystem for control of isotope production with linear electron accelerator

    Karasyov, S.P.; Pomatsalyuk, R.I.; Uvarov, V.L.

    2001-01-01

    In this report the high-current LINAC subsystem for diagnostics and monitoring of the basic technological parameters of isotope production (energy flux of Bremsstrahlung photons and absorbed dose in the target, target activity, temperature and consumption of the water cooling the converter and target) is described. The parallel printer port (LPT) of a personal computer is proposed for use as an interface with the measurement channels.

  17. Functional Analysis for Double Shell Tank (DST) Subsystems

    SMITH, D.F.

    2000-01-01

    This functional analysis identifies the hierarchy and describes the subsystem functions that support the Double-Shell Tank (DST) System described in HNF-SD-WM-TRD-007, System Specification for the Double-Shell Tank System. Because of the uncertainty associated with the need for upgrades of the existing catch tanks supporting the Waste Feed Delivery (WFD) mission, catch tank functions are not addressed in this document. The functions identified herein are applicable to the Phase 1 WFD mission only

  18. FireSignal Application Node for Subsystem Control

    Duarte, A.; Santos, B.; Pereira, T.; Carvalho, B.; Fernandes, H. [Instituto de Plasmas e Fusao Nuclear - Instituto Superior Tecnico, Lisbon (Portugal); Cahyna, P.; Pisacka, J.; Hron, M. [Institute of Plasma Physics AS CR, Association EURATOM/IPP.CR, Prague (Czech Republic)

    2009-07-01

    Modern fusion experiments require the presence of several sub-systems, responsible for the different parameters involved in the operation of the machine. With the migration from the pre-programmed to the real-time control paradigm, their integration in Control, Data Acquisition, and Communication (CODAC) systems became an important issue, as this implies not only the connection to a main central coordination system, but also communications with related diagnostics and actuators. A sub-system for the control and operation of the vacuum, gas injection and baking was developed and installed in the COMPASS tokamak. These tasks are performed by 'dsPIC' micro-controllers that receive commands from a computer and send information regarding the status of the operation. Communications are done in the serial protocol RS-232 through fibre optics at speeds up to 1 Mbaud. Java software with an intuitive graphical user interface for controlling and monitoring the sub-system was developed and installed in a hub computer. In order to allow operators to perform these tasks remotely besides locally, this was integrated in the FireSignal system. Taking advantage of FireSignal features, it was possible to provide the users with not only the same functionality of the local application, but also a similar user interface. An independent FireSignal Java node bridges the central server and the control application. This design makes it possible to easily reuse the node for other subsystems or to integrate the vacuum slow control in the other CODAC systems. This document is composed of an abstract and a poster. (authors)

  19. Design of nanophotonic circuits for autonomous subsystem quantum error correction

    Kerckhoff, J; Pavlichin, D S; Chalabi, H; Mabuchi, H, E-mail: jkerc@stanford.edu [Edward L Ginzton Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2011-05-15

    We reapply our approach to designing nanophotonic quantum memories in order to formulate an optical network that autonomously protects a single logical qubit against arbitrary single-qubit errors. Emulating the nine-qubit Bacon-Shor subsystem code, the network replaces the traditionally discrete syndrome measurement and correction steps by continuous, time-independent optical interactions and coherent feedback of unitarily processed optical fields.
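The nine-qubit Bacon-Shor code emulated by the network can be written down compactly. In one common convention (qubits on a 3x3 grid, indexed (i,j)), the gauge group is generated by two-qubit operators, while the stabilizers are six-qubit products over pairs of rows or columns:

```latex
% Nine-qubit Bacon--Shor subsystem code on a 3x3 grid (one common convention).
\begin{align*}
\mathcal{G} &= \left\langle\, X_{i,j}X_{i+1,j},\; Z_{i,j}Z_{i,j+1} \,\right\rangle
  && \text{(two-qubit gauge generators)}\\
S^{X}_{i} &= \prod_{j=1}^{3} X_{i,j}X_{i+1,j}, \quad i=1,2
  && \text{($X$ stabilizers on row pairs)}\\
S^{Z}_{j} &= \prod_{i=1}^{3} Z_{i,j}Z_{i,j+1}, \quad j=1,2
  && \text{($Z$ stabilizers on column pairs)}
\end{align*}
```

The practical point for the continuous-monitoring scheme described above is that syndrome information can be gathered through the two-qubit gauge operators rather than by measuring six-qubit stabilizers directly, which is what makes a time-independent optical implementation plausible.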

  20. SING-dialogue subsystem for graphical representation of one-dimensional array contents

    Karlov, A.A.; Kirilov, A.S.

    1979-01-01

    General principles of organization and the main features of a dialogue subsystem for the graphical representation of one-dimensional array contents are considered. The subsystem was developed for the remote display station of the JINR BESM-6 computer. Some examples of using the subsystem for drawing curves and histograms are given. The subsystem was developed according to modern dialogue system requirements: it is "open" for extension and could be installed on other computers [ru]

  1. Static Feed Water Electrolysis Subsystem Testing and Component Development

    Koszenski, E. P.; Schubert, F. H.; Burke, K. A.

    1983-01-01

    A program was carried out to develop and test advanced electrochemical cells/modules and critical electromechanical components for a static feed (alkaline electrolyte) water electrolysis oxygen generation subsystem. The accomplishments were refurbishment of a previously developed subsystem and successful demonstration for a total of 2980 hours of normal operation; achievement of sustained one-person level oxygen generation performance with state-of-the-art cell voltages averaging 1.61 V at 191 ASF for an operating temperature of 128F (equivalent to 1.51V when normalized to 180F); endurance testing and demonstration of reliable performance of the three-fluid pressure controller for 8650 hours; design and development of a fluid control assembly for this subsystem and demonstration of its performance; development and demonstration at the single cell and module levels of a unitized core composite cell that provides expanded differential pressure tolerance capability; fabrication and evaluation of a feed water electrolyte elimination five-cell module; and successful demonstration of an electrolysis module pressurization technique that can be used in place of nitrogen gas during the standby mode of operation to maintain system pressure and differential pressures.
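The temperature normalization quoted above (1.61 V measured at 128 F, reported as equivalent to 1.51 V at 180 F) implies a linear correction of roughly 1.9 mV per degree F. A small sketch of that arithmetic, assuming a simple linear scaling (the program's actual correction method is not stated in the abstract):

```python
# Linear temperature normalization of electrolysis cell voltage.
# The coefficient is inferred from the two figures quoted in the abstract;
# the actual correction used in the program may differ.
V_MEAS, T_MEAS = 1.61, 128.0   # measured: volts at deg F
V_NORM, T_NORM = 1.51, 180.0   # reported equivalent at the reference temp

coeff = (V_MEAS - V_NORM) / (T_NORM - T_MEAS)   # volts per deg F

def normalize(v, t, t_ref=T_NORM, k=coeff):
    """Project a cell voltage measured at temperature t to t_ref."""
    return v - k * (t_ref - t)
```

Higher operating temperature lowers the cell voltage, so normalizing a 128 F measurement to 180 F subtracts about 100 mV, which is why the two quoted figures differ.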

  2. Quantitative risk analysis of a space shuttle subsystem

    Frank, M.V.

    1989-01-01

    In an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot-study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach.

  3. Double Shell Tank (DST) Transfer Pump Subsystem Specification

    LESHIKAR, G.A.

    2000-01-01

    This specification establishes the performance requirements and provides the references to the requisite codes and standards to be applied during the design of the Double-Shell Tank (DST) Transfer Pump Subsystem that supports the first phase of Waste Feed Delivery (WFD). The DST Transfer Pump Subsystem consists of a pump for supernatant and/or slurry transfer for the DSTs that will be retrieved during the Phase 1 WFD operations. This system is used to transfer low-activity waste (LAW) and high-level waste (HLW) to designated DST staging tanks. It also will deliver blended LAW and HLW feed from these staging tanks to the River Protection Project (RPP) Privatization Contractor facility where it will be processed into an immobilized waste form. This specification is intended to be the basis for new projects/installations (W-521, etc.). This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program

  4. Test Protocols for Advanced Inverter Interoperability Functions - Appendices

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as
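One of the grid-support functions covered by such object models is volt-var control, in which the inverter absorbs or injects reactive power as a piecewise-linear function of measured voltage. The core behaviour that a test protocol would exercise can be sketched as below; the curve points are hypothetical, not values taken from the standard or from any grid code.

```python
# Piecewise-linear volt-var curve evaluation, sketching the kind of
# behaviour advanced-inverter test protocols verify. Curve points are
# hypothetical. Voltage is in per-unit; Q is a fraction of available
# reactive power (positive = injection, negative = absorption).
CURVE = [(0.95, 1.0), (0.98, 0.0), (1.02, 0.0), (1.05, -1.0)]

def volt_var(v, curve=CURVE):
    """Interpolate the reactive-power setpoint for a measured voltage."""
    if v <= curve[0][0]:
        return curve[0][1]
    if v >= curve[-1][0]:
        return curve[-1][1]
    for (v0, q0), (v1, q1) in zip(curve, curve[1:]):
        if v0 <= v <= v1:
            return q0 + (q1 - q0) * (v - v0) / (v1 - v0)

# Inside the deadband (0.98-1.02 pu) the setpoint is zero; below it the
# inverter injects vars to support voltage, above it it absorbs them.
```

A compliance test would sweep the input voltage, compare the inverter's measured reactive-power output against this reference curve within a tolerance band, and check the response time of each transition.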

  5. Test Protocols for Advanced Inverter Interoperability Functions – Main Document

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already

  6. Predicting Speech Intelligibility with a Multiple Speech Subsystems Approach in Children with Cerebral Palsy

    Lee, Jimin; Hustad, Katherine C.; Weismer, Gary

    2014-01-01

    Purpose: Speech acoustic characteristics of children with cerebral palsy (CP) were examined with a multiple speech subsystems approach; speech intelligibility was evaluated using a prediction model in which acoustic measures were selected to represent three speech subsystems. Method: Nine acoustic variables reflecting different subsystems, and…

  7. Fiscal 1974 research report. General research on hydrogen energy subsystems; 1974 nendo suiso riyo subsystem sogoteki kento hokokusho

    NONE

    1975-03-01

    Based on the contract research 'General research on hydrogen energy subsystems and their peripheral technologies' with the Agency of Industrial Science and Technology, each of 7 organizations including Denki Kagaku Kyokai (Electrochemical Association) promoted research on hydrogen energy subsystems, combustion, fuel cells, car engines, aircraft engines, gas turbines and chemical energy, respectively. This report summarizes the results of the former of the 2 committees on hydrogen energy and peripheral technologies run by Denki Kagaku Kyokai. The first part describes the merits, demerits, domestic and overseas R and D status, technical problems, and future research issues for every use form of hydrogen. This part also outlines the short-, medium- and long-term prospects for the use of hydrogen and oxygen energy, and describes the overall future research issues. The second part summarizes the content of each committee report. Although the original reports of each committee should be read for details, this report is useful for obtaining an outline of the utilization of hydrogen energy. (NEDO)

  8. On DESTINY Science Instrument Electrical and Electronics Subsystem Framework

    Kizhner, Semion; Benford, Dominic J.; Lauer, Tod R.

    2009-01-01

    Future space missions are going to require large focal planes with many sensing arrays and hundreds of millions of pixels all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM) charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy - a hypothetical form of energy that permeates all of space and tends to increase the rate of the expansion. One of three JDEM concept studies - the Dark Energy Space Telescope (DESTINY) - was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework, which evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits - the ROICs/ASICs - in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of "warm" EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp (SUTR) algorithm, over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low volume telemetry handling, power conversion and for communications with the spacecraft. The paper outlines how the JDEM DESTINY concept

  9. Focus for 3D city models should be on interoperability

    Bodum, Lars; Kjems, Erik; Jaegly, Marie Michele Helena

    2006-01-01

    ...that would make it useful for other purposes than visualisation. Time has come to try to change this trend and to convince the municipalities that interoperability and semantics are important issues for the future. It is important for them to see that 3D modelling, mapping and geographic information are subjects on the same agenda towards an integrated solution for an object-oriented mapping of multidimensional geographic objects in the urban environment. Many relevant subjects could be discussed regarding these matters, but in this paper we will narrow the discussion down to the ideas behind developments in Geographical Exploration Systems. Centralized and proprietary Geographical Exploration Systems only give us their own perspective on the world. On the contrary, GRIFINOR is decentralized and available for everyone to use, empowering people to promote their own world vision.

  10. Web services for distributed and interoperable hydro-information systems

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on the output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.
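    The T-DSS prototype described above relies mainly on SOAP web services for machine-to-machine communication. As a hedged sketch of that pattern (the service namespace, operation and parameter names below are illustrative assumptions, not taken from T-DSS), a SOAP 1.1 request envelope can be composed with Python's standard library alone:

    ```python
    # Minimal sketch of composing a SOAP 1.1 request envelope, the message
    # format T-DSS-style services exchange. The service namespace and the
    # GetForecast/StationId names are hypothetical placeholders.
    import xml.etree.ElementTree as ET

    SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

    def build_soap_request(operation: str, params: dict, service_ns: str) -> bytes:
        """Return a serialized SOAP envelope invoking `operation` with `params`."""
        ET.register_namespace("soap", SOAP_NS)
        envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
        body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
        op = ET.SubElement(body, f"{{{service_ns}}}{operation}")
        for name, value in params.items():
            child = ET.SubElement(op, f"{{{service_ns}}}{name}")
            child.text = str(value)
        return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

    request = build_soap_request(
        "GetForecast", {"StationId": "CZ-0042", "Lang": "en"},
        service_ns="http://example.org/tdss",
    )
    ```

    In practice such an envelope would be POSTed to the service endpoint; building it explicitly shows why SOAP suits the paper's emphasis on output flexibility such as multilingualism (here, a `Lang` parameter).
    
    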

  11. Operational Plan Ontology Model for Interconnection and Interoperability

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    To address the bottleneck that assistant decision-making systems face in processing operational plan data and information, this paper starts from an analysis of the problems of traditional representations and the technical advantages of ontologies, then defines the elements of the operational plan ontology model and determines the basis of its construction. It then builds a semi-knowledge-level operational plan ontology model. Finally, it examines operational plan expression based on this ontology model and the use of the application software. The paper thus has theoretical significance and practical value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  12. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper comprise the characterization and examination of the potential approaches regarding interoperability. This includes openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, possibilities for their incorporation into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. The paper represents an in-depth analysis regarding the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research and charting of existing experience in the field, and sources, both electronic and written, which include interoperability concepts and related implementation issues. The paper will try to answer the following inquiries that are complementing each other: 1. Scrutiny of the potential approaches, which could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field that critically influence development and implementation of eHealth projects in an efficient manner. Provided insights and identified success factors could serve as a constituent of the strategic starting points for continuous integration of interoperability principles into the healthcare domain. 
Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system especially in the countries

  13. European Interoperability Assets Register and Quality Framework Implementation.

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were asked to evaluate the overall quality in use framework, 70% would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  14. Political, policy and social barriers to health system interoperability: emerging opportunities of Web 2.0 and 3.0.

    Juzwishin, Donald W M

    2009-01-01

    Achieving effective health informatics interoperability in a fragmented and uncoordinated health system is by definition not possible. Interoperability requires the simultaneous integration of health care processes and information across different types and levels of care (systems thinking). The fundamental argument of this paper is that information system interoperability will remain an unfulfilled hope until health reforms effectively address the governance (accountability), structural and process barriers to interoperability of health care delivery. The ascendency of Web 2.0 and 3.0, although still unproven, signals the opportunity to accelerate patients' access to health information and their health record. Policy suggestions for simultaneously advancing health system delivery and information system interoperability are posited.

  15. Interoperability of Information Systems Managed and Used by the Local Health Departments.

    Shah, Gulzar H; Leider, Jonathon P; Luo, Huabin; Kaur, Ravneet

    2016-01-01

    In the post-Affordable Care Act era marked by interorganizational collaborations and availability of large amounts of electronic data from other community partners, it is imperative to assess the interoperability of information systems used by the local health departments (LHDs). To describe the level of interoperability of LHD information systems and identify factors associated with lack of interoperability. This mixed-methods research uses data from the 2015 Informatics Capacity and Needs Assessment Survey, with a target population of all LHDs in the United States. A representative sample of 650 LHDs was drawn using a stratified random sampling design. A total of 324 completed responses were received (50% response rate). Qualitative data were used from a key informant interview study of LHD informatics staff from across the United States. Qualitative data were independently coded by 2 researchers and analyzed thematically. Survey data were cleaned, bivariate comparisons were conducted, and a multivariable logistic regression was run to characterize factors associated with interoperability. For 30% of LHDs, no systems were interoperable, and 38% of LHD respondents indicated some of the systems were interoperable. Significant determinants of interoperability included LHDs having leadership support (adjusted odds ratio [AOR] = 3.54), control of information technology budget allocation (AOR = 2.48), control of data systems (AOR = 2.31), having a strategic plan for information systems (AOR = 1.92), and existence of business process analysis and redesign (AOR = 1.49). Interoperability of all systems may be an informatics goal, but only a small proportion of LHDs reported having interoperable systems, pointing to a substantial need among LHDs nationwide.
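    The adjusted odds ratios (AORs) reported above are simply the exponentiated coefficients of the multivariable logistic regression. As a small illustration of that relationship (the coefficient values below are hypothetical, back-derived from the reported AORs rather than taken from the study's fitted model):

    ```python
    # A logistic regression coefficient beta maps to an odds ratio via exp(beta).
    # The betas below are hypothetical, chosen only to reproduce the AORs
    # quoted in the abstract (3.54, 2.48, 2.31).
    import math

    coefficients = {
        "leadership_support": math.log(3.54),
        "it_budget_control": math.log(2.48),
        "data_system_control": math.log(2.31),
    }

    def adjusted_odds_ratio(beta: float) -> float:
        """Convert a fitted logistic regression coefficient to an odds ratio."""
        return math.exp(beta)

    for factor, beta in coefficients.items():
        print(f"{factor}: AOR = {adjusted_odds_ratio(beta):.2f}")
    ```

    Reading it the other way round: an AOR of 3.54 for leadership support means the model estimates the odds of interoperability are about 3.5 times higher for LHDs with such support, holding the other covariates fixed.
    
    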

  16. National Ignition Facility, subsystem design requirements beam control and laser diagnostics SSDR 1.7

    Bliss, E.

    1996-01-01

    This Subsystem Design Requirement document is a development specification that establishes the performance, design, development, and test requirements for the Alignment subsystem (WBS 1.7.1), Beam Diagnostics (WBS 1.7.2), and the Wavefront Control subsystem (WBS 1.7.3) of the NIF Laser System (WBS 1.3). These three subsystems are collectively referred to as the Beam Control & Laser Diagnostics Subsystem. The NIF is a multi-pass, 192-beam, high-power, neodymium-glass laser that meets requirements set forth in the NIF SDR 002 (Laser System). 3 figs., 3 tabs

  17. Autonomous navigation - The ARMMS concept. [Autonomous Redundancy and Maintenance Management Subsystem

    Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.

    1984-01-01

    A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within ±3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.

  18. Systems and methods for an integrated electrical sub-system powered by wind energy

    Liu, Yan [Ballston Lake, NY; Garces, Luis Jose [Niskayuna, NY

    2008-06-24

    Various embodiments relate to systems and methods related to an integrated electrically-powered sub-system and wind power system including a wind power source, an electrically-powered sub-system coupled to and at least partially powered by the wind power source, the electrically-powered sub-system being coupled to the wind power source through power converters, and a supervisory controller coupled to the wind power source and the electrically-powered sub-system to monitor and manage the integrated electrically-powered sub-system and wind power system.

  19. Development of the CsI Calorimeter Subsystem for AMEGO

    Grove, J. Eric; Woolf, Richard; Johnson, W. Neil; Phlips, Bernard

    2018-01-01

    We report on the development of the thallium-doped cesium iodide (CsI:Tl) calorimeter subsystem for the All-Sky Medium-Energy Gamma-ray Observatory (AMEGO). The CsI calorimeter is one of the three main subsystems that comprise the AMEGO instrument suite; the others include the double-sided silicon strip detector (DSSD) tracker/converter and a cadmium zinc telluride (CZT) calorimeter. Similar to the LAT instrument on Fermi, the hodoscopic calorimeter consists of orthogonally layered CsI bars. Unlike the LAT, which uses PIN photodiodes, the scintillation light readout from each end of the CsI bar is done with recently developed large-area silicon photomultiplier (SiPM) arrays. We currently have an APRA program to develop the calorimeter technology for a larger, future space-based gamma-ray observatory. Under this program, we are building and testing a prototype calorimeter consisting of 24 CsI bars (16.7 mm x 16.7 mm x 100 mm) arranged in 4 layers with 6 bars per layer. The ends of each bar are read out with a 2 x 2 array of 6 mm x 6 mm SensL J series SiPMs. Signal readout and processing is done with the IDEAS SIPHRA (IDE3380) ASIC. Performance testing of this prototype will be done with laboratory sources, a beam test, and a balloon flight in conjunction with the other subsystems led by NASA GSFC. Additionally, we will test 16.7 mm x 16.7 mm x 450 mm CsI bars with SiPM readout to understand the performance of longer bars in advance of developing the full instrument. Acknowledgement: This work was sponsored by the Chief of Naval Research (CNR) and NASA-APRA (NNH15ZDA001N-APRA).

  20. RF communications subsystem for the Radiation Belt Storm Probes mission

    Srinivasan, Dipak K.; Artis, David; Baker, Ben; Stilwell, Robert; Wallis, Robert

    2009-12-01

    The NASA Radiation Belt Storm Probes (RBSP) mission, currently in Phase B, is a two-spacecraft, Earth-orbiting mission, which will launch in 2012. The spacecraft's S-band radio frequency (RF) telecommunications subsystem has three primary functions: provide spacecraft command capability, provide spacecraft telemetry and science data return, and provide accurate Doppler data for navigation. The primary communications link to the ground is via the Johns Hopkins University Applied Physics Laboratory's (JHU/APL) 18 m dish, with secondary links to the NASA 13 m Ground Network and the Tracking and Data Relay Spacecraft System (TDRSS) in single-access mode. The on-board RF subsystem features the APL-built coherent transceiver and in-house builds of a solid-state power amplifier and conical bifilar helix broad-beam antennas. The coherent transceiver provides coherency digitally, and controls the downlink data rate and encoding within its field-programmable gate array (FPGA). The transceiver also provides a critical command decoder (CCD) function, which is used to protect against box-level upsets in the C&DH subsystem. Because RBSP is a spin-stabilized mission, the antennas must be symmetric about the spin axis. Two broad-beam antennas point along both ends of the spin axis, providing communication coverage from boresight to 70°. An RF splitter excites both antennas; therefore, the mission is designed such that no communications are required close to 90° from the spin axis due to the interferometer effect from the two antennas. To maximize the total downlink volume from the spacecraft, the CCSDS File Delivery Protocol (CFDP) has been baselined for the RBSP mission. During real-time ground contacts with the APL ground station, downlinked files are checked for errors. Handshaking between flight and ground CFDP software results in requests to retransmit only the file fragments lost due to dropouts. 
This allows minimization of RF link margins, thereby maximizing data rate and
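    The CFDP handshaking described above retransmits only the file fragments lost to dropouts. A hedged sketch of the receiver-side bookkeeping behind that idea (a simplified model of selective retransmission, not the CCSDS flight or ground implementation):

    ```python
    # Simplified model of CFDP-style selective retransmission: the receiver
    # records which byte ranges of a file arrived intact, then requests only
    # the missing gaps (a NAK list) rather than the whole file.
    def missing_segments(received, file_size):
        """Given received (start, end) byte ranges, return the gaps to request."""
        gaps, cursor = [], 0
        for start, end in sorted(received):
            if start > cursor:
                gaps.append((cursor, start))   # a dropout before this segment
            cursor = max(cursor, end)
        if cursor < file_size:
            gaps.append((cursor, file_size))   # the file tail never arrived
        return gaps

    # A dropout lost bytes 300-500 and the tail of a 1000-byte file:
    print(missing_segments([(0, 300), (500, 900)], 1000))  # [(300, 500), (900, 1000)]
    ```

    Because only the gaps cross the link twice, link margins can be kept tight without risking a full-file retransmission after every dropout, which is exactly the data-volume argument the abstract makes.
    
    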

  1. Electric and hybrid vehicle environmental control subsystem study

    Heitner, K. L.

    1980-01-01

    An environmental control subsystem (ECS) in electric and hybrid vehicles is studied. A combination of a combustion heater and gasoline engine (Otto cycle) driven vapor compression air conditioner is selected. The combustion heater, the small gasoline engine, and the vapor compression air conditioner are commercially available. These technologies have good cost and performance characteristics. The cost for this ECS is relatively close to the cost of current ECS's. Its effect on the vehicle's propulsion battery is minimal and the ECS size and weight do not have significant impact on the vehicle's range.

  2. Electric and hybrid vehicles environmental control subsystem study

    1981-01-01

    An environmental control subsystem (ECS) in the passenger compartment of electric and hybrid vehicles is studied. Various methods of obtaining the desired temperature control for the battery pack are also studied. The functional requirements of ECS equipment are defined. Following categorization by methodology, technology availability and risk, all viable ECS concepts are evaluated. Each is assessed independently for benefits versus risk, as well as for its feasibility for short-, intermediate- and long-term product development. Selection of the preferred concept is made against these requirements, as well as the study's major goal of providing safe, highly efficient and thermally comfortable ECS equipment.

  3. The New York Public Library Automated Book Catalog Subsystem

    S. Michael Malinconico

    1973-03-01

    A comprehensive automated bibliographic control system has been developed by the New York Public Library. This system is unique in its use of an automated authority system and highly sophisticated machine filing algorithms. The primary aim was the rigorous control of established forms and their cross-reference structure. The original impetus for creation of the system, and its most highly visible product, is a photocomposed book catalog. The book catalog subsystem supplies automatic punctuation of condensed entries and can produce cumulation/supplement book catalogs in installments without loss of control of the cross-referencing structure.

  4. Recent developments for the Large Binocular Telescope Guiding Control Subsystem

    Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.

    2014-07-01

    The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both direct and bent Gregorian focal stations. Recent additions of focal stations for PEPSI and MODS instruments doubled the number of focal stations in use including respective motion, camera controller server computers, and software infrastructure communicating with Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. This paper also discusses the current GCS status and reviews potential upgrades to further improve its performance.

  5. Architecture of the software for LAMOST fiber positioning subsystem

    Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin

    2004-09-01

    The architecture of the software which controls the LAMOST fiber positioning subsystem is described. The software is composed of two parts: a main control program on a computer and a unit controller program in the ROM of an MCS51 single-chip microcomputer. The software's functions include Client/Server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, SOCKET programming, Microsoft Windows message response, and serial communications are also discussed.
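    The abstract's pairing of a Client/Server model with SOCKET programming and multiple threads is a classic control-software pattern. A minimal, self-contained sketch of that pattern (the command string and ACK framing are illustrative assumptions, not LAMOST's actual protocol):

    ```python
    # Minimal client/server socket pattern: a threaded server accepts one
    # connection and acknowledges a command, as a control program might
    # acknowledge a fiber-positioning request. Protocol details are invented.
    import socket
    import threading

    def serve_once(server_sock):
        """Accept a single connection and echo the command back with an ACK."""
        conn, _ = server_sock.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"ACK:" + data)

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    worker = threading.Thread(target=serve_once, args=(server,))
    worker.start()

    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(b"MOVE fiber=17 x=1.25 y=-0.80")
        reply = client.recv(1024)

    worker.join()
    server.close()
    print(reply.decode())
    ```

    The same shape scales to the real system's needs: one listener thread per unit controller, with the main control program issuing commands and collecting acknowledgements over persistent connections.
    
    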

  6. Analysis of subsystems in wavelength-division-multiplexing networks

    Liu, Fenghai

    2001-01-01

    Wavelength division multiplexing (WDM) technology together with optical amplification has created a new era for optical communication. Transmission capacity is greatly increased by adding more and more wavelength channels into a single fiber, as well as by increasing the line rate of each channel...... in semiconductor optical amplifiers (SOAs), and dispersion managed fiber sections. New subsystems are also proposed in the thesis: a modular 2×2 multiwavelength cross-connect using wavelength switching blocks, a wavelength converter based on cross phase modulation in a semiconductor modulator, a wavelength...

  7. Information measuring subsystem oil pumping station “Parabel”

    Nyashina Galina S.

    2014-01-01

    The information-measuring subsystem of the oil pumping station (OPS) “Parabel”, located on the site of the main pipeline “Alexandrov-Anzhero” (OJSC “AK Transneft”), is described. It was developed on the basis of modern microprocessor equipment and automation, as well as high-speed digital data channels. This simple solution meets the requirements set out in the guidance document “Automation and remote control of trunk pipelines. General provisions” (RD-35.240.0000-KTN-207-08).

  8. Reactor Subsystem Simulation for Nuclear Hybrid Energy Systems

    Shannon Bragg-Sitton; J. Michael Doster; Alan Rominger

    2012-09-01

    Preliminary system models have been developed by Idaho National Laboratory researchers and are currently being enhanced to assess integrated system performance given multiple sources (e.g., nuclear + wind) and multiple applications (i.e., electricity + process heat). Initial efforts to integrate a Fortran-based simulation of a small modular reactor (SMR) with the balance of plant model have been completed in FY12. This initial effort takes advantage of an existing SMR model developed at North Carolina State University to provide initial integrated system simulation for a relatively low cost. The SMR subsystem simulation details are discussed in this report.

  9. Data Management Applications for the Service Preparation Subsystem

    Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.

    2009-01-01

    These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.

  10. The heterogeneous response method in slab geometry

    Villarino, E.A.; Stamm'ler, R.J.J.

    1984-01-01

    The heterogeneous response method (HRM) has been developed to calculate the multigroup flux in a heterogeneous system, e.g. a fuel assembly, without having to resort to dubious homogenization recipes. Here, the method is described in slab geometry in a manner that facilitates its computerization. By dividing the system into subsystems or nodes, say pin cells, two levels of calculation are created, which define a set of local problems and a global problem, respectively. In the local problem, collision probabilities are used to obtain for a node in vacuum, its response fluxes caused by sources and in-currents. They preserve the heterogeneous character of the node. In the global problem, the nodes are coupled by cosine currents. A suitable transformation reduces the number of two unknown currents per interface to one unknown per node, its total transmitted in-current. The global equation system thus becomes a set of three-point relations, which can be solved efficiently. In cases typical of fuel-assembly situations, the HRM produces fluxes that compare very well with the direct solution of the entire system by collision probabilities, though at a fraction of the computer cost. Extension of the method to 2- and 3-D systems is discussed. (author)
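    The global problem's three-point relations form a tridiagonal linear system, which is why the HRM global solve is cheap: such systems are solved in O(n) by the Thomas algorithm. A generic sketch of that solver (a standard numerical routine, not the HRM code itself):

    ```python
    # Three-point relations couple each node only to its two neighbours, so the
    # global system A x = d is tridiagonal and solvable in O(n) by the Thomas
    # algorithm (forward elimination, then back substitution).
    def thomas_solve(a, b, c, d):
        """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            denom = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / denom if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Example: main diagonal 2, off-diagonals -1, right-hand side [1, 0, 1].
    print(thomas_solve([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 0, 1]))
    ```

    The contrast with solving the entire heterogeneous system directly by collision probabilities (dense coupling between all regions) is what gives the HRM its "fraction of the computer cost".
    
    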

  11. An application of ETICS Co-Scheduling Mechanism to Interoperability and Compliance Validation of Grid Services

    Ronchieri, Elisabetta; Diez-andino Sancho, Guillermo; DI Meglio, Alberto; Marzolla, Moreno

    2008-01-01

    Grid software projects require infrastructures in order to evaluate interoperability with other projects and compliance with predefined standards. Interoperability and compliance are quality attributes that are expected from all distributed projects. ETICS is designed to automate the investigation of such problems. It integrates well-established procedures, tools and resources in a coherent framework and adapts them to the special needs of these projects. Interoperability and compliance to standards are important quality attributes of software developed for Grid environments where many different parts of an interconnected system have to interact. Compliance to standards is one of the major factors in making sure that interoperating parts of a distributed system can actually interconnect and exchange information. Taking the case of the Grid environment (Foster and Kesselman, 2003), most of the projects that are developing software have not reached the maturity level of other communities yet and have di...

  12. Rich services in interoperable Learning Designs: can the circle be squared?

    Griffiths, David

    2009-01-01

    Griffiths, D. (2009). Rich services in interoperable Learning Designs: Can the circle be squared?. Presented at Opening Up Learning Design, European LAMS and Learning Design Conference 2009. July, 6-9, 2009, Milton Keynes, United Kingdom.

  13. Mediation, Alignment, and Information Services for Semantic interoperability (MAISSI): A Trade Study

    Barlos, Fotis; Hunter, Dan; Krikeles, Basil; McDonough, James

    2007-01-01

    .... Semantic Interoperability (SI) encompasses a broad range of technologies such as data mediation and schema matching, ontology alignment, and context representation that attempt to enable systems to understand each other's semantics...

  14. Interoperability and future internet for next generation enterprises - editorial and state of the art

    van Sinderen, Marten J.; Johnson, Pontus; Doumeingts, Guy

    2013-01-01

    Today’s global markets drive enterprises towards closer collaboration with customers, suppliers and partners. Interoperability problems constitute fundamental barriers to such collaboration. A characteristic of modern economic life is the requirement for continuous and rapid change and innovation.

  15. Public Key Infrastructure (PKI) Interoperability: A Security Services Approach to Support Transfer of Trust

    Hansen, Anthony

    1999-01-01

    .... This thesis defines interoperability as the capacity to support trust through retention of security services across PKI domains at a defined level of assurance and examines the elements of PKI...

  16. Analysis of Jordan's Proposed Emergency Communication Interoperability Plan (JECIP) for Disaster Response

    Alzaghal, Mohamad H

    2008-01-01

    ... country. It is essential to build a robust and interoperable Information and Communication Technology (ICT) infrastructure before a disaster strikes, which will facilitate patching, restoring, and reconstructing it during and after the disaster...

  17. Interoperability requirements for a South African joint command and control test facility

    Le Roux, WH

    2008-06-01

    ... approach is followed to provide all the necessary services, mechanisms and functionalities. Since simulations and simulators form part of such a facility, interoperability standards are very important, as is the underlying data model. The high...

  18. 78 FR 50075 - Statewide Communication Interoperability Plan Template and Annual Progress Report

    2013-08-16

    ... Collection Request should be forwarded to DHS/NPPD/CS&C/OEC, 245 Murray Lane SW., Mail Stop 0640, Arlington... will assist states in their strategic planning for interoperable and emergency communications while...

  19. Revealing electronic open quantum systems with subsystem TDDFT

    Krishtal, Alisa; Pavanello, Michele

    2016-03-01

    Open quantum systems (OQSs) are perhaps the most realistic systems one can approach through simulations. In recent years, describing OQSs with Density Functional Theory (DFT) has been a prominent avenue of research with most approaches based on a density matrix partitioning in conjunction with an ad-hoc description of system-bath interactions. We propose a different theoretical approach to OQSs based on partitioning of the electron density. Employing the machinery of subsystem DFT (and its time-dependent extension), we provide a novel way of isolating and analyzing the various terms contributing to the coupling between the system and the surrounding bath. To illustrate the theory, we provide numerical simulations on a toy system (a molecular dimer) and on a condensed phase system (solvated excimer). The simulations show that non-Markovian dynamics in the electronic system-bath interactions are important in chemical applications. For instance, we show that the superexchange mechanism of transport in donor-bridge-acceptor systems is a non-Markovian interaction between the donor-acceptor (OQS) with the bridge (bath) which is fully characterized by real-time subsystem time-dependent DFT.
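
For orientation, the density partitioning at the heart of subsystem DFT can be sketched as follows. This is the standard frozen-density-embedding formulation stated from general knowledge, not equations taken from the paper itself:

```latex
% Total density as a sum of subsystem densities
\rho(\mathbf{r}) = \sum_{I=1}^{N_S} \rho_I(\mathbf{r})

% Kohn-Sham-like equations for each subsystem, coupled through an
% embedding potential
\left[ -\tfrac{1}{2}\nabla^2 + v_{\mathrm{KS}}^{I}(\mathbf{r})
     + v_{\mathrm{emb}}^{I}(\mathbf{r}) \right] \phi_k^{I}(\mathbf{r})
  = \epsilon_k^{I}\, \phi_k^{I}(\mathbf{r})

% The embedding potential carries the system-bath coupling terms
v_{\mathrm{emb}}^{I}(\mathbf{r})
  = \frac{\delta T_s^{\mathrm{nadd}}[\{\rho_J\}]}{\delta \rho_I(\mathbf{r})}
  + \frac{\delta E_{xc}^{\mathrm{nadd}}[\{\rho_J\}]}{\delta \rho_I(\mathbf{r})}
  + \sum_{J \neq I} \left( v_{\mathrm{ext}}^{J}(\mathbf{r})
  + \int \frac{\rho_J(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\, d\mathbf{r}' \right)
```

In the time-dependent extension, propagating each subsystem density under this coupling is what allows the terms contributing to the system-bath interaction to be isolated and analyzed.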

  20. Lunar Advanced Volatile Analysis Subsystem: Pressure Transducer Trade Study

    Kang, Edward Shinuk

    2017-01-01

    In Situ Resource Utilization (ISRU) is a key factor in paving the way for the future of human space exploration. The ability to harvest resources on foreign astronomical objects to produce consumables and propellant offers potential reduction in mission cost and risk. Through previous missions, the existence of water ice at the poles of the moon has been identified, however the feasibility of water extraction for resources remains unanswered. The Resource Prospector (RP) mission is currently in development to provide ground truth, and will enable us to characterize the distribution of water at one of the lunar poles. Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) is the primary payload on RP that will be used in conjunction with a rover. RESOLVE contains multiple instruments for systematically identifying the presence of water. The main process involves the use of two systems within RESOLVE: the Oxygen Volatile Extraction Node (OVEN) and Lunar Advanced Volatile Analysis (LAVA). Within the LAVA subsystem, there are multiple calculations that depend on accurate pressure readings. One of the most important instances where pressure transducers (PT) are used is for calculating the number of moles in a gas transfer from the OVEN subsystem. As a critical component of the main process, a mixture of custom and commercial off the shelf (COTS) PTs are currently being tested in the expected operating environment to eventually down select an option for integrated testing in the LAVA engineering test unit (ETU).
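
The mole calculation that makes accurate pressure readings critical is, in its simplest form, the ideal gas law n = PV/(RT). A minimal sketch follows; this is illustrative only, not the actual LAVA flight software, which would apply calibration and non-ideal-gas corrections:

```python
# Estimate moles transferred between subsystems from pressure readings,
# using the ideal gas law. Illustrative sketch only.

R = 8.314462618  # universal gas constant, J/(mol*K)

def moles_from_pt(pressure_pa: float, volume_m3: float, temp_k: float) -> float:
    """n = PV / (RT) for an ideal gas."""
    if temp_k <= 0:
        raise ValueError("temperature must be positive (kelvin)")
    return pressure_pa * volume_m3 / (R * temp_k)

def moles_transferred(p_before: float, p_after: float,
                      volume_m3: float, temp_k: float) -> float:
    """Moles removed from a fixed volume, inferred from the pressure drop
    at constant temperature."""
    return moles_from_pt(p_before - p_after, volume_m3, temp_k)

# Example: an 50 kPa pressure drop in a hypothetical 0.5 L manifold at 300 K
n = moles_transferred(80_000.0, 30_000.0, 0.0005, 300.0)
```

A quick sanity check: one mole of ideal gas at 101.325 kPa and 273.15 K occupies about 22.4 L, which the first function reproduces.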

  1. Principles of control for decoherence-free subsystems.

    Cappellaro, P; Hodges, J S; Havel, T F; Cory, D G

    2006-07-28

    Decoherence-free subsystems (DFSs) are a powerful means of protecting quantum information against noise with known symmetry properties. Although Hamiltonians that can implement a universal set of logic gates on DFS encoded qubits without ever leaving the protected subsystem theoretically exist, the natural Hamiltonians that are available in specific implementations do not necessarily have this property. Here we describe some of the principles that can be used in such cases to operate on encoded qubits without losing the protection offered by the DFSs. In particular, we show how dynamical decoupling can be used to control decoherence during the unavoidable excursions outside of the DFS. By means of cumulant expansions, we show how the fidelity of quantum gates implemented by this method on a simple two physical qubit DFS depends on the correlation time of the noise responsible for decoherence. We further show by means of numerical simulations how our previously introduced "strongly modulating pulses" for NMR quantum information processing can permit high-fidelity operations on multiple DFS encoded qubits in practice, provided that the rate at which the system can be modulated is fast compared to the correlation time of the noise. The principles thereby illustrated are expected to be broadly applicable to many implementations of quantum information processors based on DFS encoded qubits.
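
The "simple two physical qubit DFS" mentioned in the abstract is commonly the collective-dephasing code. As a standard textbook illustration (not reproduced verbatim from the paper):

```latex
% Logical qubit encoded in two physical qubits
|0_L\rangle = |01\rangle, \qquad |1_L\rangle = |10\rangle

% Collective dephasing acts identically on both physical qubits:
U(\phi) = e^{-i\phi\,(\sigma_z^{(1)} + \sigma_z^{(2)})/2}

% Both logical basis states are eigenstates of the collective operator
% with eigenvalue zero, so the encoded qubit acquires no relative phase:
(\sigma_z^{(1)} + \sigma_z^{(2)})\,|01\rangle = 0, \qquad
(\sigma_z^{(1)} + \sigma_z^{(2)})\,|10\rangle = 0
```

Gates whose generators leave this subspace expose the encoded information to the noise, which is exactly the excursion problem the dynamical decoupling techniques above are designed to control.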

  3. Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)

    Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.

    2013-12-01

    Seagrass meadows (marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe is priced at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138-1128 billion €, or 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the above- and belowground compartments in tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by the University Ca' Foscari of Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot, phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a Habitat Suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the expected average coverage). The presentation discusses such a framework, explaining how the input data are discovered, accessed and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize results

  4. The National Flood Interoperability Experiment: Bridging Research and Operations

    Salas, F. R.

    2015-12-01

    The National Weather Service's new National Water Center, located on the University of Alabama campus in Tuscaloosa, will become the nation's hub for comprehensive water resources forecasting. In conjunction with its federal partners, the US Geological Survey, Army Corps of Engineers and Federal Emergency Management Agency, the National Weather Service will operationally support both short-term flood prediction and long-term seasonal forecasting of water resource conditions. By summer 2016, the National Water Center will begin evaluating four streamflow data products at the scale of the NHDPlus river reaches (approximately 2.67 million). In preparation for the release of these products, from September 2014 to August 2015 the National Weather Service partnered with the Consortium of Universities for the Advancement of Hydrologic Science, Inc. to support the National Flood Interoperability Experiment, which included a seven-week in-residence Summer Institute in Tuscaloosa for university students interested in learning about operational hydrology and flood forecasting. As part of the experiment, 15-hour forecasts from the operational High Resolution Rapid Refresh atmospheric model were used to drive a three-kilometer Noah-MP land surface model loosely coupled to a RAPID river routing model operating on the NHDPlus dataset. This workflow was run every three hours during the Summer Institute, and the results were made available to participants pursuing a range of research topics focused on flood forecasting (e.g., reservoir operations, ensemble forecasting, probabilistic flood inundation mapping, rainfall product evaluation). Although the National Flood Interoperability Experiment was finite in length, it provided a platform through which the academic community could engage federal agencies, and vice versa, to narrow the gap between research and operations and demonstrate how state-of-the-art research infrastructure, models, services, datasets etc. could be utilized

  5. Author identities an interoperability problem solved by a collaborative solution

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is crowded, and the right choice is becoming more and more complicated. Even though more than 15 different systems are available, some are still under development and proposed to launch by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute, at the scale of a scientific site including a university with a student education program, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with researcher identities is the rather frequent change of positions during a scientist's life. The required system needed to already contain preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author-ID marketplace showed that there is a high risk of additional workload for the researchers themselves or for the administration, because individuals need to register an ID for themselves or the chosen registry is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they already have high-quality catalogs of person identities available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to match 60% of all scientists on the first pass. The additional advantage is that librarians can finalize the identity system in a kind of background process.
The Kiel Data Management Infrastructure initiated a web service
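
The matching step described above, linking local author records to an external identity catalog, can be sketched as follows. This is a toy illustration with invented data and a deliberately naive normalization rule, not the actual Kiel or VIAF service:

```python
# Toy author-identity matching: link local author names to catalog
# identifiers. Data and identifiers are invented; real systems (e.g. VIAF)
# match on much richer bibliographic evidence than name strings.

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and sort name parts so that
    'Doe, Jane' and 'Jane Doe' compare equal."""
    parts = name.replace(",", " ").replace(".", " ").lower().split()
    return " ".join(sorted(parts))

# Hypothetical identity catalog: normalized name -> identifier
catalog = {
    normalize("Doe, Jane"): "ID:0001",
    normalize("Miller, John A."): "ID:0002",
}

def match_author(local_name: str):
    """Return the catalog identifier for a local name, or None."""
    return catalog.get(normalize(local_name))

matched = match_author("Jane Doe")         # found despite different order
unmatched = match_author("Erika Example")  # not in the catalog
```

Unmatched names are exactly the residue that, per the abstract, librarians can resolve in a background process.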

  6. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    González, Laura; Echevarría, Andrés; Morales, Dahiana; Ruggia, Raúl

    2016-01-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have ...

  7. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Wright, D. J.; Sankaran, S.

    2015-12-01

    In the geosciences, interoperability is a fundamental requirement. Members of standards organizations such as the OGC and ISO-TC 211 have done yeoman service promoting a standards-centric approach to the interoperability challenges that organizations face today. The specific challenges organizations face when adopting interoperability patterns are many. One approach, mandating the use of specific standards, has been reasonably successful. But scientific communities, like all others, ultimately want their solutions to be widely accepted and used, and to this end there is a pressing need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that can take a long time to come to fruition and can therefore seem to fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should adopt the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms, are built on fundamentally "open" principles, and align with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience.
The path to tread is not new: the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted

  8. Collaborative ocean resource interoperability - multi-use of ocean data on the semantic web

    Tao, Feng; Campbell, Jon; Pagnani, Maureen; Griffiths, Gwyn

    2009-01-01

    Earth Observations (EO) collect various characteristics of the objective environment using sensors which often have different measuring, spatial and temporal coverage. Making individual observational data interoperable becomes equally important when viewed in the context of its expensive and time-consuming EO operations. Interoperability will improve reusability of existing observations in both the broader context, and with other observations. As a demonstration of the potential offered by se...

  9. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  10. The development of the intrinsic functional connectivity of default network subsystems from age 3 to 5.

    Xiao, Yaqiong; Zhai, Hongchang; Friederici, Angela D; Jia, Fucang

    2016-03-01

    In recent years, research on human functional brain imaging using resting-state fMRI techniques has become increasingly prevalent. The term "default mode" was proposed to describe a baseline or default state of the brain during rest. Recent studies suggested that the default mode network (DMN) is comprised of two functionally distinct subsystems: a dorsal-medial prefrontal cortex (DMPFC) subsystem involved in self-oriented cognition (i.e., theory of mind) and a medial temporal lobe (MTL) subsystem engaged in memory and scene construction; both subsystems interact with the anterior medial prefrontal cortex (aMPFC) and posterior cingulate cortex (PCC) as the core regions of the DMN. The present study explored the development of the DMN core regions and these two subsystems in both hemispheres in 3- to 5-year-old children. The analysis of the intrinsic activity showed strong developmental changes in both subsystems, and significant changes were specifically found in the MTL subsystem, but not in the DMPFC subsystem, implying distinct developmental trajectories for the DMN subsystems. We found stronger interactions between the DMPFC and MTL subsystems in 5-year-olds, particularly in the left subsystems that support the development of environmental adaptation and relatively complex mental activities. These results also indicate that there is stronger right-hemispheric lateralization at age 3, which then changes as bilateral development gradually increases through to age 5, suggesting that hemispheric dominance in DMN subsystems changes with age. The present results provide primary evidence for the development of DMN subsystems in early life, which might be closely related to the development of social cognition in childhood.

  11. People-Technology-Ecosystem Integration: A Framework to Ensure Regional Interoperability for Safety, Sustainability, and Resilience of Interdependent Energy, Water, and Seafood Sources in the (Persian) Gulf.

    Meshkati, Najmedin; Tabibzadeh, Maryam; Farshid, Ali; Rahimi, Mansour; Alhanaee, Ghena

    2016-02-01

    The aim of this study is to identify the interdependencies of human and organizational subsystems of multiple complex, safety-sensitive technological systems and their interoperability in the context of sustainability and resilience of an ecosystem. Recent technological disasters with severe environmental impact are attributed to human factors and safety culture causes. One of the most populous and environmentally sensitive regions in the world, the (Persian) Gulf, is on the confluence of an exponentially growing number of two industries--nuclear power and seawater desalination plants--that is changing its land- and seascape. Building upon Rasmussen's model, a macrosystem integrative framework, based on the broader context of human factors, is developed, which can be considered in this context as a "meta-ergonomics" paradigm, for the analysis of interactions, design of interoperability, and integration of decisions of major actors whose actions can affect safety and sustainability of the focused industries during routine and nonroutine (emergency) operations. Based on the emerging realities in the Gulf region, it is concluded that without such systematic approach toward addressing the interdependencies of water and energy sources, sustainability will be only a short-lived dream and prosperity will be a disappearing mirage for millions of people in the region. This multilayered framework for the integration of people, technology, and ecosystem--which has been applied to the (Persian) Gulf--offers a viable and vital approach to the design and operation of large-scale complex systems wherever the nexus of water, energy, and food sources are concerned, such as the Black Sea. © 2016, Human Factors and Ergonomics Society.

  12. [Lessons learned in the implementation of interoperable National Health Information Systems: a systematic review].

    Ovies-Bernal, Diana Paola; Agudelo-Londoño, Sandra M

    2014-01-01

    Identify shared criteria used throughout the world in the implementation of interoperable National Health Information Systems (NHIS) and provide validated scientific information on the dimensions affecting interoperability. This systematic review sought to identify primary articles on the implementation of interoperable NHIS published in scientific journals in English, Portuguese, or Spanish between 1990 and 2011 through a search of eight databases of electronic journals in the health sciences and informatics: MEDLINE (PubMed), Proquest, Ovid, EBSCO, MD Consult, Virtual Health Library, Metapress, and SciELO. The full texts of the articles were reviewed, and those that focused on technical computer aspects or on normative issues were excluded, as were those that did not meet the quality criteria for systematic reviews of interventions. Of 291 studies found and reviewed, only five met the inclusion criteria. These articles reported on the process of implementing an interoperable NHIS in Brazil, China, the United States, Turkey, and the Semiautonomous Region of Zanzibar, respectively. Five common basic criteria affecting implementation of the NHIS were identified: standards in place to govern the process, availability of trained human talent, financial and structural constraints, definition of standards, and assurance that the information is secure. Four dimensions affecting interoperability were defined: technical, semantic, legal, and organizational. The criteria identified have to be adapted to the actual situation in each country, and a proactive approach should be used to ensure that implementation of the interoperable NHIS is strategic, simple, and reliable.

  13. Reflections on the role of open source in health information system interoperability.

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  14. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to interoperability problems. In response, the problem of systems' interoperability is revisited by taking into account different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. The capability to interoperate, as a property of the system, is then defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumed to be a message from any other system), make an informed decision about this perception and, consequently, articulate a meaningful and useful action or response based on this decision. Although this capability is defined on the basis of existing interoperability theories, the proposed approach to its definition excludes the assumption of awareness of the co-existence of the two interoperating systems. It thereby establishes links between the research on interoperability of systems and on intelligent software agents, as one of the systems' digital identities.
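
The sense-perceive-decide-act capability described above can be sketched as a minimal interface. All names here are hypothetical illustrations of the paper's anthropomorphic framing, not an implementation from the paper:

```python
# Minimal sketch of the anthropomorphic interoperability capability:
# a system perceives a stimulus (a message from another system), makes a
# decision about it, and articulates a response. Names are hypothetical.

import json
from abc import ABC, abstractmethod

class InteroperableSystem(ABC):
    def handle(self, stimulus: str) -> str:
        percept = self.perceive(stimulus)   # sense and interpret the message
        decision = self.decide(percept)     # make an informed decision
        return self.act(decision)           # articulate a useful response

    @abstractmethod
    def perceive(self, stimulus: str) -> dict: ...
    @abstractmethod
    def decide(self, percept: dict) -> dict: ...
    @abstractmethod
    def act(self, decision: dict) -> str: ...

class ThermostatAgent(InteroperableSystem):
    """Toy agent reacting to a JSON temperature reading from another system."""
    def perceive(self, stimulus: str) -> dict:
        return json.loads(stimulus)
    def decide(self, percept: dict) -> dict:
        return {"heater": "on" if percept["temp_c"] < 18.0 else "off"}
    def act(self, decision: dict) -> str:
        return json.dumps(decision)

agent = ThermostatAgent()
reply = agent.handle('{"temp_c": 15.5}')
```

Note that the agent never assumes which system sent the stimulus, mirroring the paper's point that awareness of the co-existing system is not presupposed.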

  15. A federated semantic metadata registry framework for enabling interoperability across clinical research and care domains.

    Sinaci, A Anil; Laleci Erturkmen, Gokce B

    2013-10-01

    In order to enable secondary use of Electronic Health Records (EHRs) by bridging the interoperability gap between the clinical care and research domains, in this paper a unified methodology and the supporting framework are introduced, bringing together the power of metadata registries (MDR) and semantic web technologies. We introduce a federated semantic metadata registry framework by extending the ISO/IEC 11179 standard, and enable integration of data element registries through Linked Open Data (LOD) principles, whereby each Common Data Element (CDE) can be uniquely referenced, queried and processed to enable syntactic and semantic interoperability. Each CDE and its components are maintained as LOD resources, enabling semantic links with other CDEs, terminology systems and implementation-dependent content models; this facilitates semantic search, more effective reuse and semantic interoperability across different application domains. There are several important efforts addressing semantic interoperability in the healthcare domain, such as the IHE DEX profile proposal, CDISC SHARE and CDISC2RDF. Our architecture complements these by providing a framework to interlink existing data element registries and repositories, multiplying their potential for semantic interoperability to a greater extent. The open source implementation of the federated semantic MDR framework presented in this paper is the core of the semantic interoperability layer of the SALUS project, which enables the execution of post-marketing safety analysis studies on top of existing EHR systems. Copyright © 2013 Elsevier Inc. All rights reserved.
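
The idea of exposing each Common Data Element as a uniquely referenced, queryable linked-data resource can be illustrated with a toy triple store. Plain tuples stand in for RDF, and all URIs and element names are invented for illustration:

```python
# Toy Linked-Data view of Common Data Elements (CDEs): each element is a
# URI-identified resource described by (subject, predicate, object) triples.
# URIs, predicates and element names below are invented examples.

triples = {
    ("http://ex.org/cde/42", "rdfs:label", "Systolic blood pressure"),
    ("http://ex.org/cde/42", "ex:unit", "mmHg"),
    ("http://ex.org/cde/42", "skos:exactMatch", "http://other.org/cde/bp-sys"),
    ("http://ex.org/cde/43", "rdfs:label", "Heart rate"),
}

def query(s=None, p=None, o=None):
    """Pattern-match triples; None is a wildcard (a SPARQL-like lookup)."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# All facts about CDE 42, and its cross-registry equivalent element:
facts = query(s="http://ex.org/cde/42")
equiv = query(s="http://ex.org/cde/42", p="skos:exactMatch")
```

The `skos:exactMatch` link is the semantic glue: it lets one registry's element be resolved to its counterpart in a federated registry without merging the registries themselves.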

  16. An approach to define semantics for BPM systems interoperability

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  17. Adaptation of interoperability standards for cross domain usage

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, the challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural as well as historical influences. Most of the time, information is produced and stored digitally, and one of the biggest challenges is to extract relevant, readable information applicable to a specific problem from a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are familiar to other domains (e.g., ecology or medicine) as well, and domain-specific solutions such as dedicated standards have been developed for them. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is making civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis creates the need to cooperate quickly with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  18. An open, interoperable, and scalable prehospital information technology network architecture.

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  19. Health level seven interoperability strategy: big data, incrementally structured.

    Dolin, R H; Rogers, B; Jaffe, C

    2015-01-01

    Describe how the HL7 Clinical Document Architecture (CDA), a foundational standard in US Meaningful Use, contributes to a "big data, incrementally structured" interoperability strategy, whereby data structured incrementally gets large amounts of data flowing faster. We present cases showing how this approach is leveraged for big data analysis. To support the assertion that semi-structured narrative in CDA format can be a useful adjunct in an overall big data analytic approach, we present two case studies. The first assesses an organization's ability to generate clinical quality reports using coded data alone vs. coded data supplemented by CDA narrative. The second leverages CDA to construct a network model for referral management, from which additional observations can be gleaned. The first case shows that coded data supplemented by CDA narrative resulted in significant variances in calculated performance scores. In the second case, we found that the constructed network model enables the identification of differences in patient characteristics among different referral work flows. The CDA approach goes after data indirectly, by focusing first on the flow of narrative, which is then incrementally structured. A quantitative assessment of whether this approach will lead to a greater flow of data and ultimately a greater flow of structured data vs. other approaches is planned as a future exercise. Along with growing adoption of CDA, we are now seeing the big data community explore the standard, particularly given its potential to supply analytic engines with volumes of data previously not possible.
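    The "incrementally structured" idea can be illustrated in a few lines: a CDA-like section carries human-readable narrative alongside optional coded entries, so analytics can use codes where present and fall back to narrative text. The XML below is a simplified, hypothetical fragment, not a conformant CDA document.

```python
# Sketch: one section holding both a structured code and unstructured narrative.
import xml.etree.ElementTree as ET

doc = """
<section>
  <code code="8716-3" codeSystem="2.16.840.1.113883.6.1"/>
  <text>Blood pressure 142/90, patient reports mild headache.</text>
</section>
"""

section = ET.fromstring(doc)
coded = section.find("code").get("code")       # structured part, usable now
narrative = section.find("text").text.strip()  # narrative, structured later
```

    Downstream processing can mine the narrative over time without blocking the initial flow of documents.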

  20. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.

  1. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
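    As a hedged sketch of the complex-domain indices mentioned above, two standard time-domain HRV measures (SDNN and RMSSD) can be computed from RR intervals in milliseconds. The data and any thresholds one might apply are illustrative only, not the article's actual risk criteria.

```python
# Two classical time-domain HRV indices over a synthetic RR series.
from math import sqrt
from statistics import stdev

def sdnn(rr):
    """Standard deviation of normal-to-normal intervals (sample stdev)."""
    return stdev(rr)

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

rr_ms = [800, 810, 790, 820, 805]  # synthetic RR intervals in ms
print(round(sdnn(rr_ms), 2), round(rmssd(rr_ms), 2))  # → 11.18 20.16
```

    A risk-stratification subdomain would compute such indices over long Holter recordings and compare them against clinically validated thresholds.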

  2. Interoperability In The New Planetary Science Archive (PSA)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly being used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through third-party tools, for example those of the NASA Planetary Data System (PDS), the VESPA project of the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA consists of a Geoserver (an open-source map server), the goal of which is to support use cases such as the distribution of search results, sharing and processing data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats like Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to be able to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
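    The OGC access pattern described above can be sketched by assembling a WFS GetFeature request. The endpoint and layer name below are placeholders, not the PSA's real service addresses.

```python
# Sketch: building an OGC WFS 2.0 GetFeature request URL for an archive
# service. Endpoint and typeNames are hypothetical.
from urllib.parse import urlencode

base = "https://example.org/geoserver/wfs"  # hypothetical endpoint
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:footprints",          # hypothetical layer
    "outputFormat": "application/json",     # one of several output formats
    "count": 10,
}
url = base + "?" + urlencode(params)
```

    Swapping `outputFormat` for KML or CSV is how the same service feeds tools like Google Mars or a spreadsheet.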

  3. A Working Framework for Enabling International Science Data System Interoperability

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  4. Digital Motion Imagery, Interoperability Challenges for Space Operations

    Grubbs, Rodney

    2012-01-01

    With advances in available bandwidth from spacecraft and between terrestrial control centers, digital motion imagery and video is becoming more practical as a data gathering tool for science and engineering, as well as for sharing missions with the public. The digital motion imagery and video industry has done a good job of creating standards for compression, distribution, and physical interfaces. Compressed data streams can easily be transmitted or distributed over radio frequency, internet protocol, and other data networks. All of these standards, however, can make sharing video between spacecraft and terrestrial control centers a frustrating and complicated task when different standards and protocols are used by different agencies. This paper will explore the challenges presented by the abundance of motion imagery and video standards, interfaces, and protocols, with suggestions for common formats that could simplify interoperability between spacecraft and ground support systems. Real-world examples from the International Space Station will be examined. The paper will also discuss recent trends in the development of new video compression algorithms, as well as the likely expanded use of Delay (or Disruption) Tolerant Networking nodes.

  5. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  6. Advances in a distributed approach for ocean model data interoperability

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
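    A toy illustration of the kind of CF-convention check that makes aggregated NetCDF output interoperable: a variable needs at least `units` and, ideally, a CF `standard_name`. Real tooling (netCDF4, xarray, dedicated CF checkers) does far more; the attribute dictionaries below are invented for the sketch.

```python
# Minimal CF-attribute sanity check over variable metadata dictionaries.

def cf_issues(var_attrs):
    """Return a list of missing CF attributes for one variable."""
    missing = []
    if "units" not in var_attrs:
        missing.append("units")
    if "standard_name" not in var_attrs and "long_name" not in var_attrs:
        missing.append("standard_name or long_name")
    return missing

sea_temp = {"units": "degC", "standard_name": "sea_water_temperature"}
mystery = {"scale_factor": 0.01}  # non-standard output lacking CF metadata

print(cf_issues(sea_temp))  # → []
print(cf_issues(mystery))   # → ['units', 'standard_name or long_name']
```

    Variables that pass such checks can be discovered and interpreted by generic CF-aware clients rather than custom scripts.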

  7. Evaluating Sustainability Models for Interoperability through Brokering Software

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  8. Standardized headings as a foundation for semantic interoperability in EHR

    Halilovic Amra

    2016-01-01

    Full Text Available The new Swedish Patient Act, which allows patients to choose health care in county councils other than their own, creates the need to share health-related information contained in electronic health records [EHRs] across county councils. This demands interoperability in terms of structured and standardized data. Headings in EHRs could also be part of structured and standardized data. The aim was to study to what extent terminology is shared and standardized across county councils in Sweden. Headings from three county councils were analyzed to see to what extent they were shared and to what extent they corresponded to concepts in SNOMED CT and the National Board of Health and Welfare's term dictionary [NBHW's TD]. In total, 41% of the headings were shared across two or three county councils. A third of the shared headings corresponded to concepts in SNOMED CT. Further, an eighth of the shared headings corresponded to concepts in NBHW's TD. The results showed that the extent of shared and standardized terminology, in terms of headings, across the three county councils studied was negligible.
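    The study's core measure can be illustrated with set operations: the share of headings used by more than one council, and the subset of those mappable to a standard terminology. Headings and the mapped subset below are invented examples, not the study's data.

```python
# Sketch: measuring heading overlap across three hypothetical councils.

council_a = {"Allergy", "Diagnosis", "Current medication"}
council_b = {"Allergy", "Diagnosis", "Social background"}
council_c = {"Diagnosis", "Care plan"}

councils = (council_a, council_b, council_c)
all_headings = council_a | council_b | council_c
shared = {h for h in all_headings
          if sum(h in c for c in councils) >= 2}

snomed_mapped = shared & {"Allergy", "Diagnosis"}  # hypothetical mappable set
share_shared = len(shared) / len(all_headings)     # 2 of 5 headings shared
```

    In the toy data, only two of five headings are shared at all, mirroring the study's finding that cross-council standardization is thin.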

  9. Making Interoperability Easier with the NASA Metadata Management Tool

    Shum, D.; Reese, M.; Pilone, D.; Mitchell, A. E.

    2016-12-01

    ISO 19115 has enabled interoperability amongst tools, yet many users find it hard to build ISO metadata for their collections because it can be large and overly flexible for their needs. The Metadata Management Tool (MMT), part of NASA's Earth Observing System Data and Information System (EOSDIS), offers users a modern, easy to use browser based tool to develop ISO compliant metadata. Through a simplified UI experience, metadata curators can create and edit collections without any understanding of the complex ISO-19115 format, while still generating compliant metadata. The MMT is also able to assess the completeness of collection level metadata by evaluating it against a variety of metadata standards. The tool provides users with clear guidance on how to change their metadata in order to improve its quality and compliance. It is based on NASA's Unified Metadata Model for Collections (UMM-C), a simpler metadata model that can be cleanly mapped to ISO 19115. This allows metadata authors and curators to meet ISO compliance requirements faster and more accurately. The MMT and UMM-C have been developed in an agile fashion, with recurring end user tests and reviews to continually refine the tool, the model and the ISO mappings. This process is allowing for continual improvement and evolution to meet the community's needs.

  10. Providing trust and interoperability to federate distributed biobanks.

    Lablans, Martin; Bartholomäus, Sebastian; Uckert, Frank

    2011-01-01

    Biomedical research requires large numbers of well annotated, quality-assessed samples which often cannot be provided by a single biobank. Connecting biobanks, researchers and service providers raises numerous challenges including trust among partners and towards the infrastructure as well as interoperability problems. Therefore we develop a holistic, open-source and easy-to-use IT infrastructure. Our federated approach allows partners to reflect their organizational structures and protect their data sovereignty. The search service and the contact arrangement processes increase data sovereignty without stigmatizing partners who reject a specific cooperation. The infrastructure supports daily processes with an integrated basic sample manager and user-definable electronic case report forms. Interfaces to existing IT systems avoid re-entry of data. Moreover, resource virtualization is supported to make underutilized resources of some partners accessible to those with insufficient equipment for mutual benefit. The functionality of the resulting infrastructure is outlined in a use-case to demonstrate collaboration within a translational research network. Compared to other existing or upcoming infrastructures, our approach has ultimately the same goals, but relies on gentle incentives rather than top-down imposed progress.

  11. Measuring interoperable EHR adoption and maturity: a Canadian example.

    Gheorghiu, Bobby; Hagens, Simon

    2016-01-25

    An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum. Each Canadian province and territory has implemented such a system and, for all, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals (approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users) indicated that they electronically access data, such as those found in provincial/territorial lab or drug information systems, in 2015. Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, how they are used, where and when. Stakeholders such as government, program leads, and health system administrators must critically assess progress and achievement of benefits, to inform future strategic and operational decisions.

  12. MPEG-4 IPMP Extension for Interoperable Protection of Multimedia Content

    Zeng Wenjun

    2004-01-01

    Full Text Available To ensure secure content delivery, the Moving Picture Experts Group (MPEG) has dedicated significant effort to digital rights management (DRM) issues. MPEG is now moving from defining only hooks to proprietary systems (e.g., in MPEG-2, MPEG-4 Version 1) to specifying a more encompassing standard in intellectual property management and protection (IPMP). MPEG feels that this is necessary in order to achieve MPEG's most important goal: interoperability. The design of the IPMP Extension framework also considers the complexity of the MPEG-4 standard and the diversity of its applications. This architecture leaves the details of the design of IPMP tools in the hands of application developers, while ensuring maximum flexibility and security. This paper first briefly describes the background of the development of the MPEG-4 IPMP Extension. It then presents an overview of the MPEG-4 IPMP Extension, including its architecture, the flexible protection signaling, and the secure messaging framework for communication between the terminal and the tools. Two sample usage scenarios are also provided to illustrate how an MPEG-4 IPMP Extension compliant system works.

  13. A Proposed Information Architecture for Telehealth System Interoperability

    Craft, R.L.; Funkhouser, D.R.; Gallagher, L.K.; Garcia, R.J.; Parks, R.C.; Warren, S.

    1999-04-20

    We propose an object-oriented information architecture for telemedicine systems that promotes secure "plug-and-play" interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a "lego-like" fashion to achieve the desired device or system functionality. Telemedicine systems today rely increasingly on distributed, collaborative information technology during the care delivery process. While these leading-edge systems are bellwethers for highly advanced telemedicine, most are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. This paper proposes a reference architecture for plug-and-play telemedicine systems that addresses these issues.

  14. A Proposed Information Architecture for Telehealth System Interoperability

    Warren, S.; Craft, R.L.; Parks, R.C.; Gallagher, L.K.; Garcia, R.J.; Funkhouser, D.R.

    1999-04-07

    Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
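    The black-box, plug-and-play idea in these two records can be sketched with one standardized interface that every component implements, so components compose without knowing each other's internals. Class and method names are invented for illustration, not part of the proposed architecture.

```python
# Sketch: components as black boxes behind one standardized interface.
from abc import ABC, abstractmethod

class Component(ABC):
    """Standardized interface every telemedicine component exposes."""
    @abstractmethod
    def process(self, message: dict) -> dict: ...

class VitalsSensor(Component):
    def process(self, message):
        return {**message, "pulse_bpm": 72}  # pretend measurement

class RecordArchiver(Component):
    def process(self, message):
        return {**message, "archived": True}  # pretend persistence

def pipeline(components, message):
    """Plug components together lego-style; each consumes the last output."""
    for c in components:
        message = c.process(message)
    return message

result = pipeline([VitalsSensor(), RecordArchiver()], {"patient_id": "p-1"})
```

    Because every component honors the same interface, a best-of-breed replacement slots in without changes to its neighbors, which is exactly the purchasing flexibility the papers argue for.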

  15. Technical Interoperability for Machine Connectivity on the Shop Floor

    Magnus Åkerman

    2018-06-01

    Full Text Available This paper presents a generic technical solution that can increase Industry 4.0 maturity by collecting data from sensors and control systems on the shop floor. Within the research project “5G-Enabled Manufacturing”, an LTE (Long-Term Evolution) network with 5G technologies was deployed on the shop floor to enable fast and scalable connectivity. This network was used to connect a grinding machine to a remote private cloud where data was stored and streamed to a data analytics center. This enabled visibility and transparency of the production data, which is the basis for Industry 4.0 and smart manufacturing. The solution is described with a focus on high-level communication technologies above wireless communication standards. These technologies are discussed regarding technical interoperability, focusing on the system layout, communication standards, and open systems. From the discussion, it can be derived that generic solutions such as this are possible, but manufacturing end-users must expand and further internalize knowledge of future information and communication technologies to reduce their dependency on equipment and technology providers.
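    A sketch of the kind of machine-to-cloud message such a setup might stream: a timestamped, self-describing JSON reading. The topic layout and field names are invented; a real deployment would follow a protocol such as MQTT or OPC UA with an agreed information model.

```python
# Sketch: serializing one shop-floor sensor reading as a JSON message.
import json

def make_reading(machine_id, signal, value, unit, ts):
    """Package a single reading for transport to the analytics backend."""
    return json.dumps({
        "machine": machine_id,
        "signal": signal,
        "value": value,
        "unit": unit,
        "timestamp": ts,
    })

msg = make_reading("grinder-01", "spindle_load", 42.5, "percent",
                   "2018-06-01T12:00:00Z")
```

    Self-describing payloads like this are one way to keep the analytics side decoupled from any single equipment vendor's format.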

  16. Internet use during childhood and the ecological techno-subsystem

    Genevieve Marie Johnson

    2008-12-01

    Full Text Available Research findings suggest both positive and negative developmental consequences of Internet use during childhood (e.g., playing video games has been associated with enhanced visual skills as well as with increased aggression). Several studies have concluded that environmental factors mediate the developmental impact of childhood online behaviour. From an ecological perspective, we propose the techno-subsystem, a dimension of the microsystem (i.e., immediate environments). The techno-subsystem includes child interaction with both living (e.g., peers) and nonliving (e.g., hardware) elements of communication, information, and recreation technologies in direct environments. By emphasizing the role of technology in child development, the ecological techno-subsystem encourages holistic exploration of the developmental consequences of Internet use (and future technological advances) during childhood.

  17. A development and integration analysis of commercial and in-house control subsystems

    Moore, D.M.; Dalesio, L.R.

    1998-01-01

    The acquisition and integration of commercial automation and control subsystems in physics research is becoming more common. It is presumed these systems present lower risk and less cost. This paper studies four subsystems used in the Accelerator Production of Tritium (APT) Low Energy Demonstration Accelerator (LEDA) at the Los Alamos National Laboratory (LANL). The radio frequency quadrupole (RFQ) resonance-control cooling subsystem (RCCS), the high-power RF subsystem, and the RFQ vacuum subsystem were outsourced; the low-level RF (LLRF) subsystem was developed in-house. Based on the authors' experience, a careful evaluation of the costs and risks in acquisition, implementation, integration, and maintenance associated with these approaches is given.

  18. Designing RF control subsystems using the VXIbus standard

    Stepp, J.D.; Vong, F.C.; Bridges, J.F.

    1993-01-01

    Various components are being designed to control the RF system of the 7-GeV Advanced Photon Source (APS). The associated control electronics (phase shifters, amplitude modulators, phase detectors, automatic tuning control, and local feedback control) are designed as modular cards with multiple channels for ease of replacement as well as for compact design. Various specifications of the VXIbus are listed, and the method used to simplify the design of the control subsystem is shown. A commercial VXI interface board was used to speed the design cycle. Required manpower and actual task times are included. A discussion of the computer architecture and software development of the device drivers, which allowed computer control from a VME processor located in a remote crate operating under the Experimental Physics and Industrial Control System (EPICS) software, is also presented.

  19. Lacie phase 1 Classification and Mensuration Subsystem (CAMS) rework experiment

    Chhikara, R. S.; Hsu, E. M.; Liszcz, C. J.

    1976-01-01

    An experiment was designed to test the ability of the Classification and Mensuration Subsystem rework operations to improve wheat proportion estimates for segments that had been processed previously. Sites selected for the experiment included three in Kansas and three in Texas, with the remaining five distributed in Montana and North and South Dakota. The acquisition dates were selected to be representative of imagery available in actual operations. No more than one acquisition per biophase was used, and biophases were determined by actual crop calendars. All sites were worked by each of four Analyst-Interpreter/Data Processing Analyst Teams, who reviewed the initial processing of each segment and accepted or reworked it for an estimate of the proportion of small grains in the segment. Classification results, acquisition and classification errors, and performance results between CAMS regular and ITS rework are tabulated.

  20. The precision segmented reflectors: Moderate mission figure control subsystem

    Sevaston, G.; Redding, D.; Lau, K.; Breckenridge, W.; Levine, B.; Nerheim, N.; Sirlin, S.; Kadogawa, H.

    1991-01-01

    A system concept for a space based segmented reflector telescope figure control subsystem is described. The concept employs a two phase architecture in which figure initialization and figure maintenance are independent functions. Figure initialization is accomplished by image sharpening using natural reference targets. Figure maintenance is performed by monitoring the relative positions and alignments of the telescope components using an optical truss. Actuation is achieved using precision positioners. Computer simulation results of figure initialization by pairwise segment coalignment/cophasing and simulated annealing are presented along with figure maintenance results using a wavefront error regulation algorithm. Both functions are shown to perform at acceptable levels for the class of submillimeter telescopes that are serving as the focus of this technology development effort. Component breadboard work as well as plans for a system testbed are discussed.
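    The simulated-annealing step mentioned for figure initialization can be sketched as a toy 1-D search for a segment offset minimizing a misalignment cost. The cost function, cooling schedule, and numbers are invented for illustration, not the subsystem's actual algorithm.

```python
# Toy simulated annealing: find a segment offset near the aligned position.
import math
import random

random.seed(1)  # deterministic run for the sketch

def cost(offset, target=0.37):
    """Hypothetical misalignment metric: distance from the aligned position."""
    return abs(offset - target)

def anneal(start, steps=2000, temp0=1.0):
    x, best = start, start
    for k in range(steps):
        temp = temp0 * (1 - k / steps) + 1e-9   # linear cooling schedule
        cand = x + random.uniform(-0.1, 0.1)    # small random perturbation
        delta = cost(cand) - cost(x)
        # Accept improvements always; accept worsenings with Boltzmann prob.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
        if cost(x) < cost(best):
            best = x
    return best

aligned = anneal(start=5.0)
```

    In the real subsystem the "cost" would be an image-sharpness metric over natural reference targets, and the search space would cover many segment degrees of freedom at once.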