WorldWideScience

Sample records for based interoperability solution

  1. Flexible solution for interoperable cloud healthcare systems.

    Science.gov (United States)

    Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena

    2012-01-01

    It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, ultimately, the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of the medical act. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can easily configure the local system and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
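
    The control described above essentially maps local database tables and fields onto the two document standards. A minimal sketch of that idea, assuming a hypothetical field map and heavily simplified element names (the real HL7 CDA schema is far richer than shown here):

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from local database columns to simplified
# CDA-like section/element names; illustrative only, not the real schema.
FIELD_MAP = {
    "pat_name": ("recordTarget", "name"),
    "pat_dob": ("recordTarget", "birthTime"),
    "diag_code": ("component", "observation"),
}

def to_cda_fragment(row: dict) -> str:
    """Render one database row as a simplified CDA-like XML fragment."""
    root = ET.Element("ClinicalDocument")
    for column, (section, element) in FIELD_MAP.items():
        if column not in row:
            continue
        sec = root.find(section)
        if sec is None:  # create the section on first use
            sec = ET.SubElement(root, section)
        ET.SubElement(sec, element).text = str(row[column])
    return ET.tostring(root, encoding="unicode")

fragment = to_cda_fragment({"pat_name": "Jane Doe", "pat_dob": "1980-05-01"})
print(fragment)
```

    In the paper's setting, hospital administrators would edit the equivalent of `FIELD_MAP` through the Web Custom Control rather than in code.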

  2. Interoperable and standard e-Health solution over Bluetooth.

    Science.gov (United States)

    Martinez, I; Del Valle, P; Munoz, P; Trigo, J D; Escayola, J; Martínez-Espronceda, M; Muñoz, A; Serrano, L; Garcia, J

    2010-01-01

    The new paradigm of e-Health demands open sensors and middleware components that permit transparent integration and end-to-end interoperability of new personal health devices. The use of standards seems to be the internationally adopted way to solve these problems. This paper presents the implementation of an end-to-end standards-based e-Health solution, including the ISO/IEEE 11073 standard for the interoperability of medical devices in the patient environment and the EN 13606 standard for the interoperable exchange of the Electronic Healthcare Record. The design strictly fulfills all the technical features of the most recent versions of both standards. The implemented prototype has been tested in a laboratory environment to demonstrate its feasibility for further transfer to the healthcare system.

  3. An Ontological Solution to Support Interoperability in the Textile Industry

    Science.gov (United States)

    Duque, Arantxa; Campos, Cristina; Jiménez-Ruiz, Ernesto; Chalmeta, Ricardo

    Significant developments in information and communication technologies and challenging market conditions have forced enterprises to adapt their way of doing business. In this context, providing mechanisms to guarantee interoperability among heterogeneous organisations has become a critical issue. Even though prolific research has already been conducted in the area of enterprise interoperability, we have found that enterprises still struggle to introduce fully interoperable solutions, especially in terms of the development and application of ontologies. Thus, the aim of this paper is to introduce basic ontology concepts in a simple manner and to explain the advantages of the use of ontologies to improve interoperability. We also present a case study showing the implementation of an application ontology for an enterprise in the textile/clothing sector.

  4. On MDA - SOA based Intercloud Interoperability framework

    Directory of Open Access Journals (Sweden)

    Tahereh Nodehi

    2013-01-01

    Cloud computing has been one of the latest technologies that assures reliable delivery of on-demand computing services over the Internet. Cloud service providers have established geographically distributed data centers and computing resources, which are available online as services. Clouds operated by different service providers, working together in collaboration, can open up many more spaces for innovative scenarios with huge amounts of resources provisioned on demand. However, current cloud systems do not support intercloud interoperability. This paper is thus motivated to address intercloud interoperability by analyzing different methodologies that have been applied to resolve various scenarios of interoperability. Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) methods have been used to address interoperability in various scenarios, which also opens up space to address intercloud interoperability by making use of these well-accepted methodologies. The focus of this document is to show that intercloud interoperability can be supported through a model-driven approach and service-oriented systems. Moreover, the current state of the art in Intercloud, and the concepts and benefits of MDA and SOA, are discussed in the paper. At the same time, this paper also proposes a generic architecture for an MDA-SOA based framework, which can be useful for developing applications that require intercloud interoperability. The paper justifies the usability of the framework with a use-case scenario for dynamic workload migration among heterogeneous clouds.

  5. INTEROPERABLE FRAMEWORK SOLUTION TO ICU HEALTH CARE MONITORING

    Directory of Open Access Journals (Sweden)

    Shola Usha Rani

    2015-03-01

    An interoperable telehealth system provides an independent healthcare solution for better management of health and wellness. It allows people to manage conditions such as heart disease and diabetes by sending health parameters like blood pressure, heart rate, glucose levels, temperature, weight, and respiration from a remote place to a health professional, and to get real-time feedback on their condition. Here, different medical devices are connected to the patient for monitoring, each kind manufactured by a different vendor, and each device's information and communication requires a different installation and network design. This causes design complexities and network overheads when moving patients for diagnostic examinations. This problem can be solved by interoperability among devices. ISO/IEEE 11073 is an international standard that produces an interoperable hospital information system solution for medical devices. One such integrated environment that requires the integration of medical devices is the ICU (Intensive Care Unit). This paper presents the issues for an ICU monitoring system and a framework solution for it.
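
    The gateway idea behind such a framework is that every bedside device, whatever its vendor, reports observations in one vendor-neutral shape. A minimal sketch, assuming invented device ids and metric names (the real ISO/IEEE 11073 Domain Information Model uses numeric nomenclature codes, not the symbolic labels shown here):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Observation:
    metric: str   # symbolic placeholder for an 11073 nomenclature term
    value: float
    unit: str

@dataclass
class BedsideMonitor:
    device_id: str
    observations: List[Observation] = field(default_factory=list)

    def report(self) -> dict:
        """Flatten observations into one vendor-neutral dict for the ICU gateway."""
        return {
            "device": self.device_id,
            "metrics": {o.metric: (o.value, o.unit) for o in self.observations},
        }

monitor = BedsideMonitor("icu-bed-07")
monitor.observations.append(Observation("heart_rate", 72.0, "bpm"))
monitor.observations.append(Observation("spo2", 97.0, "%"))
print(monitor.report())
```

    With every vendor adapter emitting this one shape, the central ICU monitor needs no per-device connector logic.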

  6. Interoperability

    DEFF Research Database (Denmark)

    Savin, Andrej

    … be limited. Fourth, data protection "by design" would be distinguished from data protection "by default". Fifth, new fundamental rights would be introduced and the old ones clarified. Sixth, new rules on controllers' and processors' duties, on supervisory authorities and on sanctions would be introduced. Finally, the Commission would obtain significant new powers to adopt delegated acts. This appendix explores the impact that the proposed Regulation might have on interoperability of user-generated services. Since the proposed Regulation is an instrument of high complexity, only those provisions of direct relevance for the project and Work Package 5 will be analysed here.

  7. Linked data for transaction based enterprise interoperability

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.

    2015-01-01

    Interoperability is of major importance in B2B environments. Starting with EDI in the '80s, interoperability currently relies heavily on XML-based standards. Although these have had great impact, issues still remain to be solved for improving B2B interoperability. These issues include lack of dynamics, cost

  8. Data interoperability software solution for emergency reaction in the Europe Union

    Science.gov (United States)

    Casado, R.; Rubiera, E.; Sacristan, M.; Schütte, F.; Peters, R.

    2015-07-01

    Emergency management becomes more challenging in international crisis episodes because of cultural, semantic and linguistic differences between all stakeholders, especially first responders. Misunderstandings between first responders make decision making slower and more difficult. However, the spread and development of networks and IT-based emergency management systems (EMSs) have improved emergency responses, which have become more coordinated. Despite improvements made in recent years, EMSs have still not solved the problems related to cultural, semantic and linguistic differences, which are the real cause of slower decision making. In addition, from a technical perspective, the consolidation of current EMSs and the different formats used to exchange information pose another problem to be solved by any solution proposed for information interoperability between heterogeneous EMSs in different contexts. To overcome these problems, we present a software solution based on semantic and mediation technologies. EMERGency ELements (EMERGEL) (Fundacion CTIC and AntwortING Ingenieurbüro PartG, 2013), a common and modular ontology shared by all the stakeholders, has been defined. It offers the best solution to gather all stakeholders' knowledge in a unique and flexible data model, taking into account different countries' cultural and linguistic issues. To deal with the diversity of data protocols and formats, we have designed a service-oriented architecture for data interoperability (named DISASTER: Data Interoperability Solution At STakeholders Emergency Reaction), providing a flexible, extensible solution to the mediation issues. Web services have been adopted as the specific technology to implement this paradigm, as they have the most significant academic and industrial visibility and attraction.
Contributions of this work have been validated through the design and development of a cross-border realistic prototype scenario, actively involving both emergency managers and emergency
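
    The mediation step the abstract describes amounts to translating each system's local vocabulary through a shared ontology concept. A toy sketch of that pivot translation, with invented term tables (the real EMERGEL ontology is far larger and covers many languages):

```python
# Source-EMS vocabulary -> shared ontology concept (illustrative entries).
TO_SHARED = {
    "feuerwehr": "fire_brigade",
    "rettungswagen": "ambulance",
}
# Shared ontology concept -> target-EMS vocabulary (illustrative entries).
FROM_SHARED = {
    "fire_brigade": "bomberos",
    "ambulance": "ambulancia",
}

def mediate(term: str) -> str:
    """Translate a source-system term to the target system via the shared concept."""
    concept = TO_SHARED.get(term.lower())
    if concept is None:
        raise KeyError(f"no shared concept for {term!r}")
    return FROM_SHARED[concept]

print(mediate("Feuerwehr"))  # German resource name rendered in the Spanish system
```

    The value of the shared ontology is that adding an nth system requires only one mapping to the pivot, not n-1 pairwise translation tables.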

  9. IHE based interoperability - benefits and challenges.

    Science.gov (United States)

    Wozak, Florian; Ammenwerth, Elske; Hörbst, Alexander; Sögner, Peter; Mair, Richard; Schabetsberger, Thomas

    2008-01-01

    Optimized workflows and communication between institutions involved in a patient's treatment process can lead to improved quality and efficiency in the healthcare sector. Electronic Health Records (EHRs) provide patient-centered access to clinical data across institutional boundaries, supporting the above-mentioned aspects. Interoperability is regarded as a vital success factor. However, a clear definition of interoperability does not exist. The aim of this work is to define and to assess interoperability criteria as required for EHRs. The definition and assessment of interoperability criteria is supported by the analysis of existing literature and personal experience, as well as by discussions with several domain experts. Criteria for interoperability address the following aspects: interfaces, semantics, legal and organizational aspects, and security. The Integrating the Healthcare Enterprise (IHE) initiative's profiles make a major contribution to these aspects, but they also raise new problems. Flexibility for adaptation to different organizational, regional or other specific conditions is missing. Regional or national initiatives should get a possibility to realize their specific needs within the boundaries of IHE profiles. Security so far is an optional element, which is one of IHE's greatest omissions. An integrated security approach seems preferable. Irrespective of the practical significance of the IHE profiles so far, it appears to be of great importance that the profiles are constantly checked against practical experience and continuously adapted.

  10. The GEOSS solution for enabling data interoperability and integrative research.

    Science.gov (United States)

    Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola

    2014-03-01

    Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have been focusing on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standards, models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. In the paper, the GEOSS common infrastructure is discussed as implementing a multidisciplinary and participatory research infrastructure, introducing a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observations community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.

  11. Interoperability for electronic ID

    OpenAIRE

    Zygadlo, Zuzanna

    2009-01-01

    Electronic business, including eBanking, eCommerce and eGovernmental services, is today based on a large variety of security solutions, comprising electronic IDs provided by a broad community of Public Key Infrastructure (PKI) vendors. Significant differences in the implementations of those solutions introduce a problem of lack of interoperability in electronic business, which has not yet been resolved by standardization and interoperability initiatives based on existing PKI trust models. …

  12. DIMP: an interoperable solution for software integration and product data exchange

    Science.gov (United States)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business, and it has led to a world-wide decentralisation of resources amongst not only individual departments within one company but also business partners. However, despite the development and improvement of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the Distributed Interoperable Manufacturing Platform (DIMP), which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  13. Author identities an interoperability problem solved by a collaborative solution

    Science.gov (United States)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2012-12-01

    The identity of authors and data providers is crucial for personalized interoperability. The marketplace of available identifiers is packed, and the right choice is getting more and more complicated. Even though there are more than 15 different systems available, some are still under development and proposed to come up by the end of 2012 ('PubMed Central Author ID' and ORCID). Data management on a scale beyond the size of a single research institute, at the scale of a scientific site including a university with a student education program, needs to tackle this problem, and so did the Kiel Data Management Infrastructure. The main problem with the identities of researchers is the high frequency of changes in position during a scientist's life. The required system needed to already contain a pool of preregistered people with their scientific publications from other countries, institutions and organizations. Scanning the author-ID marketplace revealed a high risk of additional workload for the researchers themselves or for the administration, because individuals need to register an ID for themselves, or the chosen register is not yet big enough to simply find the right entry. On the other hand, libraries have dealt with authors and their publications for centuries, and they have high-quality catalogs of person identities already available. Millions of internationally mapped records are available through collaboration with libraries and can be used in exactly the same scope. The international collaboration between libraries (VIAF) provides a mapping between libraries from the US, CA, UK, FR, GER and many more. The international library author identification system made it possible to match 60% of all scientists at the first attempt. An additional advantage is that librarians can finalize the identity system in a kind of background process.
The Kiel Data Management Infrastructure initiated a web service

  14. IoT interoperability : a hub-based approach

    OpenAIRE

    Blackstock, Michael; Lea, Rodger

    2014-01-01

    Interoperability in the Internet of Things is critical for emerging services and applications. In this paper we advocate the use of IoT ‘hubs’ to aggregate things using web protocols, and suggest a staged approach to interoperability. In the context of a UK government funded project involving 8 IoT projects to address cross-domain IoT interoperability, we introduce the HyperCat IoT catalogue specification. We then describe the tools and techniques we developed to adapt an existing data portal...
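
    A HyperCat hub advertises its aggregated things as a JSON catalogue of items, each carrying an `href` and `rel`/`val` metadata pairs. A minimal sketch following our reading of the published HyperCat layout; the URNs, description text and sensor URL are illustrative:

```python
import json

catalogue = {
    "catalogue-metadata": [
        {"rel": "urn:X-hypercat:rels:isContentType",
         "val": "application/vnd.hypercat.catalogue+json"},
        {"rel": "urn:X-hypercat:rels:hasDescription:en",
         "val": "Demo hub catalogue"},
    ],
    "items": [
        {"href": "http://example.org/sensors/temp-1",
         "item-metadata": [
             {"rel": "urn:X-hypercat:rels:hasDescription:en",
              "val": "Rooftop temperature sensor"},
         ]},
    ],
}

def hrefs(cat: dict) -> list:
    """List every resource URL advertised by the catalogue."""
    return [item["href"] for item in cat["items"]]

payload = json.dumps(catalogue, indent=2)  # what the hub would serve over HTTP
print(hrefs(catalogue))
```

    Because the catalogue is plain JSON over web protocols, a client can discover resources from any compliant hub without device-specific connectors.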

  15. eHealth integration and interoperability issues: towards a solution through enterprise architecture.

    Science.gov (United States)

    Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred

    2015-01-01

    Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase. This is creating immense pressure on healthcare ICT and HIS to deliver and show the significance of such investments in technology. This study finds that integration and interoperability contribute largely to the failure of ICT and HIS investment in healthcare, resulting in the need for a healthcare architecture for eHealth. This study proposes an eHealth architectural model that accommodates requirements based on healthcare needs as well as system, implementer, and hardware requirements. The model is adaptable and examines the developer's and user's views that systems hold high hopes for their potential to change traditional organizational design, intelligence, and decision-making.

  16. Towards an enterprise interoperability framework

    CSIR Research Space (South Africa)

    Kotzé, P

    2010-06-01

    This paper presents relevant interoperability approaches and solutions applied to global/international networked (collaborative) enterprises or organisations and conceptualises an enhanced enterprise interoperability framework. The paper covers...

  17. Policy-Based Negotiation Engine for Cross-Domain Interoperability

    Science.gov (United States)

    Vatan, Farrokh; Chow, Edward T.

    2012-01-01

    A successful policy negotiation scheme for Policy-Based Management (PBM) has been implemented. Policy negotiation is the process of determining the "best" communication policy that all of the parties involved can agree on. Specifically, the problem is how to reconcile the various (and possibly conflicting) communication protocols used by different divisions. The solution must use protocols available to all parties involved, and should attempt to do so in the best way possible. Which protocols are commonly available, and what the definition of "best" is will be dependent on the parties involved and their individual communications priorities.
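
    The negotiation step described above reduces to finding the protocols every party supports and picking the "best" one under some agreed ranking. A minimal sketch, assuming each party supplies an ordered preference list and that "best" means lowest combined rank (party and protocol names are invented):

```python
def negotiate(preferences: dict) -> str:
    """preferences maps party -> ordered protocol list (most preferred first)."""
    common = set.intersection(*(set(p) for p in preferences.values()))
    if not common:
        raise ValueError("no mutually supported protocol")
    # Combined rank: sum of each party's preference index for the protocol.
    return min(common,
               key=lambda proto: sum(p.index(proto) for p in preferences.values()))

parties = {
    "division_a": ["tls13", "tls12", "ipsec"],
    "division_b": ["ipsec", "tls12"],
    "division_c": ["tls12", "ipsec", "ssh"],
}
print(negotiate(parties))
```

    Real PBM engines weigh richer policy attributes than a single ordered list, but the intersect-then-rank shape of the decision is the same.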

  18. A Proposed Engineering Process and Prototype Toolset for Developing C2-to-Simulation Interoperability Solutions

    NARCIS (Netherlands)

    Gautreau, B.; Khimeche, L.; Reus, N.M. de; Heffner, K.; Mevassvik, O.M.

    2014-01-01

    The Coalition Battle Management Language (C-BML) is an open standard being developed for the exchange of digitized military information among command and control (C2), simulation and autonomous systems by the Simulation Interoperability Standards Organization (SISO). As the first phase of the C-BML

  19. The role of architecture and ontology for interoperability.

    Science.gov (United States)

    Blobel, Bernd; González, Carolina; Oemig, Frank; Lopéz, Diego; Nykänen, Pirkko; Ruotsalainen, Pekka

    2010-01-01

    Turning from organization-centric to process-controlled or even personalized approaches, advanced healthcare settings have to meet special interoperability challenges. eHealth and pHealth solutions must assure interoperability between actors cooperating to achieve common business objectives. Here, the interoperability chain includes not only individually tailored technical systems, but also sensors and actuators. For enabling corresponding pervasive computing and even autonomic computing, individualized systems have to be based on an architecture framework covering many domains, scientifically managed by specialized disciplines using their specific ontologies in a formalized way. Therefore, interoperability has to advance from a communication-protocol to an architecture-centric approach, mastering ontology coordination challenges.

  20. Profiling Fast Healthcare Interoperability Resources (FHIR) of Family Health History based on the Clinical Element Models

    OpenAIRE

    Lee, Jaehoon; Hulse, Nathan C.; Wood, Grant M.; Oniki, Thomas A.; Huff, Stanley M.

    2017-01-01

    In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full-pedigree-based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM ...
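
    FHIR exchanges pedigree entries as `FamilyMemberHistory` resources. A minimal sketch of such a resource as JSON; the field names follow the FHIR specification, while the ids, patient reference, and the condition values are illustrative (a profile like the one described would further constrain these elements):

```python
import json

fmh = {
    "resourceType": "FamilyMemberHistory",
    "id": "fmh-example-1",
    "status": "completed",
    "patient": {"reference": "Patient/example"},   # illustrative reference
    "relationship": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
            "code": "MTH",
            "display": "mother",
        }]
    },
    "condition": [{
        "code": {"text": "Type 2 diabetes mellitus"},
        "onsetAge": {"value": 52, "unit": "a"},
    }],
}

payload = json.dumps(fmh, indent=2)  # body for POST [base]/FamilyMemberHistory
print(payload)
```

    A profile constrains which of these elements are mandatory and which value sets they draw from, which is what makes the same resource interpretable across systems.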

  1. MDA-based interoperability establishment using language independent information models

    OpenAIRE

    Agostinho C.; Cerny J.; Jardim-Goncalves R.

    2012-01-01

    Part 2: Full Papers; International audience; Nowadays, more and more enterprises realize that one important step to success in their business is to create new and innovative products. Often the solution is to abandon the idea of the enterprise as an "isolated island" and to collaborate with others: worldwide non-hierarchical networks are characterized by collaboration and non-centralized decision making. This paper proposes a conceptual model common to the entire business n...

  2. Standards-based Content Resources: A Prerequisite for Content Integration and Content Interoperability

    Directory of Open Access Journals (Sweden)

    Christian Galinski

    2010-05-01

    Objective: to show how standards-based approaches for content standardization, content management, content-related services and tools, as well as the respective certification systems, not only guarantee reliable content integration and content interoperability, but are also of particular benefit to people with special needs in eAccessibility/eInclusion. Method: document MoU/MG/05 N0221, "Semantic interoperability and the need for a coherent policy for a framework of distributed, possibly federated repositories for all kinds of content items on a world-wide scale", adopted in 2005, was a first step towards the formulation of global interoperability requirements for structured content. These requirements, based on advanced terminological principles, were taken up in EU projects such as IN-SAFETY (INfrastructure and SAFETY) and OASIS (Open architecture for Accessible Services Integration and Standardization). Results: content integration and content interoperability are key concepts in connection with the emergence of state-of-the-art distributed and federated databases/repositories of structured content. Given the fact that linguistic content items are increasingly combined with or embedded in non-linguistic content items (and vice versa), a systemic and generic approach to data modelling and content management has become the order of the day. Fulfilling the requirements of capability for multilinguality and multimodality based on open standards makes software and database design fit for eAccessibility/eInclusion from the outset. It also makes structured content capable of global content integration and content interoperability, because it enhances its potential for being re-used and re-purposed in totally different eApplications. Such content, as well as the methods, tools and services applied, can be subject to new kinds of certification schemes, which should also be based on standards.
Conclusions: Content must be totally reliable in some

  3. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of a standard model while retaining compliance with it is a challenging issue, because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR XML files and one CCR+ XML file were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.
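
    The multilayered validation idea can be sketched as two passes: one against the (here, drastically simplified) standard model and one against a metadata registry for the extended elements. Registry entries and field names below are invented for illustration; the real CCR/CCR+ registry holds the 188 interconnected metadata items mentioned above:

```python
# Layer 2 registry: extended element name -> allowed value type (illustrative).
REGISTRY = {
    "bloodType": str,
    "stepCount": int,
}
# Layer 1: required fields of the simplified "standard" model (illustrative).
REQUIRED_STANDARD_FIELDS = ("patientId", "dateOfBirth")

def validate(record: dict) -> list:
    """Return a list of validation errors; an empty list means valid."""
    errors = []
    for field_name in REQUIRED_STANDARD_FIELDS:            # layer 1: standard model
        if field_name not in record:
            errors.append(f"missing standard field: {field_name}")
    for name, value in record.get("extensions", {}).items():  # layer 2: registry
        expected = REGISTRY.get(name)
        if expected is None:
            errors.append(f"extension not in metadata registry: {name}")
        elif not isinstance(value, expected):
            errors.append(f"extension {name} has wrong type")
    return errors

ok = validate({"patientId": "p1", "dateOfBirth": "1975-02-11",
               "extensions": {"bloodType": "O+"}})
bad = validate({"patientId": "p2", "extensions": {"heightLog": []}})
print(ok, bad)
```

    Keeping the extension rules in a registry rather than in code is what lets the same validator accept new extended elements without redeployment.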

  4. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill organizations' specific needs. As such, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access. Early in the nineties, the development of interoperability of geographic information was undertaken to solve syntactic, structural, and semantic heterogeneities, as well as spatial and temporal heterogeneities, in order to facilitate sharing and integration of such data. Recently, we have proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to qualify dynamically the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information, as well as a prototype.

  5. Model of the naval base logistic interoperability within the multinational operations

    Directory of Open Access Journals (Sweden)

    Bohdan Pac

    2011-12-01

    The paper concerns a model of naval base logistic interoperability within multinational operations conducted at sea by NATO or EU nations. The model includes the set of logistic requirements that NATO and the EU expect from the contributing nations in the area of logistic support provided to forces operating out of their home bases. The model may reflect the scheme configuration, the set of requirements and its mathematical description for a naval base supporting multinational forces in maritime operations.

  6. Design and Implement AN Interoperable Internet of Things Application Based on AN Extended Ogc Sensorthings Api Standard

    Science.gov (United States)

    Huang, C. Y.; Wu, C. H.

    2016-06-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked into multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining the tasking capability profile and integrating it with the SensorThings API standard, which we call the extended SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. 
    Through the extended SensorThings API, users and applications can follow a coherent protocol to control IoT…
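
    In a SensorThings-style tasking model, controlling a device means POSTing a Task that references a tasking capability and carries the tasking parameters. A sketch of building such a request, without sending it; the endpoint, entity names, ids and parameters are illustrative assumptions based on the tasking model described above, not a verbatim rendering of the standard:

```python
import json

BASE = "http://example.org/v1.0"  # illustrative SensorThings-style endpoint

def make_task(capability_id: int, params: dict) -> tuple:
    """Build the URL and JSON body for a Task that drives a device.

    Entity and property names (Tasks, TaskingCapability, taskingParameters)
    follow the tasking model sketched above; values are illustrative.
    """
    url = f"{BASE}/Tasks"
    body = {
        "taskingParameters": params,
        "TaskingCapability": {"@iot.id": capability_id},
    }
    return url, json.dumps(body)

url, body = make_task(42, {"switch": "on"})
print(url)
print(body)
```

    The point of the coherent protocol is exactly this: the client code above does not change when the device behind capability 42 comes from a different manufacturer.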

  7. Toward an Interoperability Architecture

    National Research Council Canada - National Science Library

    Buddenberg, Rex

    2001-01-01

    … The continued burgeoning of the Internet constitutes an existence proof. But a common networking base is insufficient to reach a goal of cross-system interoperability: the large information system…

  8. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    Science.gov (United States)

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and
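The BioC-based inter-process communication described above can be illustrated with a minimal sketch: a client wraps an article in a BioC XML envelope before POSTing it to a remote NER web service. The element names follow the public BioC structure (collection/document/passage); the document ID and passage text are invented, and the service URL is omitted since each participating group hosted its own:

```python
import xml.etree.ElementTree as ET

def make_bioc_collection(doc_id, passage_text):
    """Wrap one passage of text in a minimal BioC collection document."""
    collection = ET.Element("collection")
    ET.SubElement(collection, "source").text = "CTD"
    document = ET.SubElement(collection, "document")
    ET.SubElement(document, "id").text = doc_id
    passage = ET.SubElement(document, "passage")
    ET.SubElement(passage, "offset").text = "0"   # character offset of the passage
    ET.SubElement(passage, "text").text = passage_text
    return ET.tostring(collection, encoding="unicode")

# Invented example article; a real pipeline would POST this payload to the
# group's REST endpoint and parse the annotated BioC document it returns.
xml_payload = make_bioc_collection("PMID:12345", "Aspirin reduces inflammation.")
print(xml_payload)
```

Because every service speaks the same envelope format, the pipeline needs no per-tool configuration, which is exactly the modularity the abstract argues for.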

  9. The impact of SOA for achieving healthcare interoperability. An empirical investigation based on a hypothetical adoption.

    Science.gov (United States)

    Daskalakis, S; Mantas, J

    2009-01-01

    This paper evaluates a service-oriented prototype implementation for healthcare interoperability. A prototype framework was developed, aiming to exploit the use of service-oriented architecture (SOA) concepts for achieving healthcare interoperability and to move towards a virtual patient record (VPR) paradigm. The prototype implementation was evaluated for its hypothetical adoption. The evaluation strategy was based on the initial proposition of the DeLone and McLean model of information systems (IS) success [1], as modeled by Iivari [2]. A set of SOA and VPR characteristics were empirically encapsulated within the dimensions of the IS success model, combined with measures from previous research works. The data gathered were analyzed using partial least squares (PLS). The results highlighted that system quality is a partial predictor of system use but not of user satisfaction. On the contrary, information quality proved to be a significant predictor of user satisfaction and, in part, a strong significant predictor of system use. Moreover, system use did not prove to be a significant predictor of individual impact, whereas the bi-directional relation between use and user satisfaction was not confirmed. Additionally, user satisfaction was found to be a strong significant predictor of individual impact. Finally, individual impact proved to be a strong significant predictor of organizational impact. The empirical study attempted to obtain hypothetical, but still useful, beliefs and perceptions regarding the SOA prototype implementation. The deduced observations can form the basis for further investigation regarding the adaptability of SOA implementations with VPR characteristics in the healthcare domain.

  10. Research on key technologies for data-interoperability-based metadata, data compression and encryption, and their application

    Science.gov (United States)

    Yu, Xu; Shao, Quanqin; Zhu, Yunhai; Deng, Yuejin; Yang, Haijun

    2006-10-01

    With the development of informationization and the separation between data management departments and application departments, spatial data sharing has become one of the most important objectives of spatial information infrastructure construction, and spatial metadata management systems, data transmission security, and data compression are the key technologies for realizing spatial data sharing. This paper discusses the key metadata technologies for data interoperability; investigates data compression algorithms such as adaptive Huffman coding and the LZ77 and LZ78 algorithms; and studies the application of digital signature techniques to spatial data, which can not only identify the transmitter of the data but also promptly detect whether the data were tampered with during network transmission. Based on an analysis of the symmetric encryption algorithms 3DES and AES and the asymmetric encryption algorithm RSA, combined with a hash algorithm, it presents an improved hybrid encryption method for spatial data. Digital signature technology and digital watermarking technology are also discussed. A new solution for spatial data network distribution is then put forward, which adopts a three-layer architecture. Based on this framework, we present a spatial data network distribution system that is efficient and safe, and we demonstrate the feasibility and validity of the proposed solution.
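The compression techniques the paper analyses can be demonstrated with Python's standard library: zlib implements DEFLATE, which combines LZ77-style back-references with Huffman coding, and hashlib supplies a hash for the integrity check. A minimal sketch with invented sample records:

```python
import hashlib
import zlib

# Highly repetitive coordinate records, the kind of spatial payload where
# LZ77 back-references pay off (sample data is invented).
data = b"lat=45.0,lon=9.0;" * 200

# DEFLATE = LZ77 + Huffman coding, the two algorithm families the paper studies.
compressed = zlib.compress(data, level=9)

# A hash digest lets the receiver detect tampering during transmission
# (a full scheme would sign this digest with an RSA private key).
digest = hashlib.sha256(compressed).hexdigest()

assert zlib.decompress(compressed) == data   # lossless round trip
print(len(data), "->", len(compressed), "bytes; sha256:", digest[:16], "...")
```

The paper's hybrid scheme goes one step further: encrypt the bulk data with a symmetric cipher (AES) and protect only the small symmetric key with the slower asymmetric RSA.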

  11. AN INTEROPERABLE ARCHITECTURE FOR AIR POLLUTION EARLY WARNING SYSTEM BASED ON SENSOR WEB

    Directory of Open Access Journals (Sweden)

    F. Samadzadegan

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both technical and scientific points of view. Ever-increasing population growth in urban areas has caused problems in developing countries that directly or indirectly impact human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research

  12. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    Science.gov (United States)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both technical and scientific points of view. Ever-increasing population growth in urban areas has caused problems in developing countries that directly or indirectly impact human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is examined for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposed an
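A client of such an SWE-based system would typically talk to a Sensor Observation Service. Below is a sketch of building a KVP (key-value pair) GetObservation request; the endpoint, offering, and observed-property URNs are hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint of an air-quality sensor network.
SOS_ENDPOINT = "http://sensors.example.org/sos"

# Standard OGC SOS 2.0 KVP parameters; the offering and property
# identifiers below are invented for illustration.
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "urn:example:offering:air_quality",
    "observedProperty": "urn:example:property:PM2.5",
}

url = SOS_ENDPOINT + "?" + urlencode(params)
print(url)
```

Because every SWE-compliant service accepts the same request vocabulary, a dashboard or early-warning component can query sensors from different operators without custom adapters.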

  13. Profiling Fast Healthcare Interoperability Resources (FHIR) of Family Health History based on the Clinical Element Models.

    Science.gov (United States)

    Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M

    2016-01-01

    In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging a full pedigree based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing the FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source based FHIR framework and validated it using patient-entered FHH data that was captured through a locally developed FHH tool.

  14. Interoperable Solution for Test Execution in Various I&T Environments

    Science.gov (United States)

    Lee, Young H.; Bareh, Magdy S.

    2006-01-01

    When there is spacecraft collaboration between several industry partners, there is an inherent difference in integration and test (I&T) methodologies, which creates a challenge for verifying flight systems during the development phase. To converge the differing I&T methodologies, considerations were required for multiple project areas such as the Flight System Testbed (FST), Assembly, Test, and Launch Operations (ATLO), and Spacecraft Simulator environments. This paper details the challenges and approaches of JPL's effort in engineering a solution for testing the flight system with the Mission Operations Ground System while maintaining compatibility with the testing methods of the industry partners.

  15. Lemnos Interoperable Security Program

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, John [Tennessee Valley Authority, Knoxville, TN (United States); Halbgewachs, Ron [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chavez, Adrian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Rhett [Schweitzer Engineering Laboratories, Chattanooga, TN (United States); Teumim, David [Teumim Technical, Allentown, PA (United States)

    2012-01-31

    The manner in which control systems are designed and operated in the energy sector is undergoing some of the most significant changes in its history, due to the evolution of technology and the increasing number of interconnections to other systems. With these changes, however, come two significant challenges that the energy sector must face: 1) cyber security is more important than ever before, and 2) cyber security is more complicated than ever before. A key requirement in helping utilities and vendors alike to meet these challenges is interoperability. While interoperability has been present in much of the discussion of technology utilized within the energy sector, and especially the Smart Grid, it has been absent in the context of cyber security. The Lemnos project addresses these challenges by focusing on the interoperability of devices utilized within utility control systems which support critical cyber security functions. In theory, interoperability is possible with many of the cyber security solutions available to utilities today. The reality is that the effort required to achieve cyber security interoperability is often a barrier for utilities. For example, consider IPsec, a widely-used Internet protocol suite for defining Virtual Private Networks, or 'tunnels', to communicate securely through untrusted public and private networks. The IPsec protocol suite has a significant number of configuration options and encryption parameters to choose from, which must be agreed upon and adopted by both parties establishing the tunnel. The exercise of getting software or devices from different vendors to interoperate is labor intensive and requires a significant amount of security expertise on the part of the end user. Scale this effort to a significant number of devices operating over a large geographical area and the challenge becomes so overwhelming that it often leads utilities to pursue solutions from a single vendor. 
These single vendor solutions may inadvertently lock
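The configuration-matching burden described in the abstract reduces to a set-intersection problem: a tunnel is only possible if both endpoints share at least one complete proposal of cipher, integrity, and key-exchange options. A toy sketch with illustrative option names:

```python
# Each endpoint advertises (cipher, integrity hash, DH group) proposals.
# The option sets below are invented for illustration.
vendor_a = {
    ("AES-256", "SHA-256", "DH-14"),
    ("AES-128", "SHA-1", "DH-2"),
    ("3DES", "SHA-1", "DH-2"),
}
vendor_b = {
    ("AES-256", "SHA-256", "DH-14"),
    ("AES-128", "SHA-256", "DH-14"),
}

# Interoperability requires a non-empty intersection of complete proposals.
common = vendor_a & vendor_b
if common:
    proposal = sorted(common)[0]  # deterministic pick among shared proposals
    print("tunnel can be established with:", proposal)
else:
    print("no common proposal: manual reconfiguration needed")
```

Lemnos-style interoperability profiles effectively pin down a shared proposal set in advance, so devices from different vendors land in the non-empty-intersection case without per-pair manual tuning.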

  16. A Proposed Engineering Process and Prototype Toolset for Developing C2-to-Simulation Interoperability Solutions

    NARCIS (Netherlands)

    Johnsen, F.T.; Bloebaum, T.H.; Meiler, P.P.; Owens, I.; Barz, C.; Jansen, N.

    2013-01-01

    Service Oriented Architecture (SOA) can enable agile C2 functionality. The flexibility and loose coupling offered by the SOA paradigm means that both NATO and many of the NATO nations are basing their future information infrastructures on this paradigm. Web services, the most common and mature

  17. Nato Multinational Brigade Interoperability: Issues, Mitigating Solutions and is it Time for a Nato Multinational Brigade Doctrine?

    Directory of Open Access Journals (Sweden)

    Schiller Mark

    2016-06-01

    Multinational Brigade Operations involving NATO and its European Partners are the norm in the post-Cold War era. Commonplace today are multinational brigades, composed of staffs and subordinate units representing almost every NATO country and partner, participating in training exercises or actual operations in both the European and Southwest Asian theatres. Leadership challenges are prevalent for the multinational brigade commander and his staff, especially those they face in achieving an effective level of brigade interoperability in order to conduct successful operations in NATO's present and future operating environments. The purpose of this paper is twofold: to examine the major interoperability obstacles a multinational brigade commander and his staff are likely to encounter during the planning and execution of brigade operations; and to recommend actions and measures a multinational brigade commander and his staff can implement to facilitate interoperability in a multinational brigade operating environment. Several key interoperability topics considered integral to effective multinational brigade operations are examined and analysed, to include understanding partner unit capabilities and limitations facilitated by an integration plan, appropriate command and support relationships, compatible communications, synchronized intelligence and information collection, establishing effective liaison, and fratricide prevention. The paper concludes by urging a NATO land brigade doctrine, considering doctrine's critical importance to effective brigade command and control interoperability and the missions a land brigade can expect to encounter in future NATO operating environments as part of the NATO Very High Readiness Joint Task Force (VJTF).

  18. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based, object-oriented component approach to open, interoperable software development and software reuse.

  19. Open Source Interoperability: It's More than Technology

    Directory of Open Access Journals (Sweden)

    Dominic Sartorio

    2008-01-01

    The Open Solutions Alliance is a consortium of leading commercial open source vendors, integrators, and end users dedicated to the growth of open source based solutions in the enterprise. We believe Linux and other infrastructure software, such as Apache, have become mainstream, and packaged solutions represent the next great growth opportunity. However, some unique challenges can temper that opportunity. These challenges include getting the word out about the maturity and enterprise-readiness of those solutions, ensuring interoperability both with each other and with other proprietary and legacy solutions, and ensuring healthy collaboration between vendors and their respective customer and developer communities.

  20. Grid interoperability: the interoperations cookbook

    Energy Technology Data Exchange (ETDEWEB)

    Field, L; Schulz, M [CERN (Switzerland)], E-mail: Laurence.Field@cern.ch, E-mail: Markus.Schulz@cern.ch

    2008-07-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.

  1. Grid interoperability: the interoperations cookbook

    International Nuclear Information System (INIS)

    Field, L; Schulz, M

    2008-01-01

    Over recent years a number of grid projects have emerged which have built grid infrastructures that are now the computing backbones for various user communities. A significant number of these communities are limited to one grid infrastructure due to the different middleware and procedures used in each grid. Grid interoperation is trying to bridge these differences and enable virtual organizations to access resources independent of the grid project affiliation. This paper gives an overview of grid interoperation and describes the current methods used to bridge the differences between grids. Actual use cases encountered during the last three years are discussed and the most important interfaces required for interoperability are highlighted. A summary of the standardisation efforts in these areas is given and we argue for moving more aggressively towards standards.

  2. The Italian Cloud-based brokering Infrastructure to sustain Interoperability for Operative Hydrology

    Science.gov (United States)

    Boldrini, E.; Pecora, S.; Bussettini, M.; Bordini, F.; Nativi, S.

    2015-12-01

    This work presents the informatics platform developed to implement the National Hydrological Operative Information System of Italy. In particular, the presentation will focus on the governance aspects of the cloud infrastructure and brokering software that make it possible to sustain the hydrology data flow between heterogeneous user clients and data providers. The Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), in collaboration with the Regional Agency for Environmental Protection in the Emilia-Romagna region, ARPA-ER (Agenzia Regionale per la Prevenzione e l'Ambiente dell'Emilia-Romagna), and CNR-IIA (National Research Council of Italy), designed and developed an innovative platform for the discovery and access of hydrological data coming from 19 Italian administrative regions and 2 Italian autonomous provinces, in near real time. ISPRA has deployed and governs this system. The presentation will introduce and discuss the technological barriers to interoperability as well as social and policy ones. The adopted solutions will be described, outlining the sustainability challenges and benefits.

  3. Buildings Interoperability Landscape

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Corbin, Charles D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Widergren, Steven E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-12-31

    Through its Building Technologies Office (BTO), the United States Department of Energy’s Office of Energy Efficiency and Renewable Energy (DOE-EERE) is sponsoring an effort to advance interoperability for the integration of intelligent buildings equipment and automation systems, understanding the importance of integration frameworks and product ecosystems to this cause. This is important to BTO’s mission to enhance energy efficiency and save energy for economic and environmental purposes. For connected-buildings ecosystems of products and services from various manufacturers to flourish, the information and communication technology (ICT) aspects of the equipment need to integrate and operate simply and reliably. Within the concept of interoperability lie the specification, development, and certification of equipment with standards-based interfaces that connect and work. Beyond this, a healthy community of stakeholders that contribute to and use interoperability work products must be developed. On May 1, 2014, the DOE convened a technical meeting to take stock of the current state of interoperability of connected equipment and systems in buildings. Several insights from that meeting helped facilitate a draft description of the landscape of interoperability for connected buildings, which focuses mainly on small and medium commercial buildings. This document revises the February 2015 landscape document to address reviewer comments, incorporate important insights from the Buildings Interoperability Vision technical meeting, and capture thoughts from that meeting about the topics to be addressed in a buildings interoperability vision. In particular, greater attention is paid to the state of information modeling in buildings and the great potential for near-term benefits in this area from progress and community alignment.

  4. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    National Research Council Canada - National Science Library

    Tran, Tam

    2000-01-01

    There is a need for Commercial-off-the-shelf (COTS), Government-off-the-shelf (GOTS) and legacy components to interoperate in a secure distributed computing environment in order to facilitate the development of evolving applications...

  5. Network Function Virtualization (NFV) based architecture to address connectivity, interoperability and manageability challenges in Internet of Things (IoT)

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Faris Ismail, Ahmad

    2017-11-01

    IoT aims to interconnect sensors and actuators built into devices (also known as Things) so that they can share data and control each other to improve existing processes and make people’s lives better. IoT aims to connect all physical devices, such as fridges, cars, utilities, buildings, and cities, so that they can take advantage of the small pieces of information collected by each of these devices and derive more complex decisions. However, these devices are heterogeneous in nature because of varying vendor support, connectivity options, and protocol suites. The heterogeneity of such devices makes it difficult for them to leverage each other’s capabilities in the traditional IoT architecture. This paper highlights the effects of heterogeneity challenges on connectivity, interoperability, and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to solve the challenges of massive IoT deployments. Finally, the paper proposes a new architecture based on NFV to address these problems.

  6. Semantically Interoperable XML Data.

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-09-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups.
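The semantic-validation idea can be sketched as checking that annotated XML elements carry the ontology concept identifiers a data model expects. The annotation attribute name and concept IDs below are illustrative assumptions, not the paper's actual encoding:

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from element tags to the ontology concept each
# common data element is linked to (IDs are invented for illustration).
EXPECTED_CONCEPTS = {"finding": "RADLEX:RID5741", "anatomy": "FMA:7195"}

# An XML instance whose elements carry semantic annotations as attributes.
doc = ET.fromstring(
    '<annotation>'
    '  <finding concept="RADLEX:RID5741">mass</finding>'
    '  <anatomy concept="FMA:7195">liver</anatomy>'
    '</annotation>'
)

def semantically_valid(root, expected):
    """True if every annotated element maps to the concept the model expects."""
    return all(
        elem.get("concept") == concept
        for tag, concept in expected.items()
        for elem in root.iter(tag)
    )

print(semantically_valid(doc, EXPECTED_CONCEPTS))
```

Syntactic validation (against the XML Schema) and this semantic check together are what lets independently produced data sources be integrated with confidence.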

  7. Semantically Interoperable XML Data

    Science.gov (United States)

    Vergara-Niedermayr, Cristobal; Wang, Fusheng; Pan, Tony; Kurc, Tahsin; Saltz, Joel

    2013-01-01

    XML is ubiquitously used as an information exchange platform for web-based applications in healthcare, life sciences, and many other domains. Proliferating XML data are now managed through the latest native XML database technologies. XML data sources conforming to common XML schemas can be shared and integrated with syntactic interoperability. Semantic interoperability can be achieved through semantic annotations of data models using common data elements linked to concepts from ontologies. In this paper, we present a framework and software system to support the development of semantically interoperable XML-based data sources that can be shared through a Grid infrastructure. We also present our work on supporting semantically validated XML data through semantic annotations for XML Schema, semantic validation, and semantic authoring of XML data. We demonstrate the use of the system for a biomedical database of medical image annotations and markups. PMID:25298789

  8. Towards sustainability: An interoperability outline for a Regional ARC based infrastructure in the WLCG and EGEE infrastructures

    International Nuclear Information System (INIS)

    Field, L; Gronager, M; Johansson, D; Kleist, J

    2010-01-01

    Interoperability of grid infrastructures is becoming increasingly important with the emergence of large-scale grid infrastructures based on national and regional initiatives. To achieve interoperability of grid infrastructures, adaptations and bridging of many different systems and services need to be tackled. A grid infrastructure offers services for authentication, authorization, accounting, monitoring, and operation, besides the services for handling data and computations. This paper presents an outline of the work done to integrate the Nordic Tier-1 and Tier-2s, which for the compute part are based on the ARC middleware, into the WLCG grid infrastructure co-operated by the EGEE project. In particular, a thorough description of the integration of the compute services is presented.

  9. Enhancing Data Interoperability with Web Services

    Science.gov (United States)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges, including the variety of data types and formats, inconsistency of metadata records, a variety of data service implementations, very large volumes of data, and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform where interoperability between systems can be leveraged to allow greater data discovery, access, visualization, and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement established standards, and integrate with existing solutions across the earth sciences domain instead of creating new technologies. As part of this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types, including image, map, feature, and geoprocessing services, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
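As a sketch of the service style discussed here, a client might hit an ArcGIS REST endpoint for service metadata (the f=pjson convention) or an exported image. The service path below is invented; only the URL conventions follow the ArcGIS REST API:

```python
from urllib.parse import urlencode

# Hypothetical ArcGIS REST image service for precipitation data.
BASE = "https://climate.example.gov/arcgis/rest/services/precip/ImageServer"

# Service metadata as JSON, using the ArcGIS REST f=pjson convention.
metadata_url = BASE + "?" + urlencode({"f": "pjson"})

# Export a rendered image for a bounding box via the exportImage operation.
export_url = BASE + "/exportImage?" + urlencode({
    "bbox": "-125,24,-66,50",   # CONUS extent in lon/lat
    "format": "png",
    "f": "image",
})

print(metadata_url)
print(export_url)
```

Because each service type exposes the same discover-then-request pattern over plain HTTP, a browser client built on the ArcGIS JavaScript API can preview and acquire data without format-specific tooling.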

  10. Achieving control and interoperability through unified model-based systems and software engineering

    Science.gov (United States)

    Rasmussen, Robert; Ingham, Michel; Dvorak, Daniel

    2005-01-01

    Control and interoperation of complex systems is one of the most difficult challenges facing NASA's Exploration Systems Mission Directorate. An integrated but diverse array of vehicles, habitats, and supporting facilities, evolving over the long course of the enterprise, must perform ever more complex tasks while moving steadily away from the sphere of ground support and intervention.

  11. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  12. Model-based prototyping of an interoperability protocol for mobile ad-hoc networks

    NARCIS (Netherlands)

    Kristensen, L.M.; Westergaard, M.; Norgaard, P.C.; Romijn, J.; Smith, G.; Pol, van de J.

    2005-01-01

    We present an industrial project conducted at Ericsson Danmark A/S, Telebit where formal methods in the form of Coloured Petri Nets (CP-nets or CPNs) have been used for the specification of an interoperability protocol for routing packets between fixed core networks and mobile ad-hoc networks. The

  13. Achieving Interoperability Through Base Registries for Governmental Services and Document Management

    Science.gov (United States)

    Charalabidis, Yannis; Lampathaki, Fenareti; Askounis, Dimitris

    As digital infrastructures increase their presence worldwide, following the efforts of governments to provide citizens and businesses with high-quality one-stop services, there is a growing need for the systematic management of those newly defined and constantly transforming processes and electronic documents. E-government Interoperability Frameworks usually cater to the technical standards of e-government systems interconnection, but do not address service composition and use by citizens, businesses, or other administrations.

  14. Architectural Blueprint for Plate Boundary Observatories based on interoperable Data Management Platforms

    Science.gov (United States)

    Kerschke, D. I.; Häner, R.; Schurr, B.; Oncken, O.; Wächter, J.

    2014-12-01

    Interoperable data management platforms play an increasing role in the advancement of knowledge and technology in many scientific disciplines. Through high quality services they support the establishment of efficient and innovative research environments. Well-designed research environments can facilitate the sustainable utilization, exchange, and re-use of scientific data and functionality by using standardized community models. Together with innovative 3D/4D visualization, these concepts provide added value in improving scientific knowledge-gain, even across the boundaries of disciplines. A project benefiting from the added value is the Integrated Plate boundary Observatory in Chile (IPOC). IPOC is a European-South American network to study earthquakes and deformation at the Chilean continental margin and to monitor the plate boundary system for capturing an anticipated great earthquake in a seismic gap. In contrast to conventional observatories that monitor individual signals only, IPOC captures a large range of different processes through various observation methods (e.g., seismographs, GPS, magneto-telluric sensors, creep-meter, accelerometer, InSAR). For IPOC a conceptual design has been devised that comprises an architectural blueprint for a data management platform based on common and standardized data models, protocols, and encodings as well as on an exclusive use of Free and Open Source Software (FOSS) including visualization components. Following the principles of event-driven service-oriented architectures, the design enables novel processes by sharing and re-using functionality and information on the basis of innovative data mining and data fusion technologies. This platform can help to improve the understanding of the physical processes underlying plate deformations as well as the natural hazards induced by them. Through the use of standards, this blueprint can not only be facilitated for other plate observing systems (e.g., the European Plate

  15. Towards Interoperable IoT Deployments in Smart Cities - How project VITAL enables smart, secure and cost-effective cities

    OpenAIRE

    Schiele, Gregor; Soldatos, John; Mitton, Nathalie

    2014-01-01

    International audience; IoT-based deployments in smart cities raise several challenges, especially in terms of interoperability. In this paper, we illustrate semantic interoperability solutions for IoT systems. Based on these solutions, we describe how the FP7 VITAL project aims to bridge numerous silo IoT deployments in smart cities through repurposing and reusing sensors and data streams across multiple applications without carelessly compromising citizens’ security and privacy. This approa...

  16. Future Interoperability of Camp Protection Systems (FICAPS)

    Science.gov (United States)

    Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann

    2013-05-01

    The FICAPS Project was established as a project of the European Defence Agency based on an initiative of Germany and France. The goal of this project was to derive guidelines which, when properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between the camp protection systems and equipment of the different nations involved in multinational missions. These guidelines shall allow for: • real-time information exchange between equipment and systems of different suppliers and nations (even via SatCom); • quick and easy replacement of equipment (even of different nations) at run-time in the field by means of plug-and-play capability, thus lowering operational and logistic costs and making the system highly available; • enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and testing at system level (validation and testing can be done at equipment level). Four scenarios were identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the guideline document, French and German demonstration systems, based on existing national assets, were realized. Demonstrations showed the capabilities enabled by the defined interoperability requirements with respect to the operational scenarios, including remote control of one CPS by another, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control; this capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was demonstrated successfully. Even if the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other

  17. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    Science.gov (United States)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, includes the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6, but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident indicates that testing a candidate data product with one or more software products written to accept the advertised conventions is proposed as a practice which improves interoperability
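The translation path the abstract describes can be sketched using GDAL's netCDF subdataset naming scheme. The file and variable names below are placeholders, not an actual NSIDC product, and the translation call is guarded so the sketch runs even where GDAL or the sample file is absent.

```python
def netcdf_subdataset(path: str, variable: str) -> str:
    """GDAL's identifier for one variable (subdataset) inside a netCDF file."""
    return f'NETCDF:"{path}":{variable}'

src = netcdf_subdataset("sea_ice.nc", "ice_concentration")  # placeholder names

# The actual translation needs the GDAL Python bindings and a real input file;
# it is wrapped in try/except so the sketch stays importable without either.
try:
    from osgeo import gdal
    gdal.UseExceptions()
    # Fails with a CRS error if the CF grid-mapping variables are missing,
    # which is exactly the conformance gap the incident above describes.
    gdal.Translate("sea_ice.tif", src, format="GTiff")
except Exception:
    pass
```

Running the equivalent `gdal_translate` command against a candidate file is a cheap way to apply the "test with consuming software" practice the abstract recommends.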

  18. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
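The data-centric idea, in which topics hold state that subscribers observe rather than messages being addressed point to point, can be illustrated with a minimal in-process sketch. This is not the DDS wire protocol or API; it only caricatures the model (the topic name is made up):

```python
from collections import defaultdict
from typing import Any, Callable

class DataBus:
    """Minimal data-centric bus: topics hold state, subscribers react to it."""

    def __init__(self) -> None:
        self._subs: dict = defaultdict(list)  # topic -> list of callbacks
        self._latest: dict = {}               # topic -> last published sample

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)
        if topic in self._latest:             # late joiners still see current state
            callback(self._latest[topic])

    def publish(self, topic: str, sample: Any) -> None:
        self._latest[topic] = sample          # state outlives the delivery
        for cb in self._subs[topic]:
            cb(sample)

bus = DataBus()
readings: list = []
bus.subscribe("grid/voltage", readings.append)
bus.publish("grid/voltage", 119.8)
bus.publish("grid/voltage", 120.2)
```

Real DDS adds typed topics, QoS policies and network-wide discovery, but the state-keeping shown here (which is what lets dynamically joining nodes catch up without a central broker) is the core of the data-centric model.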

  19. Smart Grid Interoperability Maturity Model

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Levinson, Alex; Mater, J.; Drummond, R.

    2010-04-28

    The integration of automation associated with electricity resources (including transmission and distribution automation and demand-side resources operated by end-users) is key to supporting greater efficiencies and incorporating variable renewable resources and electric vehicles into the power system. The integration problems faced by this community are analogous to those faced in the health industry, emergency services, and other complex communities with many stakeholders. To highlight this issue and encourage communication and the development of a smart grid interoperability community, the GridWise Architecture Council (GWAC) created an Interoperability Context-Setting Framework. This "conceptual model" has been helpful to explain the importance of organizational alignment in addition to technical and informational interface specifications for "smart grid" devices and systems. As a next step to building a community sensitive to interoperability, the GWAC is investigating an interoperability maturity model (IMM) based on work done by others to address similar circumstances. The objective is to create a tool or set of tools that encourages a culture of interoperability in this emerging community. The tools would measure status and progress, analyze gaps, and prioritize efforts to improve the situation.

  20. Innovation in OGC: The Interoperability Program

    Directory of Open Access Journals (Sweden)

    George Percivall

    2015-10-01

    The OGC Interoperability Program is a source of innovation in the development of open standards. The approach to innovation is based on hands-on, collaborative engineering leading to more mature standards and implementations. The process of the Interoperability Program engages a community of sponsors and participants based on an economic model that benefits all involved. Each initiative begins with an innovative approach to identify interoperability needs, followed by agile software development to advance the state of technology to the benefit of society. Over eighty initiatives have been conducted in the Interoperability Program since the breakthrough Web Mapping Testbed began the program in 1999. OGC standards that were initiated in the Interoperability Program are the basis of two thirds of the certified compliant products.

  1. Interoperability and Security Support for Heterogeneous COTS/GOTS/Legacy Component-Based Architecture

    National Research Council Canada - National Science Library

    Tran, Tam

    2000-01-01

    .... This thesis researches existing open standards solutions to the distributed component integration problem and proposes an application framework that supports application wrappers and a uniform...

  2. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  3. Interoperability Strategic Vision

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Knight, Mark R.; Melton, Ronald B.; Narang, David; Martin, Maurice; Nordman, Bruce; Khandekar, Aditya; Hardy, Keith S.

    2018-02-28

    The Interoperability Strategic Vision whitepaper aims to promote a common understanding of the meaning and characteristics of interoperability and to provide a strategy to advance the state of interoperability as applied to integration challenges facing grid modernization. This includes addressing the quality of integrating devices and systems and the discipline to improve the process of successfully integrating these components as business models and information technology improve over time. The strategic vision for interoperability described in this document applies throughout the electric energy generation, delivery, and end-use supply chain. Its scope includes interactive technologies and business processes from bulk energy levels to lower voltage level equipment and the millions of appliances that are becoming equipped with processing power and communication interfaces. A transformational aspect of a vision for interoperability in the future electric system is the coordinated operation of intelligent devices and systems at the edges of grid infrastructure. This challenge offers an example for addressing interoperability concerns throughout the electric system.

  4. Flexible Language Interoperability

    DEFF Research Database (Denmark)

    Ekman, Torbjörn; Mechlenborg, Peter; Schultz, Ulrik Pagh

    2007-01-01

    Virtual machines raise the abstraction level of the execution environment at the cost of restricting the set of supported languages. Moreover, the ability of a language implementation to integrate with other languages hosted on the same virtual machine typically constrains the features...... of the language. In this paper, we present a highly flexible yet efficient approach to hosting multiple programming languages on an object-oriented virtual machine. Our approach is based on extending the interface of each class with language-specific wrapper methods, offering each language a tailored view...... of a given class. This approach can be deployed both on a statically typed virtual machine, such as the JVM, and on a dynamic virtual machine, such as a Smalltalk virtual machine. We have implemented our approach to language interoperability on top of a prototype virtual machine for embedded systems based...

  5. Component-Based Modelling for Scalable Smart City Systems Interoperability: A Case Study on Integrating Energy Demand Response Systems.

    Science.gov (United States)

    Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan

    2016-10-28

    Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support the design and deployment of the integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object systems (rCOS) modelling method, is implemented using the Eclipse Extensible Coordination Tools (ECT), i.e., the Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and cooperation amongst the heterogeneous components of the system, assuring by design the scalability, interoperability and correctness of component cooperation.

  6. Supporting interoperability of collaborative networks through engineering of a service-based Mediation Information System (MISE 2.0)

    Science.gov (United States)

    Benaben, Frederick; Mu, Wenxin; Boissel-Dallier, Nicolas; Barthe-Delanoe, Anne-Marie; Zribi, Sarah; Pingaud, Herve

    2015-08-01

    The Mediation Information System Engineering project is currently finishing its second iteration (MISE 2.0). The main objective of this scientific project is to provide any emerging collaborative situation with methods and tools to deploy a Mediation Information System (MIS). MISE 2.0 aims at defining and designing a service-based platform, dedicated to initiating and supporting the interoperability of collaborative situations among potential partners. This MISE 2.0 platform implements a model-driven engineering approach to the design of a service-oriented MIS dedicated to supporting the collaborative situation. This approach is structured in three layers, each providing its own key innovative points: (i) the gathering of individual and collaborative knowledge to provide appropriate collaborative business behaviour (key point: knowledge management, including semantics, exploitation and capitalisation), (ii) the deployment of a mediation information system able to computerise the previously deduced collaborative processes (key point: the automatic generation of collaborative workflows, including connection with existing devices or services), and (iii) the management of the agility of the obtained collaborative network of organisations (key point: supervision of collaborative situations and relevant exploitation of the gathered data). MISE covers business issues (through BPM), technical issues (through an SOA) and agility issues of collaborative situations (through EDA).

  7. Agent Based Knowledge Management Solution using Ontology, Semantic Web Services and GIS

    Directory of Open Access Journals (Sweden)

    Andreea DIOSTEANU

    2009-01-01

    The purpose of our research is to develop an agent-based knowledge management application framework using a specific type of ontology that is able to facilitate semantic web service search and automatic composition. This solution can later be used to develop complex solutions for location-based services, supply chain management, etc. This application for modeling knowledge highlights the importance of agent interaction that leads to efficient enterprise interoperability. Furthermore, it proposes an "agent communication language" ontology that extends the OWL Lite standard approach and makes it more flexible in retrieving the proper data for identifying the agents that can best communicate and negotiate.

  8. Towards Interoperable Preservation Repositories: TIPR

    Directory of Open Access Journals (Sweden)

    Priscilla Caplan

    2010-07-01

    Towards Interoperable Preservation Repositories (TIPR) is a project funded by the Institute of Museum and Library Services to create and test a Repository eXchange Package (RXP). The package will make it possible to transfer complex digital objects between dissimilar preservation repositories. For reasons of redundancy, succession planning and software migration, repositories must be able to exchange copies of archival information packages with each other. Every repository application, however, describes and structures its archival packages differently. Therefore each system produces dissemination packages that are rarely understandable or usable as submission packages by other repositories. The RXP is an answer to that mismatch. Other solutions for transferring packages between repositories focus either on transfers between repositories of the same type, such as DSpace-to-DSpace transfers, or on processes that rely on central translation services. Rather than build translators between many dissimilar repository types, the TIPR project has defined a standards-based package of metadata files that can act as an intermediary information package, the RXP, a lingua franca all repositories can read and write.

  9. Intercloud Architecture Framework for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.

    2013-01-01

    This report presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses interoperability and integration issues in multi-provider multi-domain heterogeneous Cloud based infrastructure services and applications provisioning, including integration and

  10. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    Science.gov (United States)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.
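The Datalog-to-N3 direction of the interoperation described above can be caricatured in one function: a binary Prolog-style fact such as `likes(alice, swimming)` becomes an N3 triple. This is a toy sketch under stated assumptions (binary predicates only, a single made-up prefix, no RuleML/XML intermediary), not the Rule Responder translation pipeline itself:

```python
import re

def datalog_fact_to_n3(fact: str, prefix: str = ":") -> str:
    """Rewrite a binary Datalog fact pred(subj, obj) as an N3 triple."""
    m = re.fullmatch(r"\s*(\w+)\(\s*(\w+)\s*,\s*(\w+)\s*\)\s*\.?\s*", fact)
    if m is None:
        raise ValueError(f"not a binary fact: {fact!r}")
    pred, subj, obj = m.groups()
    # N3 places the predicate between subject and object.
    return f"{prefix}{subj} {prefix}{pred} {prefix}{obj} ."

triple = datalog_fact_to_n3("likes(alice, swimming).")
```

The real systems additionally handle rules (conditionals), variables, and namespace declarations, which is where the relational-versus-frame paradigm mismatch the abstract mentions actually bites.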

  11. Design challenges and gaps in standards in developing an interoperable zero footprint DI thin client for use in image-enabled electronic health record solutions

    Science.gov (United States)

    Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane

    2015-03-01

    The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet, leverage any point-of-service device (desktop, tablet, phone) and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most of the currently available web-based DI viewing solutions are either proprietary or require special plugins. We developed a true zero-footprint, browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards to a) demonstrate that a truly ubiquitous image viewer can be deployed, and b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer that works on all modern browsers on both desktop and mobile devices. The implementation allows the basic viewing functionalities of scroll, zoom, pan and (limited) window leveling. The major gap identified in the current DICOM WADO standards is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimization of the solution for response time and a low memory footprint.
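A WADO-URI retrieval of the kind such a zero-footprint viewer issues is just an HTTP GET with well-known query parameters. The sketch below uses the standard WADO-URI parameter names, but the server URL and the UIDs are made-up placeholders:

```python
from urllib.parse import urlencode

WADO_ENDPOINT = "https://pacs.example.org/wado"  # hypothetical server

def wado_uri(study_uid: str, series_uid: str, object_uid: str,
             content_type: str = "image/jpeg") -> str:
    """Build a WADO-URI GET request for a single DICOM object."""
    params = {
        "requestType": "WADO",        # fixed value per the WADO-URI standard
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,  # rendered image; application/dicom for raw
    }
    return f"{WADO_ENDPOINT}?{urlencode(params)}"

url = wado_uri("1.2.3", "1.2.3.4", "1.2.3.4.5")  # placeholder UIDs
```

Because the response is an ordinary image over HTTP, any browser `<img>` element can display it with no plugin, which is what makes the zero-footprint approach possible.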

  12. Development of an electronic claim system based on an integrated electronic health record platform to guarantee interoperability.

    Science.gov (United States)

    Kim, Hwa Sun; Cho, Hune; Lee, In Keun

    2011-06-01

    We design and develop an electronic claim system based on an integrated electronic health record (EHR) platform. This system is designed to be used for ambulatory care by office-based physicians in the United States. This is achieved by integrating various medical standard technologies for interoperability between heterogeneous information systems. The developed system serves as a simple clinical data repository; it automatically fills out the Centers for Medicare and Medicaid Services (CMS)-1500 form based on information regarding the patients and physicians' clinical activities. It supports electronic insurance claims by creating reimbursement charges. It also contains an HL7 interface engine to exchange clinical messages between heterogeneous devices. The system partially prevents physician malpractice by suggesting proper treatments according to patient diagnoses, and supports physicians by easily preparing documents for reimbursement and submitting claim documents to insurance organizations electronically, without additional effort by the user. To show the usability of the developed system, we performed an experiment comparing the time spent filling out the CMS-1500 form directly with the time required to create electronic claim data using the developed system. From the experimental results, we conclude that the system could save physicians considerable time in preparing claim documents. The developed system might be particularly useful for those who need a reimbursement-specialized EHR system, even though the proposed system does not completely satisfy all criteria requested by the CMS and the Office of the National Coordinator for Health Information Technology (ONC); the criteria are necessary but not sufficient conditions for the implementation of EHR systems. The system will be upgraded continuously to implement the criteria and to offer more stable and transparent transmission of electronic claim data.
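The auto-fill step the abstract describes amounts to mapping clinical-activity records onto claim-form fields. The toy sketch below illustrates that mapping; the field names are illustrative placeholders, not the official CMS-1500 box numbering, and the input record layout is assumed:

```python
def fill_claim(patient: dict, encounter: dict) -> dict:
    """Map patient and encounter data onto a flat claim-form dict.

    Field names are illustrative placeholders, not official CMS-1500 boxes.
    """
    return {
        "patient_name": f'{patient["last"]}, {patient["first"]}',
        "patient_dob": patient["dob"],
        "diagnosis_codes": encounter["icd_codes"],   # diagnoses from the visit
        "procedure_codes": encounter["cpt_codes"],   # billable procedures
        "total_charge": sum(encounter["charges"]),   # reimbursement amount
    }

claim = fill_claim(
    {"last": "Doe", "first": "Jane", "dob": "1980-01-01"},
    {"icd_codes": ["J06.9"], "cpt_codes": ["99213"], "charges": [125.00]},
)
```

In the real system this mapping is driven by the repository's structured records rather than ad hoc dictionaries, which is what removes the manual transcription step the experiment measures.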

  13. Interoperability of medical device information and the clinical applications: an HL7 RMIM based on the ISO/IEEE 11073 DIM.

    Science.gov (United States)

    Yuksel, Mustafa; Dogac, Asuman

    2011-07-01

    Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that can be consumable by clinical applications has already been recognized by the industry. Yet, the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications such as the electronic health record and the personal health record systems, the clinical workflows, and the clinical decision support systems each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and, thus, limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and the medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM from which all the other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
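Once device data are held in a common model, emitting an XML-based interface is mechanical. The toy serializer below shows the idea; the element and attribute names are illustrative only and do not follow the actual HL7 v3 or IEEE 11073 schemas, though the observation code imitates the 11073 nomenclature style:

```python
import xml.etree.ElementTree as ET

def observation_xml(code: str, value: float, unit: str) -> str:
    """Serialize one device observation to a minimal XML fragment.

    Tag and attribute names are illustrative, not a real HL7 v3 schema.
    """
    obs = ET.Element("observation")
    ET.SubElement(obs, "code", {"code": code})
    ET.SubElement(obs, "value", {"value": str(value), "unit": unit})
    return ET.tostring(obs, encoding="unicode")

# Code string imitates the IEEE 11073 nomenclature style (illustrative).
frag = observation_xml("MDC_PULS_OXIM_SAT_O2", 97.0, "%")
```

In practice the paper's approach applies XSLT-style transformations between full RMIM-conformant documents, but each step is this same model-to-markup serialization.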

  14. Special Topic Interoperability and EHR: Combining openEHR, SNOMED, IHE, and Continua as approaches to interoperability on national eHealth.

    Science.gov (United States)

    Beštek, Mate; Stanimirović, Dalibor

    2017-08-09

    The main aims of the paper are to characterize and examine potential approaches to interoperability. These include openEHR, SNOMED, IHE, and Continua as combined interoperability approaches, the possibilities for their incorporation into the eHealth environment, and the identification of the main success factors in the field which are necessary for achieving the required interoperability and, consequently, for the successful implementation of eHealth projects in general. The paper represents an in-depth analysis of the potential application of the openEHR, SNOMED, IHE and Continua approaches in the development and implementation of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research and charting of existing experience in the field, and on sources, both electronic and written, that cover interoperability concepts and related implementation issues. The paper tries to answer the following complementary inquiries: 1. Scrutiny of the potential approaches which could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analysis of the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting of the main success factors in the interoperability field that critically influence the efficient development and implementation of eHealth projects. The provided insights and identified success factors could serve as strategic starting points for the continuous integration of interoperability principles into the healthcare domain. Moreover, the general implementation of the identified success factors could facilitate better penetration of ICT into the healthcare environment and enable the eHealth-based transformation of the health system, especially in the countries

  15. Maturity Model for Advancing Smart Grid Interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Mark; Widergren, Steven E.; Mater, J.; Montgomery, Austin

    2013-10-28

    Interoperability is about the properties of devices and systems to connect and work properly. Advancing interoperability eases integration and maintenance of the resulting interconnection. This leads to faster integration, lower labor and component costs, predictability of projects and the resulting performance, and evolutionary paths for upgrade. When specifications are shared and standardized, competition and novel solutions can bring new value streams to the community of stakeholders involved. Advancing interoperability involves reaching agreement for how things join at their interfaces. The quality of the agreements and the alignment of parties involved in the agreement present challenges that are best met with process improvement techniques. The GridWise® Architecture Council (GWAC) sponsored by the United States Department of Energy is supporting an effort to use concepts from capability maturity models used in the software industry to advance interoperability of smart grid technology. An interoperability maturity model has been drafted and experience is being gained through trials on various types of projects and community efforts. This paper describes the value and objectives of maturity models, the nature of the interoperability maturity model and how it compares with other maturity models, and experiences gained with its use.

  16. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have been focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.

  17. An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics

    Science.gov (United States)

    Abedtash, Hamed

    2017-01-01

    Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…

  18. Telefacturing Based Distributed Manufacturing Environment for Optimal Manufacturing Service by Enhancing the Interoperability in the Hubs

    Directory of Open Access Journals (Sweden)

    V. K. Manupati

    2017-01-01

    Full Text Available Recent developments in the manufacturing sector are driving intense progress towards effective distributed collaborative manufacturing environments. This evolving collaborative manufacturing not only focuses on digitalisation of the environment but also necessitates a service-dependent manufacturing system that offers uninterrupted access to a number of diverse, complicated, dynamic manufacturing operations management systems at a common workplace (hub). This research presents a novel telefacturing-based distributed manufacturing environment for recommending manufacturing services based on user preferences. The first step in this direction is to deploy advanced tools and techniques, namely, the ontology-based Protégé 5.0 software for transforming the large body of stored knowledge/information into Web Ontology Language (OWL) documents with XML schemas, and Integration of Process Planning and Scheduling (IPPS) for multiple jobs in a collaborative manufacturing system. Thereafter, we also investigate the possibilities of allocating skilled workers to the best feasible operation sequence. In this context, a mathematical model is formulated for the considered objectives, that is, minimization of makespan and of the total training cost of the workers. With an evolutionary algorithm and a developed heuristic algorithm, the performance of the proposed manufacturing system has been improved. Finally, to demonstrate the capability of the proposed approach, an illustrative example from a real-time manufacturing industry is validated for optimal service recommendation.
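    The makespan objective in the model above can be made concrete with a minimal greedy list-scheduling sketch. This toy stands in for, and is not, the paper's evolutionary/heuristic approach; the job times and machine count below are invented:

```python
import heapq

# Toy greedy list-scheduling for makespan minimization (illustrative only --
# the paper uses an evolutionary algorithm plus a purpose-built heuristic;
# this is the classic "longest processing time first" approximation).

def greedy_makespan(job_times, n_machines):
    """Assign each job to the machine that frees up first; return the makespan."""
    finish = [0.0] * n_machines            # current finish time per machine
    heapq.heapify(finish)
    for t in sorted(job_times, reverse=True):
        earliest = heapq.heappop(finish)   # machine that becomes free first
        heapq.heappush(finish, earliest + t)
    return max(finish)

jobs = [3, 5, 2, 7, 4, 6]                  # hypothetical processing times
print(greedy_makespan(jobs, n_machines=3))   # → 9.0
```

    Exact optimization of makespan plus training cost, as in the paper's model, is NP-hard in general, which is why heuristic and evolutionary methods are used.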

  19. Semantic Interoperable Electronic Patient Records: The Unfolding of Consensus based Archetypes.

    Science.gov (United States)

    Pedersen, Rune; Wynn, Rolf; Ellingsen, Gunnar

    2015-01-01

    This paper is a status report from a large-scale openEHR-based EPR project of the North Norway Regional Health Authority, encouraged by the unfolding of a national repository for openEHR archetypes. Clinicians need to engage in, and be responsible for, the production of archetypes. The consensus processes have so far been challenged by a low number of active clinicians, a lack of the critical specialties needed to reach consensus, and a cumbersome review process (3 or 4 review rounds) for each archetype. The goal is to have several clinicians from each specialty as a backup if one is prevented from participating. Archetypes and their importance for structured data and the sharing of information have to become more visible to clinicians through a sharper information practice.

  20. Interoperability of CAD Standards and Robotics in CIME

    DEFF Research Database (Denmark)

    Sørensen, Torben

    The research presented in this dissertation concerns the identification of problems and the provision of solutions for increasing the degree of interoperability between CAD, CACSD (Computer Aided Control Systems Design) and CAR (Computer Aided Robotics) in Computer Integrated Manufacturing and Engineering. This includes the development of a STEP-based interface for general control system data and functions, especially related to robot motion control, for interoperability of CAD, CACSD, and CAR systems and for the extension of inter-system communication capabilities beyond the stage achieved up to now. This interface development comprehends the following work: the definition of the concepts of 'information' and 'information model', and the selection of a proper information modeling methodology within the STEP methodologies; the elaboration of a general function model of a generic robot motion controller in IDEF0 for interface...

  1. Achieving mask order processing automation, interoperability and standardization based on P10

    Science.gov (United States)

    Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.

    2007-02-01

    Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; this paper reports on its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company-specific and error-prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on a standard for mandatory and optional parameters, and also on a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of individual factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We see that the SEMI P10 standard needs enhancements. A format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.
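    The kind of XML-based mask-order description and data validation discussed above can be sketched as follows. The element and attribute names are invented for illustration; the actual parameter set and semantics are defined by the SEMI P10 standard, not by this fragment:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML encoding of a mask-order fragment in the spirit of SEMI P10
# (element names are illustrative assumptions, not the real P10 schema).

order_xml = """
<MaskOrder version="0.1">
  <Customer>ChipCo</Customer>
  <Layer name="METAL1">
    <GridSize unit="nm">5</GridSize>
    <CriticalDimension unit="nm">90</CriticalDimension>
  </Layer>
</MaskOrder>
"""

root = ET.fromstring(order_xml)

def validate(root):
    """Data validation of the kind a P10-based management system might run:
    every element carrying a unit must hold a positive numeric value."""
    errors = []
    for elem in root.iter():
        if "unit" in elem.attrib:
            try:
                if float(elem.text) <= 0:
                    errors.append(f"{elem.tag}: non-positive value")
            except (TypeError, ValueError):
                errors.append(f"{elem.tag}: missing numeric value")
    return errors

print(validate(root))   # → [] (the fragment above passes)
```

    Because the order is plain XML, the same document can be machine-validated by the mask house and the customer alike, which is the interoperability benefit the project is after.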

  2. Interoperable computerized smart card based system for health insurance and health services applied in cardiology.

    Science.gov (United States)

    Cocei, Horia-Delatebea; Stefan, Livia; Dobre, Ioana; Croitoriu, Mihai; Sinescu, Crina; Ovricenco, Eduard

    2002-01-01

    In 1999 Romania started its health care reform by promulgating the Health Insurance Law. A functional and efficient health care system needs procedures for monitoring and evaluating medical services, communication between the different service providers and entities involved in the system, and integration and availability of information. The final goal is a good response to the needs and demands of patients and of real life. For this project we took into account, on the one hand, the immediate need for computerized systems for health care providers and, on the other hand, the large number of trials and experiments with health smart cards across Europe. Our project will implement a management system based on electronic patient records to be used in all cardiology clinics, will pilot health smart cards, and will promote and demonstrate the capabilities of smart card technology. We focused our attention on a specific and critical category of patients, those with heart diseases, and on a critical sector of the health care system: emergency care. The patient card was tested on 150 patients at a cardiology clinic in Bucharest. This was the first trial of a health smart card in Romania.

  3. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    Science.gov (United States)

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard, the ISO/IEEE 11073 Personal Health Device (X73-PHD) standards, addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed the security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and the use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirements. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time taken by the direct implementation. Moreover, analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done: the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against those attacks.
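    The RSA sign-and-verify flow underlying such device authentication can be sketched with textbook RSA. This is a toy with tiny demo primes so the arithmetic is visible; it is emphatically not the paper's implementation, and real systems should use a vetted library with a proper padding scheme (e.g. RSASSA-PSS) and 2048-bit or larger keys:

```python
import hashlib

# Textbook RSA signature demo (illustrative only; insecure key size, no padding).
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

def sign(message: bytes) -> int:
    # Hash the message, reduce into the key's range, apply the private key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Recover the hash with the public key and compare.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

msg = b"blood pressure: 120/80"
sig = sign(msg)
print(verify(msg, sig))              # True
print(verify(msg, (sig + 1) % n))    # False: any altered signature fails
```

    Because exponentiation with e is a bijection on the residues modulo n (here gcd(e, p-1) = gcd(e, q-1) = 1), a tampered signature can never verify, which is the property the device-authentication policy relies on.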

  4. Ocean Data Interoperability Platform (ODIP): developing a common framework for global marine data management

    Science.gov (United States)

    Glaves, H. M.

    2015-12-01

    In recent years marine research has become increasingly multidisciplinary in its approach, with a corresponding rise in the demand for large quantities of high-quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by a number of regional initiatives, with projects such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Integrated Marine Observing System (IMOS) in Australia having implemented local infrastructures to facilitate the exchange of standardised marine datasets. However, each of these systems has been developed to address local requirements and was created in isolation from those in other regions. Multidisciplinary marine research on a global scale necessitates a common framework for marine data management which is based on existing data systems. The Ocean Data Interoperability Platform project is seeking to address this requirement by bringing together selected regional marine e-infrastructures for the purpose of developing interoperability across them. By identifying the areas of commonality and incompatibility between these data infrastructures, and leveraging the development activities and expertise of the individual systems, three prototype interoperability solutions are being created which demonstrate the effective sharing of marine data and associated metadata across the participating regional data infrastructures as well as with other target international systems such as GEO, COPERNICUS, etc. These interoperability solutions, combined with agreed best practice and approved standards, form the basis of a common global approach to marine data management which can be adopted by the wider marine research community. To encourage implementation of these interoperability solutions by other regional marine data infrastructures, an impact assessment is being conducted to determine both the technical and financial implications of deploying them.

  5. Towards technical interoperability in telemedicine.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    For telemedicine to realize the vision of anywhere, anytime access to care, the question of how to create a fully interoperable technical infrastructure must be addressed. After briefly discussing how 'technical interoperability' compares with other types of interoperability being addressed in the telemedicine community today, this paper describes reasons for pursuing technical interoperability, presents a proposed framework for realizing technical interoperability, identifies key issues that will need to be addressed if technical interoperability is to be achieved, and suggests a course of action that the telemedicine community might follow to accomplish this goal.

  6. Procrustes-based geometric morphometrics on MRI images: An example of inter-operator bias in 3D landmarks and its impact on big datasets.

    Science.gov (United States)

    Daboul, Amro; Ivanovska, Tatyana; Bülow, Robin; Biffar, Reiner; Cardini, Andrea

    2018-01-01

    Using 3D anatomical landmarks from adult human head MRIs, we assessed the magnitude of inter-operator differences in Procrustes-based geometric morphometric analyses. An in-depth analysis of both absolute and relative error was performed in a subsample of individuals with replicated digitization by three different operators. The effect of inter-operator differences was also explored in a large sample of more than 900 individuals. Although absolute error was not unusual for MRI measurements, including bone landmarks, shape was particularly affected by differences among operators, with up to more than 30% of sample variation accounted for by this type of error. The magnitude of the bias was such that it dominated the main pattern of bone and total (all landmarks included) shape variation, largely surpassing the effect of sex differences between hundreds of men and women. In contrast, however, we found higher reproducibility in soft-tissue nasal landmarks, despite relatively larger errors in estimates of nasal size. Our study exemplifies the assessment of measurement error using geometric morphometrics on landmarks from MRIs and stresses the importance of relating it to total sample variance within the specific methodological framework being used. In summary, precise landmarks may not necessarily imply negligible errors, especially in shape data; indeed, size and shape may be differentially impacted by measurement error, and different types of landmarks may have relatively larger or smaller errors. Importantly, and consistently with other recent studies using geometric morphometrics on digital images (which, however, were not specific to MRI data), this study showed that inter-operator biases can be a major source of error in the analysis of large samples, such as those becoming increasingly common in the 'era of big data'.
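    The Procrustes superimposition at the core of such analyses removes position, size, and orientation so that only shape differences (including operator error) remain. A minimal 2D ordinary-Procrustes sketch follows; the study itself uses 3D landmarks and generalized Procrustes analysis, and the landmark coordinates below are invented:

```python
import math

# Minimal 2D ordinary Procrustes fit: translate both configurations to their
# centroids, scale to unit centroid size, then rotate the target onto the
# reference by the analytically optimal angle.

def centroid(pts):
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def superimpose(ref, target):
    """Return `target` translated, scaled, and rotated onto `ref`."""
    (cx, cy), (tx, ty) = centroid(ref), centroid(target)
    a = [(x - cx, y - cy) for x, y in ref]
    b = [(x - tx, y - ty) for x, y in target]
    # Centroid size = sqrt of summed squared distances from the centroid
    sa = math.sqrt(sum(x * x + y * y for x, y in a))
    sb = math.sqrt(sum(x * x + y * y for x, y in b))
    a = [(x / sa, y / sa) for x, y in a]
    b = [(x / sb, y / sb) for x, y in b]
    # Optimal rotation angle from paired cross- and dot-products
    num = sum(ay * bx - ax * by for (ax, ay), (bx, by) in zip(a, b))
    den = sum(ax * bx + ay * by for (ax, ay), (bx, by) in zip(a, b))
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for x, y in b]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
moved = [(5, 3), (5, 5), (3, 5), (3, 3)]   # same square: rotated, scaled, shifted
fit = superimpose(square, moved)           # coincides with the centred, unit-size square
```

    After superimposition, any residual differences between operators' digitizations of the same head are pure shape error, which is what the study quantifies against total sample variance.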

  7. FLTSATCOM interoperability applications

    Science.gov (United States)

    Woolford, Lynn

    A mobile Fleet Satellite Communications (FLTSATCOM) system called the Mobile Operational Control Center (MOCC) was developed which has demonstrated the ability to interoperate with many of the current FLTSATCOM command and control channels. This low-cost system is secure in all its communications, is lightweight, and provides a gateway for other communications formats. The major elements of this system are a personal computer, a protocol microprocessor, and off-the-shelf mobile communication components. It is concluded that, with both FLTSATCOM channel protocol and data format interoperability, the MOCC has the ability to provide vital information in or near real time, which significantly improves mission effectiveness.

  8. A Theory of Interoperability Failures

    National Research Council Canada - National Science Library

    McBeth, Michael S

    2003-01-01

    This paper develops a theory of interoperability failures. Interoperability in this paper refers to the exchange of information and the use of information, once exchanged, between two or more systems...

  9. Secure and interoperable communication infrastructures for PPDR organisations

    Science.gov (United States)

    Müller, Wilmuth; Marques, Hugo; Pereira, Luis; Rodriguez, Jonathan; Brouwer, Frank; Bouwers, Bert; Politis, Ilias; Lykourgiotis, Asimakis; Ladas, Alexandros; Adigun, Olayinka; Jelenc, David

    2016-05-01

    The growing number of events affecting public safety and security (PS&S) on a regional scale, with the potential to escalate into large-scale cross-border disasters, puts increased pressure on the agencies and organisations responsible for PS&S. In order to respond to such events in a timely and adequate manner, Public Protection and Disaster Relief (PPDR) organisations need to cooperate, align their procedures and activities, share the needed information and be interoperable. Existing PPDR/PMR technologies such as TETRA, TETRAPOL or P25 do not currently provide broadband capability, nor are such technologies expected to be upgraded in the future. This presents a major limitation in supporting new services and information flows. Furthermore, there is no known standard that addresses the interoperability of these technologies. In this contribution, the design of a next-generation communication infrastructure for PPDR organisations which fulfills the requirements of secure and seamless end-to-end communication and interoperable information exchange within the deployed communication networks is presented. Based on the Enterprise Architecture of PPDR organisations, a next-generation PPDR network that is backward compatible with legacy communication technologies is designed and implemented, capable of providing security, privacy, seamless mobility, QoS and reliability support for mission-critical Private Mobile Radio (PMR) voice and broadband data services. The designed solution provides a robust, reliable, and secure mobile broadband communications system for a wide variety of PMR applications and services on PPDR broadband networks, including the ability to perform inter-system, interagency and cross-border operations, with emphasis on interoperability between users in PMR and LTE.

  10. Interoperable End-to-End Remote Patient Monitoring Platform Based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2018-05-01

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for personal health devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory, and power that use short-range wireless technology. It explains aspects of IEEE 11073, including the domain information model, state model, and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that include IHE PCD-01, HL7, and BlueTooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living in future platforms. The paper further describes the adaptions that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.
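    The domain information model (DIM) mentioned above is what lets a manager consume observations from any conforming agent: devices are described as a Medical Device System containing metric objects identified by nomenclature codes. The sketch below is a loose illustration of that idea only; the class names are not the standard's ASN.1 definitions, and the numeric codes are placeholders, not real IEEE 11073 nomenclature values:

```python
from dataclasses import dataclass, field
from typing import List

# Simplified sketch of the IEEE 11073 DIM idea: an agent exposes an MDS
# containing metrics identified by nomenclature codes (codes here are
# placeholders invented for illustration).

@dataclass
class Metric:
    nomenclature_code: int   # placeholder code, e.g. standing in for "systolic"
    unit_code: int           # placeholder code for the unit (e.g. mmHg)
    value: float

@dataclass
class MDS:
    system_id: str
    metrics: List[Metric] = field(default_factory=list)

# A hypothetical blood-pressure agent reporting one measurement
agent = MDS(system_id="BP-AGENT-01", metrics=[
    Metric(nomenclature_code=1001, unit_code=2001, value=120.0),  # systolic
    Metric(nomenclature_code=1002, unit_code=2001, value=80.0),   # diastolic
])

# A manager consumes the observation via codes alone, not device specifics;
# this decoupling is what underpins plug-and-play semantic interoperability.
report = {m.nomenclature_code: m.value for m in agent.metrics}
print(report)   # → {1001: 120.0, 1002: 80.0}
```

    In the real standard the agent additionally announces its configuration during association, so a manager that has never seen the device before can still interpret its reports.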

  11. Recent ARC developments: Through modularity to interoperability

    International Nuclear Information System (INIS)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J; Dobe, P; Joenemo, J; Konya, B; Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A; Kocan, M; Marton, I; Nagy, Zs; Moeller, S; Mohn, B

    2010-01-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  12. Recent ARC developments: Through modularity to interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Smirnova, O; Cameron, D; Ellert, M; Groenager, M; Johansson, D; Kleist, J [NDGF, Kastruplundsgade 22, DK-2770 Kastrup (Denmark); Dobe, P; Joenemo, J; Konya, B [Lund University, Experimental High Energy Physics, Institute of Physics, Box 118, SE-22100 Lund (Sweden); Fraagaat, T; Konstantinov, A; Nilsen, J K; Saada, F Ould; Qiang, W; Read, A [University of Oslo, Department of Physics, P. O. Box 1048, Blindern, N-0316 Oslo (Norway); Kocan, M [Pavol Jozef Safarik University, Faculty of Science, Jesenna 5, SK-04000 Kosice (Slovakia); Marton, I; Nagy, Zs [NIIF/HUNGARNET, Victor Hugo 18-22, H-1132 Budapest (Hungary); Moeller, S [University of Luebeck, Inst. Of Neuro- and Bioinformatics, Ratzeburger Allee 160, D-23538 Luebeck (Germany); Mohn, B, E-mail: oxana.smirnova@hep.lu.s [Uppsala University, Department of Physics and Astronomy, Div. of Nuclear and Particle Physics, Box 535, SE-75121 Uppsala (Sweden)

    2010-04-01

    The Advanced Resource Connector (ARC) middleware introduced by NorduGrid is one of the basic Grid solutions used by scientists worldwide. While being well-proven in daily use by a wide variety of scientific applications at large-scale infrastructures like the Nordic DataGrid Facility (NDGF) and smaller scale projects, production ARC of today is still largely based on conventional Grid technologies and custom interfaces introduced a decade ago. In order to guarantee sustainability, true cross-system portability and standards-compliance based interoperability, the ARC community undertakes a massive effort of implementing modular Web Service (WS) approach into the middleware. With support from the EU KnowARC project, new components were introduced and the existing key ARC services got extended with WS technology based standard-compliant interfaces following a service-oriented architecture. Such components include the hosting environment framework, the resource-coupled execution service, the re-engineered client library, the self-healing storage solution and the peer-to-peer information system, to name a few. Gradual introduction of these new services and client tools into the production middleware releases is carried out together with NDGF and thus ensures a smooth transition to the next generation Grid middleware. Standard interfaces and modularity of the new component design are essential for ARC contributions to the planned Universal Middleware Distribution of the European Grid Initiative.

  13. Grid Interoperation with ARC middleware for the CMS experiment

    International Nuclear Information System (INIS)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva; Field, Laurence; Qing, Di; Frey, Jaime; Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  14. Grid Interoperation with ARC middleware for the CMS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Edelmann, Erik; Groenager, Michael; Johansson, Daniel; Kleist, Josva [Nordic DataGrid Facility, Kastruplundgade 22, 1., DK-2770 Kastrup (Denmark); Field, Laurence; Qing, Di [CERN, CH-1211 Geneve 23 (Switzerland); Frey, Jaime [University of Wisconsin-Madison, 1210 W. Dayton St., Madison, WI (United States); Happonen, Kalle; Klem, Jukka; Koivumaeki, Jesper; Linden, Tomas; Pirinen, Antti, E-mail: Jukka.Klem@cern.c [Helsinki Institute of Physics, PO Box 64, FIN-00014 University of Helsinki (Finland)

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  15. Grid Interoperation with ARC Middleware for the CMS Experiment

    CERN Document Server

    Edelmann, Erik; Frey, Jaime; Gronager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumaki, Jesper; Linden, Tomas; Pirinen, Antti; Qing, Di

    2010-01-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developi...

  16. Defining Inter-Cloud Architecture for Interoperability and Integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Inter-Cloud Architecture that should address problems in multi-provider multi-domain heterogeneous Cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure services. Cloud

  17. Defining inter-cloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Ngo, C.; Makkes, M.X.; Strijkers, R.; de Laat, C.; Zimmermann, W.; Lee, Y.W.; Demchenko, Y.

    2012-01-01

    This paper presents an on-going research to develop the Inter-Cloud Architecture, which addresses the architectural problems in multi-provider multi-domain heterogeneous cloud based applications integration and interoperability, including integration and interoperability with legacy infrastructure

  18. Interoperability does matter

    Directory of Open Access Journals (Sweden)

    Manfred Goepel

    2006-04-01

    In companies, the historically developed IT systems are mostly application islands. They produce good results as long as the system's requirements and surroundings do not change and no interface to other systems is needed. With the ever-increasing dynamics and globalization of the market, however, these IT islands are bound to collapse. Interoperability (IO) is the order of the day, presupposing the integration of users, data, applications and processes. In the following, important IO enablers such as ETL, EAI, and SOA are examined with regard to their practicability. It is shown that SOA in particular produces a surge of interoperability that could rightly be referred to as an IT evolution.

  19. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  20. Scientific Digital Libraries, Interoperability, and Ontologies

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris A.

    2009-01-01

    Scientific digital libraries serve complex and evolving research communities. Justifications for the development of scientific digital libraries include the desire to preserve science data and the promises of information interconnectedness, correlative science, and system interoperability. Shared ontologies are fundamental to fulfilling these promises. We present a tool framework, some informal principles, and several case studies where shared ontologies are used to guide the implementation of scientific digital libraries. The tool framework, based on an ontology modeling tool, was configured to develop, manage, and keep shared ontologies relevant within changing domains and to promote the interoperability, interconnectedness, and correlation desired by scientists.
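
    A minimal sketch of the shared-ontology idea described above: a registry of required attributes per class, used to keep catalogued records consistent across repositories. The class names and attributes are invented for illustration, not taken from the cited tool framework.

```python
# Hypothetical ontology registry: required attributes per class.
ONTOLOGY = {
    "Dataset": {"identifier", "title", "instrument"},
    "Instrument": {"identifier", "name"},
}

def validate(record: dict, cls: str) -> list:
    """Return the attributes the ontology requires but the record lacks."""
    required = ONTOLOGY.get(cls, set())
    return sorted(required - record.keys())

record = {"identifier": "urn:example:dataset-001", "title": "Mars spectra"}
print(validate(record, "Dataset"))  # ['instrument']
```

    Checks of this kind are how a shared ontology can guide implementation: records from different repositories become interoperable because they are validated against one common model.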

  1. The DFG Viewer for Interoperability in Germany

    Directory of Open Access Journals (Sweden)

    Ralf Goebel

    2010-02-01

    This article deals with the DFG Viewer for Interoperability, a free and open-source web-based viewer for digitised books, and assesses its relevance for interoperability in Germany. First the specific situation in Germany is described, including the important role of the Deutsche Forschungsgemeinschaft (German Research Foundation). The article then moves on to the overall concept of the viewer and its technical background. It introduces the data formats and standards used, briefly illustrates how the viewer works, and includes a few examples.

  2. Architectures for the Development of the National Interoperability Framework in Romania

    Directory of Open Access Journals (Sweden)

    Codrin-Florentin NISIOIU

    2015-10-01

    The authors of the Digital Agenda consider that Europe does not take full advantage of interoperability. They believe that effective interoperability between IT products and services is needed to build a truly digital society. The Digital Agenda can only be effective if all its elements and applications are interoperable and based on open standards and platforms. In this context, this article proposes a specific architecture for developing the Romanian National Interoperability Framework.

  3. Intercloud architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.; de Laat, C.

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  4. Intercloud Architecture for interoperability and integration

    NARCIS (Netherlands)

    Demchenko, Y.; Makkes, M.X.; Strijkers, R.J.; Laat, C. de

    2012-01-01

    This paper presents on-going research to develop the Intercloud Architecture Framework (ICAF) that addresses problems in multi-provider multi-domain heterogeneous cloud based infrastructure services and applications integration and interoperability. The paper refers to existing standards in Cloud

  5. BIM Interoperability Limitations: Australian and Malaysian Rail Projects

    Directory of Open Access Journals (Sweden)

    Kenley Russell

    2016-01-01

    Building information modelling (BIM) is defined as a process involving the generation and management of digital representations of the physical and functional characteristics of a facility. The purpose of interoperability in integrated or "open" BIM is to facilitate information exchange between different digital systems, models and tools. There has been effort towards data interoperability with the development of open-source standards and object-oriented models, such as Industry Foundation Classes (IFC) for vertical infrastructure. However, the lack of open data standards for information exchange for horizontal infrastructure limits the adoption and effectiveness of integrated BIM. The paper outlines two interoperability issues for the construction of rail infrastructure. The issues are presented in two case study reports, one from Australia and one from Malaysia. Each case study includes: a description of the project, the application of BIM in the project, and a discussion of the promised BIM interoperability solution, plus the identification of the unresolved lack of interoperability for horizontal infrastructure project management. The Moreton Bay Rail project in Australia introduces general software interoperability issues. The Light Rail Extension project in Kuala Lumpur outlines an example of the integration problems related to two different location data structures. The paper highlights how the continuing lack of data interoperability limits utilisation of integrated BIM for horizontal infrastructure rail projects.
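
    The vertical-infrastructure exchange format mentioned above, IFC, is a plain-text STEP encoding, which makes simple interoperability checks easy to script. A sketch of tallying entity types in an export (the sample lines are illustrative, not from the cited rail projects):

```python
import re
from collections import Counter

# A few lines in the style of an IFC (STEP physical file) export.
sample = """#1=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',#2,$,$,$,#5,#10,$);
#2=IFCOWNERHISTORY(#3,#4,$,.ADDED.,$,$,$,1217620436);
#5=IFCLOCALPLACEMENT($,#6);
#7=IFCWALL('1hOSvn6df7F8_7GcBWlRGQ',#2,$,$,$,#8,#11,$);"""

# Each instance line is "#<id>=<ENTITY>(...)"; count the entity types.
entity = re.compile(r"#\d+=(IFC\w+)\(")
counts = Counter(m.group(1) for m in entity.finditer(sample))
print(counts["IFCWALL"])  # 2
```

    The horizontal-infrastructure gap the paper describes is precisely the absence of an agreed entity vocabulary of this kind for rail alignments and similar linear assets.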

  6. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  7. Inter-operability

    International Nuclear Information System (INIS)

    Plaziat, J.F.; Moulin, P.; Van Beurden, R.; Ballet, E.

    2005-01-01

    Building an internal gas market implies establishing harmonized rules for cross-border trading between operators. To that effect, the European association EASEE-gas is developing standards and procedures, commonly called 'inter-operability'. Set up in 2002, the association brings together all segments of the gas industry: producers, transporters, distributors, traders and shippers, suppliers, consumers and service providers. This workshop presents the latest status of issues such as barriers to gas trade in Europe, rules and procedures under preparation by EASEE-gas, and the implementation schedule of these rules by operators. This article gathers five presentations on this topic given at the gas conference.

  8. BENEFITS OF LINKED DATA FOR INTEROPERABILITY DURING CRISIS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    R. Roller

    2015-08-01

    Flooding represents a permanent risk to the Netherlands in general and to its power supply in particular. Data sharing is essential in this crisis scenario, as a power cut affects a great variety of interdependent sectors. Currently used data sharing systems have been shown to hamper interoperability between stakeholders, since they lack flexibility and there is no consensus on term definitions and interpretations. The study presented in this paper addresses these challenges by proposing a new data sharing solution based on Linked Data, a method of interlinking data points in a structured way on the web. A conceptual model for two data sharing parties in a flood-caused power cut crisis management scenario was developed, to which relevant data were linked. The analysis revealed that the presented data sharing solution burdens its users with extra costs in the short run, but saves resources in the long run by overcoming the interoperability problems of the legacy systems. The more stakeholders adopt Linked Data, the stronger its benefits for data sharing will become.
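
    The Linked Data approach the abstract describes reduces to interlinked subject-predicate-object triples. A toy sketch with invented URIs and predicates, tracing a flood gauge to the district whose power supply it threatens:

```python
# Toy triple store: two parties' data joined through shared identifiers.
# All URIs and predicates are invented for illustration, not a real vocabulary.
triples = [
    ("ex:gauge42", "ex:observesWaterLevel", '"3.1"'),
    ("ex:gauge42", "ex:nearAsset", "ex:substation7"),      # water authority's data
    ("ex:substation7", "ex:powers", "ex:district-north"),  # grid operator's data
]

def objects(subject, predicate):
    """All objects linked from subject via predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Follow the links: which district is at risk from gauge42's reading?
asset = objects("ex:gauge42", "ex:nearAsset")[0]
print(objects(asset, "ex:powers"))  # ['ex:district-north']
```

    The point of the abstract's cost argument is visible even here: both parties must first agree on shared identifiers (the up-front cost), after which queries across their datasets need no bespoke system integration.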

  9. Unmanned Ground Vehicle (UGV) Interoperability Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The UGV Interoperability Lab provides the capability to verify vendor conformance against government-defined interoperability profiles (IOPs). This capability allows...

  10. Proposed DoD Guidelines for Implementation of a Web-Based Joint IETM Architecture (JIA) to Assure the Interoperability of DoD IETMs

    National Research Council Canada - National Science Library

    Jorgensen, Eric L

    1999-01-01

    This Paper presents preliminary guidelines intended to serve as input to a planned DoD Handbook for the Acquisition and Deployment of DoD IETMs with the specific purpose of assuring interoperability...

  11. Cloud portability and interoperability issues and current trends

    CERN Document Server

    Di Martino, Beniamino; Esposito, Antonio

    2015-01-01

    This book offers readers a quick, comprehensive and up-to-date overview of the most important methodologies, technologies, APIs and standards related to the portability and interoperability of cloud applications and services, illustrated by a number of use cases representing a variety of interoperability and portability scenarios. The lack of portability and interoperability between cloud platforms at different service levels is the main issue affecting cloud-based services today. The brokering, negotiation, management, monitoring and reconfiguration of cloud resources are challenging tasks

  12. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    Directory of Open Access Journals (Sweden)

    Laura González

    2016-08-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have developed some sort of legislation focusing on data protection. This paper proposes solutions for monitoring and enforcing data protection laws within an e-government interoperability platform. In particular, the proposal addresses requirements posed by the Uruguayan Data Protection Law and the Uruguayan E-government Platform, although it can also be applied in similar scenarios. The solutions are based on well-known integration mechanisms (e.g. the Enterprise Service Bus) as well as recognized security standards (e.g. the eXtensible Access Control Markup Language) and were completely prototyped leveraging the SwitchYard ESB product.
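
    At its core, the enforcement the paper proposes is attribute-based access control. A pared-down sketch of the permit/deny decision model (the real platform evaluates XACML policies on an ESB; the roles and rules here are made up):

```python
# Hypothetical rule set: (subject role, resource sensitivity) -> decision.
POLICIES = [
    {"role": "physician", "sensitivity": "personal", "decision": "Permit"},
    {"role": "clerk", "sensitivity": "personal", "decision": "Deny"},
]

def decide(role: str, sensitivity: str) -> str:
    """First-applicable rule wins; fall back to default-deny,
    as data protection regulations generally require."""
    for rule in POLICIES:
        if rule["role"] == role and rule["sensitivity"] == sensitivity:
            return rule["decision"]
    return "Deny"

print(decide("physician", "personal"))  # Permit
print(decide("clerk", "personal"))      # Deny
```

    In an ESB deployment, a check like this sits as an interception point on each service invocation, so agencies exchange data only when the policy decision is Permit.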

  13. Visual Development Environment for Semantically Interoperable Smart Cities Applications

    OpenAIRE

    Roukounaki, Aikaterini; Soldatos, John; Petrolo, Riccardo; Loscri, Valeria; Mitton, Nathalie; Serrano, Martin

    2015-01-01

    This paper presents an IoT architecture for the semantic interoperability of diverse IoT systems and applications in smart cities. The architecture virtualizes diverse IoT systems and ensures their modelling and representation according to common standards-based IoT ontologies. Furthermore, based on this architecture, the paper introduces a first-of-a-kind visual development environment which eases the development of semantically interoperable applications in smart cit...

  14. Processing biological literature with customizable Web services supporting interoperable formats.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
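
    As an illustration of such interchange formats, a single text annotation can be serialized into a BioC-style XML passage. The element layout follows BioC only in outline, not the full DTD; a real Argo service would emit a complete collection structure:

```python
import xml.etree.ElementTree as ET

def to_bioc_passage(text: str, ann: dict) -> str:
    """Serialize one stand-off annotation over `text` as a BioC-style passage."""
    passage = ET.Element("passage")
    ET.SubElement(passage, "text").text = text
    a = ET.SubElement(passage, "annotation", id=ann["id"])
    ET.SubElement(a, "infon", key="type").text = ann["type"]
    ET.SubElement(a, "location",
                  offset=str(ann["offset"]), length=str(ann["length"]))
    # The annotated surface string, recovered from the offsets.
    ET.SubElement(a, "text").text = text[ann["offset"]:ann["offset"] + ann["length"]]
    return ET.tostring(passage, encoding="unicode")

xml = to_bioc_passage("Glucose binds hexokinase.",
                      {"id": "T1", "type": "chemical", "offset": 0, "length": 7})
print("Glucose" in xml)  # True
```

    Stand-off offsets are what make conversion between BioC, BioNLP and RDF-style representations mechanical: the text is never altered, only the annotation envelope changes.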

  15. Augmenting interoperability across repositories architectural ideas

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The aDORe digital repository architecture designed and implemented by the Los Alamos Research Library is fully standards-based and highly modular, with the various components of the architecture interacting in a protocol-driven manner. Although aDORe was designed for use in the context of the Los Alamos Library, its modular and standards-based design has led to interesting insights regarding possible new levels of interoperability in a federation of heterogeneous repositories. The presentation will discuss these insights, and will illustrate that attractive federations of repositories can be built by introducing rather basic interoperability requirements. The presentation will also show that, once these requirements are met, a powerful service framework that overlays the federation can emerge.

  16. Evaluation of Enterprise Architecture Interoperability

    National Research Council Canada - National Science Library

    Jamison, Theresa A; Niska, Brice T; Layman, Phillip A; Whitney, Steven P

    2005-01-01

    ...), which describes these architectures. The purpose of this project, suggested by Air Force Space Command, was to examine the value of existing analytical tools in making an interoperability assessment of individual enterprises, as well...

  17. River Basin Standards Interoperability Pilot

    Science.gov (United States)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There is a lot of water information and many tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps:
    - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant to the OGC SOS 2.0 Hydrology Profile Best Practice).
    - Modelling floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input.
    - Evaluation of the applicability of Sensor Notification Services in water emergencies.
    - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS), with visualization utilities (WMS).
    The architecture
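
    The first step above, extracting time/value pairs from a WaterML 2.0 response, can be sketched as follows. The snippet is deliberately simplified: real SOS GetObservation responses wrap the point list in full O&M and timeseries structures.

```python
import xml.etree.ElementTree as ET

# Simplified WaterML 2.0-style fragment (real responses nest points
# inside MeasurementTimeseries/O&M wrappers).
snippet = """<wml2:points xmlns:wml2="http://www.opengis.net/waterml/2.0">
  <wml2:point><wml2:time>2016-04-01T00:00:00Z</wml2:time><wml2:value>12.4</wml2:value></wml2:point>
  <wml2:point><wml2:time>2016-04-01T01:00:00Z</wml2:time><wml2:value>13.1</wml2:value></wml2:point>
</wml2:points>"""

ns = {"wml2": "http://www.opengis.net/waterml/2.0"}
root = ET.fromstring(snippet)
series = [(p.find("wml2:time", ns).text, float(p.find("wml2:value", ns).text))
          for p in root.findall("wml2:point", ns)]
print(series[1])  # ('2016-04-01T01:00:00Z', 13.1)
```

    Once the gauge series is in this neutral tuple form, it can be fed to a WPS flood model or re-exposed through WCS/WFS/WMS, which is the interchangeability the experiment sets out to demonstrate.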

  18. 1991 and networked interoperability

    Directory of Open Access Journals (Sweden)

    Birte Christensen-Dalsgaard

    2017-05-01

    1991, the year of the first call in the Libraries Programme, was a very different time; the network infrastructure was being built, computers were becoming more powerful, and the information society was being formed on the basis of different technological solutions. Standards (SR, Z39.50, HTML, MP3) and protocols (TCP/IP, OSI) for how bits should be transported and interpreted, how programs on computers should communicate, and how one could find the relevant information (WAIS, Gopher, WWW) were being developed, tested and deployed. Questions were asked that pushed boundaries, experiments were conducted which delivered new possibilities, and progress on standardization was made. Much happened in the 1990s in a short time span.

  19. Solving the interoperability challenge of a distributed complex patient guidance system: a data integrator based on HL7's Virtual Medical Record standard.

    Science.gov (United States)

    Marcos, Carlos; González-Ferrer, Arturo; Peleg, Mor; Cavero, Carlos

    2015-05-01

    We show how the HL7 Virtual Medical Record (vMR) standard can be used to design and implement a data integrator (DI) component that collects patient information from heterogeneous sources and stores it in a personal health record, from which it can then retrieve data. Our working hypothesis is that the HL7 vMR standard in its release 1 version can properly capture the semantics needed to drive evidence-based clinical decision support systems. To achieve seamless communication between the personal health record and heterogeneous data consumers, we used a three-pronged approach. First, the choice of the HL7 vMR as a message model for all components, accompanied by the use of medical vocabularies, eases their semantic interoperability. Second, the DI follows a service-oriented approach to provide access to system components. Third, an XML database provides the data layer. Results: The DI supports requirements of a guideline-based clinical decision support system implemented in two clinical domains and settings, ensuring reliable and secure access, high performance, and simplicity of integration, while complying with standards for the storage and processing of patient information needed for decision support and analytics. This was tested within the framework of a multinational project (www.mobiguide-project.eu) aimed at developing a ubiquitous patient guidance system (PGS). The vMR model with its extension mechanism is demonstrated to be effective for data integration and communication within a distributed PGS implemented for two clinical domains across different healthcare settings in two nations. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
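
    A minimal sketch of the store/retrieve cycle such a data integrator performs, using a vMR-like observation structure. The field names and the SNOMED CT code are illustrative, not a faithful rendering of the standard's classes:

```python
class DataIntegrator:
    """Toy stand-in for the DI: stores observations per patient and
    retrieves them by vocabulary code (the real DI backs this with an
    XML database and exposes it as services)."""

    def __init__(self):
        self._phr = {}  # patient id -> list of observation dicts

    def store(self, patient_id: str, obs: dict) -> None:
        self._phr.setdefault(patient_id, []).append(obs)

    def retrieve(self, patient_id: str, code: str) -> list:
        return [o for o in self._phr.get(patient_id, []) if o["code"] == code]

di = DataIntegrator()
# Illustrative SNOMED CT-style coded observation (systolic blood pressure).
di.store("p001", {"code": "271649006", "value": 142, "unit": "mm[Hg]"})
print(di.retrieve("p001", "271649006")[0]["value"])  # 142
```

    Coding every stored observation with a controlled vocabulary is what lets heterogeneous consumers, here the decision support components, query the record without source-specific logic.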

  20. Evaluating the Organizational Interoperability Maturity Level in ICT Research Center

    Directory of Open Access Journals (Sweden)

    Manijeh Haghighinasab

    2011-03-01

    Interoperability refers to the ability to provide services to, and to accept services from, other systems or devices. Collaborative enterprises face additional challenges to interoperate seamlessly within a networked organization. The major task here is to assess the maturity level of the interoperating organizations. For this purpose, maturity models for the enterprise were reviewed based on vendors' reliability and advantages versus disadvantages. An interoperability maturity model, named EIMM, was deduced from the ATHENA project, a European Integrated Project, in 2005. This model was examined in the Iran Information and Communication Institute, a leading telecommunication organization. 115 questionnaires were distributed among the staff of four departments (Information Technology, Communication Technology, Security, and Strategic Studies) regarding six areas of concern: Enterprise Modeling; Business Strategy Process; Organization and Competences; Products and Services; Systems and Technology; and Legal Environment, Security and Trust, at five maturity levels: Performed, Modeled, Integrated, Interoperable and Optimizing. The findings showed different levels of maturity in this institute. To achieve the Interoperable level, appropriate practices are proposed for promotion to the higher levels.
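
    Scoring such an assessment can be sketched as mapping averaged questionnaire scores onto the five maturity levels named above. The thresholds and area scores are invented; the actual EIMM assessment is considerably more involved:

```python
# The five EIMM maturity levels, lowest to highest.
LEVELS = ["Performed", "Modeled", "Integrated", "Interoperable", "Optimizing"]

def maturity(scores: dict) -> str:
    """Map per-area scores (1-5 scale, hypothetical) to a maturity level."""
    avg = sum(scores.values()) / len(scores)
    return LEVELS[min(int(avg) - 1, len(LEVELS) - 1)]

# Invented scores for four of the six areas of concern.
scores = {"Enterprise Modeling": 3, "Business Strategy Process": 2,
          "Systems and Technology": 4, "Security and Trust": 3}
print(maturity(scores))  # Integrated
```

    A per-area version of the same mapping would reproduce the paper's finding that different areas of one organization can sit at different maturity levels.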

  1. Interoperability and HealthGRID.

    Science.gov (United States)

    Bescos, C; Schmitt, D; Kass, J; García-Barbero, M; Kantchev, P

    2005-01-01

    GRID technology, with initiatives like the GGF, has the potential to allow both competition and interoperability not only among applications and toolkits, but also among implementations of key services. The pyramid of eHealth interoperability should be achieved from standards in communication and data security, storage and processing, up to policy initiatives, including organizational protocols, financing procedures, and the legal framework. The open challenges for GRID use in clinical fields illustrate the potential of combining grid technologies with medical routine into a wider interoperable framework. The Telemedicine Alliance is a consortium (ESA, WHO and ITU), initiated in 2002, building a vision for the provision of eHealth to European citizens by 2010. After a survey with more than 50 interviews of experts, interoperability was identified as the main showstopper to eHealth implementation. There are already several groups and organizations contributing to standardization. The TM-Alliance is supporting the "e-Health Standardization Coordination Group" (eHSCG). Now, in the design and development phase of GRID technology in health, is the right moment to act with the aim of achieving an interoperable and open framework. The health area should benefit from the initiatives started at the GGF in terms of global architecture and service definitions, as well as from the security and other web services applications developed under the Internet umbrella. There is a risk that existing important results of the standardization efforts in this area are not taken up simply because they are not always known.

  2. OR.NET: a service-oriented architecture for safe and dynamic medical device interoperability.

    Science.gov (United States)

    Kasparick, Martin; Schmitz, Malte; Andersen, Björn; Rockstroh, Max; Franke, Stefan; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk

    2018-02-23

    Modern surgical departments are characterized by a high degree of automation supporting complex procedures. It recently became apparent that integrated operating rooms can improve the quality of care, simplify clinical workflows, and mitigate equipment-related incidents and human errors. Particularly using computer assistance based on data from integrated surgical devices is a promising opportunity. However, the lack of manufacturer-independent interoperability often prevents the deployment of collaborative assistive systems. The German flagship project OR.NET has therefore developed, implemented, validated, and standardized concepts for open medical device interoperability. This paper describes the universal OR.NET interoperability concept enabling a safe and dynamic manufacturer-independent interconnection of point-of-care (PoC) medical devices in the operating room and the whole clinic. It is based on a protocol specifically addressing the requirements of device-to-device communication, yet also provides solutions for connecting the clinical information technology (IT) infrastructure. We present the concept of a service-oriented medical device architecture (SOMDA) as well as an introduction to the technical specification implementing the SOMDA paradigm, currently being standardized within the IEEE 11073 service-oriented device connectivity (SDC) series. In addition, the Session concept is introduced as a key enabler for safe device interconnection in highly dynamic ensembles of networked medical devices; and finally, some security aspects of a SOMDA are discussed.
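
    The SOMDA pattern itself, devices publishing typed services that consumers discover and bind to at runtime, can be sketched in a few lines. This mimics the pattern only; real IEEE 11073 SDC uses WS-Discovery and BICEPS messages, and the device names here are invented:

```python
# Discovery registry: service type -> list of (device id, handler).
registry = {}

def publish(device: str, service_type: str, handler):
    """A device announces a typed service it offers."""
    registry.setdefault(service_type, []).append((device, handler))

def discover(service_type: str):
    """A consumer looks up all providers of a service type."""
    return registry.get(service_type, [])

publish("pump-01", "set_infusion_rate", lambda ml_h: f"pump-01 rate={ml_h}ml/h")
publish("monitor-02", "get_heart_rate", lambda: 72)

# An assist system binds by service type, with no vendor-specific code:
device, get_hr = discover("get_heart_rate")[0]
print(device, get_hr())  # monitor-02 72
```

    The Session concept the paper introduces adds what this sketch omits: a safety envelope around such dynamic bindings, so devices joining or leaving the ensemble cannot leave a consumer acting on stale connections.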

  3. A Smart Home Center Platform Solution Based on Smart Mirror

    Directory of Open Access Journals (Sweden)

    Deng Xibo

    2017-01-01

    With the popularization of the concept of the smart home, people have raised their requirements for the experience of smart living. A smart home platform center solution is put forward in order to solve the intelligent interoperability and information integration of the smart home, enabling people to have a more intelligent and convenient living experience. This platform center is achieved through the Smart Mirror. The Smart Mirror is a piece of smart furniture that builds on the traditional concept of a mirror, combining a Raspberry Pi, the one-way mirror imaging principle, a touch-enabled design, and voice and video interaction. The Smart Mirror can provide a series of intelligent experiences for residents, such as controlling all the intelligent furniture through the Smart Mirror; accessing and displaying the weather, time, news and other life information; monitoring the home environment; and remote interconnection operation.
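
    The hub role described above can be sketched as a single dispatch point that routes commands to registered appliances and serves display widgets. The device names, handlers and widget values are invented for illustration:

```python
class SmartMirror:
    """Toy platform center: one registry for appliances, one for display info."""

    def __init__(self):
        self.devices = {}
        self.widgets = {"time": "07:30", "weather": "Sunny 21C"}  # display data

    def register(self, name, handler):
        self.devices[name] = handler

    def command(self, name, action):
        if name not in self.devices:
            return f"unknown device: {name}"
        return self.devices[name](action)

mirror = SmartMirror()
mirror.register("lamp", lambda action: f"lamp turned {action}")
print(mirror.command("lamp", "on"))   # lamp turned on
print(mirror.widgets["weather"])      # Sunny 21C
```

    Routing everything through one registry is what gives the mirror its interoperability role: each appliance integrates once with the hub instead of with every other device.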

  4. Dynamic Business Networks: A Headache for Sustainable Systems Interoperability

    Science.gov (United States)

    Agostinho, Carlos; Jardim-Goncalves, Ricardo

    Collaborative networked environments emerged with the spread of the internet, contributing to overcome past communication barriers, and identifying interoperability as an essential property. When achieved seamlessly, efficiency is increased in the entire product life cycle. Nowadays, most organizations try to attain interoperability by establishing peer-to-peer mappings with the different partners, or in optimized networks, by using international standard models as the core for information exchange. In current industrial practice, mappings are only defined once, and the morphisms that represent them, are hardcoded in the enterprise systems. This solution has been effective for static environments, where enterprise and product models are valid for decades. However, with an increasingly complex and dynamic global market, models change frequently to answer new customer requirements. This paper draws concepts from the complex systems science and proposes a framework for sustainable systems interoperability in dynamic networks, enabling different organizations to evolve at their own rate.
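
    The scaling argument behind this contrast can be made concrete: n systems need n(n-1)/2 bidirectional peer-to-peer mappings, but only n mappings when each system maps once to a shared standard model, which is why hardcoded point-to-point morphisms stop being sustainable as networks grow and change:

```python
def peer_to_peer(n: int) -> int:
    """Bidirectional mappings needed if every pair maps directly."""
    return n * (n - 1) // 2

def via_standard(n: int) -> int:
    """Mappings needed if each system maps once to a shared standard model."""
    return n

for n in (5, 20):
    print(n, peer_to_peer(n), via_standard(n))
# 5 systems:  10 peer mappings vs 5 via a standard
# 20 systems: 190 peer mappings vs 20 via a standard
```

    The framework the paper proposes targets the remaining weakness of the hub approach: even the n standard-facing mappings must evolve as enterprise and product models change, rather than being defined once and hardcoded.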

  5. Special topic interoperability and EHR: Combining openEHR, SNOMED, IHE, and continua as approaches to interoperability on national ehealth

    DEFF Research Database (Denmark)

    Bestek, M.; Stanimirovi, D.

    2017-01-01

    into the eHealth environment, and identification of the main success factors in the field, which are necessary for achieving required interoperability, and consequently, for the successful implementation of eHealth projects in general. Methods: The paper represents an in-depth analysis regarding...... the potential application of openEHR, SNOMED, IHE and Continua approaches in the development and implementation process of eHealth in Slovenia. The research method used is both exploratory and deductive in nature. The methodological framework is grounded on information retrieval with a special focus on research...... could alleviate the pertinent interoperability issues in the Slovenian eHealth context. 2. Analyzing the possibilities (requirements) for their inclusion in the construction process for individual eHealth solutions. 3. Identification and charting the main success factors in the interoperability field...

  6. Interoperability challenges in river discharge modelling: A cross domain application scenario

    Science.gov (United States)

    Santoro, Mattia; Andres, Volker; Jirka, Simon; Koike, Toshio; Looser, Ulrich; Nativi, Stefano; Pappenberger, Florian; Schlummer, Manuela; Strauch, Adrian; Utech, Michael; Zsoter, Ervin

    2018-06-01

    River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a "River Discharge" application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.

  7. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2016-04-01

The increasing adoption of an ocean basin level approach to marine research has led to a corresponding rise in the demand for large quantities of high quality interoperable data. This requirement for easily discoverable and readily available marine data is currently being addressed by initiatives such as SeaDataNet in Europe, Rolling Deck to Repository (R2R) in the USA and the Australian Ocean Data Network (AODN), each having implemented an e-infrastructure to facilitate the discovery and re-use of standardised multidisciplinary marine datasets available from a network of distributed repositories, data centres etc. within their own region. However, these regional data systems have been developed in response to the specific requirements of their users and in line with the priorities of the funding agency. They have also been created independently of the marine data infrastructures in other regions, often using different standards, data formats, technologies etc., which makes the integration of marine data from these regional systems for the purposes of basin level research difficult. Marine research at the ocean basin level requires a common global framework for marine data management which is based on existing regional marine data systems but provides an integrated solution for delivering interoperable marine data to the user. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together those responsible for the management of the selected marine data systems and other relevant technical experts with the objective of developing interoperability across the regional e-infrastructures. The commonalities and incompatibilities between the individual data infrastructures are identified and then used as the foundation for the specification of prototype interoperability solutions which demonstrate the feasibility of sharing marine data across the regional systems and also with relevant larger global data services such as GEO, COPERNICUS, IODE, POGO etc. The potential

  8. Model and Interoperability using Meta Data Annotations

    Science.gov (United States)

    David, O.

    2011-12-01

Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and
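As an editorial illustration of the annotation-based metadata approach this abstract describes, the sketch below mimics the idea in Python using decorators (OMS3 itself is Java-based; all names such as `role`, `Runoff`, and the metadata fields are hypothetical, not the actual OMS3 API):

```python
# Illustrative sketch: annotation-style metadata attached directly to the
# code of a model component, harvested by a tiny "framework" function.

def role(name, **meta):
    """Attach metadata to a component method, like an @In/@Out annotation."""
    def wrap(func):
        func._meta = {"role": name, **meta}
        return func
    return wrap

class Runoff:
    """A single-process component: metadata lives next to the code it describes."""

    @role("in", unit="mm/d", description="daily precipitation")
    def precip(self):
        return self._precip

    @role("out", unit="mm/d", description="simulated runoff")
    def runoff(self):
        return 0.4 * self._precip  # toy process, not a real runoff model

def describe(component_cls):
    """Framework side: harvest the annotations for assembly and documentation."""
    return {
        name: attr._meta
        for name, attr in vars(component_cls).items()
        if callable(attr) and hasattr(attr, "_meta")
    }

meta = describe(Runoff)
```

The point, as in the abstract, is that the framework learns data-flow roles, units, and documentation from the annotations alone, without the component calling any framework API.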

  9. The interoperability force in the ERP field

    Science.gov (United States)

    Boza, Andrés; Cuenca, Llanos; Poler, Raúl; Michaelides, Zenon

    2015-04-01

Enterprise resource planning (ERP) systems participate in interoperability projects, and this participation sometimes leads to new proposals for the ERP field. The aim of this paper is to identify the role that interoperability plays in the evolution of ERP systems. To do this, ERP systems were first identified within interoperability frameworks. Second, the initiatives in the ERP field driven by interoperability requirements were identified from two perspectives: technological and business. The ERP field is evolving from classical ERP systems as information system integrators to a new generation of fully interoperable ERP. Interoperability is changing the way business is run, and ERP systems are changing to adapt to the current stream of interoperability.

  10. Food product tracing technology capabilities and interoperability.

    Science.gov (United States)

    Bhatt, Tejas; Zhang, Jianrong Janet

    2013-12-01

identify the supply-chain partner who provided the information prior to sharing this information with product tracing technology providers. The 9 traceability solution providers who agreed to participate in this project have their systems deployed in a wide range of sectors within the food industry including, but not limited to, livestock, dairy, produce, fruits, seafood, meat, and pork, as well as in the pharmaceutical, automotive, retail, and other industries. Some have also been implemented across the globe, including Canada, China, the USA, Norway, and the EU, among others. This broad commercial use ensures that the findings of this work are applicable to a broad spectrum of the food system. Six of the 9 participants successfully completed the data entry phase of this test. To verify successful data entry for these 6, a demo or screenshots of the data set from each system's user interface was requested. Only 4 of the 6 were able to provide us with this evidence for verification. Of the 6 that completed data entry and moved on to the scenarios phase of the test, 5 were able to provide us with responses to the scenarios. Time metrics were useful for evaluating the scalability and usability of each technology. Scalability was derived from the time it took to enter the nonstandardized data set into the system (ranging from 7 to 11 d). Usability was derived from the time it took to query the scenarios and provide the results (from a few hours to a week). Time was measured as the number of days it took for the participants to respond after we supplied them all the information they would need to successfully execute each test/scenario. Two of the technology solution providers successfully implemented and participated in a proof-of-concept interoperable framework during Year 2 of this study. While not required, they also demonstrated this interoperability capability on the FSMA-mandated food product tracing pilots for the U.S. FDA. This has significant real-world impact since the

  11. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

Interoperability assets is the term applied to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were requested to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  12. Semantic Interoperability in Heterogeneous IoT Infrastructure for Healthcare

    Directory of Open Access Journals (Sweden)

    Sohail Jabbar

    2017-01-01

Full Text Available Interoperability remains a significant burden to the developers of Internet of Things systems. This is due to the fact that IoT devices are highly heterogeneous in terms of underlying communication protocols, data formats, and technologies. Secondly, due to the lack of worldwide accepted standards, interoperability tools remain limited. In this paper, we propose an IoT-based Semantic Interoperability Model (IoT-SIM) to provide semantic interoperability among heterogeneous IoT devices in the healthcare domain. Physicians communicate with their patients through heterogeneous IoT devices to monitor their current health status. Information between physician and patient is semantically annotated and communicated in a meaningful way. A lightweight model for semantic annotation of data using heterogeneous devices in IoT is proposed to provide annotations for data. The Resource Description Framework (RDF) is a semantic web framework that relates things using triples to make them semantically meaningful. RDF-annotated patients' data has been made semantically interoperable. SPARQL queries are used to extract records from the RDF graph. For simulation of the system, we used the Tableau, Gruff-6.2.0, and MySQL tools.
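To make the triple-based annotation idea in this abstract concrete, here is a minimal, dependency-free Python sketch of RDF-style triples and a one-pattern query in the spirit of a SPARQL WHERE clause (in practice a library such as rdflib would be used; identifiers like `ex:patient1` are invented for illustration, not taken from the paper):

```python
# A tiny in-memory triple store: each fact is a (subject, predicate, object) triple.
triples = {
    ("ex:patient1", "ex:hasDevice",  "ex:pulseOximeter"),
    ("ex:patient1", "ex:hasReading", "ex:spo2_96"),
    ("ex:spo2_96",  "ex:unit",      "percent"),
    ("ex:spo2_96",  "ex:value",     "96"),
}

def query(s=None, p=None, o=None):
    """Match a single triple pattern; None plays the role of a SPARQL variable."""
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# Analogue of: SELECT ?reading WHERE { ex:patient1 ex:hasReading ?reading }
readings = [o for (_, _, o) in query(s="ex:patient1", p="ex:hasReading")]
```

Because every fact carries its own predicate, a consumer that understands the vocabulary can interpret the data without knowing the producer's database schema, which is the interoperability gain the abstract describes.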

  13. Metadata behind the Interoperability of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miguel Angel Manso Callejo

    2009-05-01

Full Text Available Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which in turn leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.
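A hedged sketch of what a contextualising rule might look like: simple rules that infer a context label for a WSN node from metadata elements. The field names, thresholds, and context labels below are hypothetical, chosen only to illustrate the rule-based inference the abstract describes:

```python
# Illustrative contextualising rules: metadata in, inferred context out.

def node_context(meta):
    """Apply simple ordered rules to metadata to derive the node's current context."""
    if meta.get("battery_pct", 100) < 15:
        return "node:low-power"          # change of node status
    if meta.get("last_report_s", 0) > 600:
        return "network:unreachable"     # change of network status
    if meta.get("reading") is None:
        return "sensing:no-data"         # change of sensing status
    return "ok"
```

A higher layer could then use the inferred context to decide how to keep the network interoperable, e.g. rerouting around an unreachable node.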

  14. Plugfest 2009: Global Interoperability in Telerobotics and Telemedicine.

    Science.gov (United States)

    King, H Hawkeye; Hannaford, Blake; Kwok, Ka-Wai; Yang, Guang-Zhong; Griffiths, Paul; Okamura, Allison; Farkhatdinov, Ildar; Ryu, Jee-Hwan; Sankaranarayanan, Ganesh; Arikatla, Venkata; Tadano, Kotaro; Kawashima, Kenji; Peer, Angelika; Schauß, Thomas; Buss, Martin; Miller, Levi; Glozman, Daniel; Rosen, Jacob; Low, Thomas

    2010-05-07

Despite the great diversity of teleoperator designs and applications, their underlying control systems have many similarities. These similarities can be exploited to enable interoperability between heterogeneous systems. We have developed a network data specification, the Interoperable Telerobotics Protocol, that can be used for Internet-based control of a wide range of teleoperators. In this work we test interoperable telerobotics on the global Internet, focusing on the telesurgery application domain. Fourteen globally dispersed telerobotic master and slave systems were connected in thirty trials in one twenty-four-hour period. Users performed common manipulation tasks to demonstrate effective master-slave operation. With twenty-eight (93%) successful, unique connections, the results show a high potential for standardizing telerobotic operation. Furthermore, new paradigms for telesurgical operation and training are presented, including a networked surgery trainer and upper-limb exoskeleton control of micro-manipulators.

  15. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above-mentioned issues require intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high performance ``green computing’’ systems. The recent evolutionary and general metaheuristic-based solutions ...

  16. Interoperable eHealth Platform for Personalized Smart Services

    DEFF Research Database (Denmark)

    Mihaylov, Mihail Rumenov; Mihovska, Albena Dimitrova; Kyriazakos, Sofoklis

    2015-01-01

    personalized context-aware applications to serve the user's needs. This paper proposes the use of advised sensing, context-aware and cloud-based lifestyle reasoning to design an innovative eHealth platform that supports highly personalized smart services to primary users. The architecture of the platform has...... been designed in accordance with the interoperability requirements and standards as proposed by ITU-T and Continua Alliance. In particular, we define the interface dependencies and functional requirements needed, to allow eCare and eHealth vendors to manufacture interoperable sensors, ambient and home...

  17. Interoperable Archetypes With a Three Folded Terminology Governance.

    Science.gov (United States)

    Pederson, Rune; Ellingsen, Gunnar

    2015-01-01

The use of openEHR archetypes increases the interoperability of clinical terminology, and in doing so improves the availability of clinical terminology for both primary and secondary purposes. Where clinical terminology is employed in the EPR system, research reports conflicting results for the use of structuring and standardization as measurements of success. In order to elucidate this concept, this paper focuses on the effort to establish a national repository for openEHR-based archetypes in Norway, where clinical terminology could be included with three-fold benefits for interoperability.

  18. Key pillars of data interoperability in Earth Sciences - INSPIRE and beyond

    Science.gov (United States)

    Tomas, Robert; Lutz, Michael

    2013-04-01

    encoding. However, since the conceptual models are independent of concrete encodings, it is also possible to derive other encodings (e.g. based on RDF). Registers provide unique and persistent identifiers for a number of different types of information items (e.g. terms from a controlled vocabulary or units of measure) and allow their consistent management and versioning. By using these identifiers in data, references to specific information items can be made unique and unambiguous. It is important that these interoperability solutions are not developed in isolation - for Europe only. This has been identified from the beginning, and therefore, international standards have been taken into account and been widely referred to in INSPIRE. This mutual cooperation with international standardisation activities needs to be maintained or even extended. For example, where INSPIRE has gone beyond existing standards, the INSPIRE interoperability solutions should be introduced to the international standardisation initiatives. However, in some cases, it is difficult to choose the appropriate international organization or standardisation body (e.g. where there are several organizations overlapping in scope) or to achieve international agreements that accept European specifics. Furthermore, the development of the INSPIRE specifications (to be legally adopted in 2013) is only a beginning of the effort to make environmental data interoperable. Their actual implementation by data providers across Europe, as well as the rapid development in the earth sciences (e.g. from new simulation models, scientific advances, etc.) and ICT technology will lead to requests for changes. It is therefore crucial to ensure the long-term sustainable maintenance and further development of the proposed infrastructure. This task cannot be achieved by the INSPIRE coordination team of the European Commission alone. 
It is therefore crucial to closely involve relevant (where possible, umbrella) organisations in the

  19. OGC and Grid Interoperability in enviroGRIDS Project

    Science.gov (United States)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does Grid-oriented technology, which is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, achieved by providing the basic and extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. 
In these use cases we give special attention to issues such as: the relations between computational grid and

  20. Sustainability of Open-Source Software Organizations as Underpinning for Sustainable Interoperability on Large Scales

    Science.gov (United States)

    Fulker, D. W.; Gallagher, J. H. R.

    2015-12-01

    OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. In essence, provision of
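The O(N+M) handler-style reduction mentioned in this abstract can be sketched generically: N input handlers map datatypes into one common internal form and M output handlers render that form, so no datatype/format pair needs a dedicated converter. This is an illustrative sketch under invented names, not Hyrax's actual API:

```python
# N datatype readers -> one common internal form (here, a plain dict).
READERS = {
    "netcdf": lambda raw: {"var": raw.upper()},   # toy stand-ins for real parsers
    "hdf5":   lambda raw: {"var": raw.title()},
}

# M response writers <- the same common internal form.
WRITERS = {
    "json": lambda data: '{"var": "%s"}' % data["var"],
    "csv":  lambda data: "var\n%s" % data["var"],
}

def serve(datatype, fmt, raw):
    """Any datatype/format pair is covered by composing one reader with one writer,
    so supporting N datatypes and M formats costs N+M handlers, not N*M converters."""
    return WRITERS[fmt](READERS[datatype](raw))
```

Adding a new datatype means writing one reader; every existing output format then works with it for free, which is the brokering-like property the abstract attributes to the architecture.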

  1. Enabling interoperability in planetary sciences and heliophysics: The case for an information model

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.

    2018-01-01

The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by supplying a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information, or additional related context, for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.

  2. Standards to open and interoperable digital libraries

    Directory of Open Access Journals (Sweden)

    Luís Fernando Sayão

    2007-12-01

Full Text Available Interoperability is one of the main issues in creating a networked system of digital libraries. However, interoperability as the way to accomplish data exchange and service collaboration requires the adoption of a set of open standards covering all digital repository processes. The aim of this document is to review the most important standards, protocols and best practices that form the framework for an open and fully interoperable digital library.

  3. The challenge of networked enterprises for cloud computing interoperability

    OpenAIRE

    Mezgár, István; Rauschecker, Ursula

    2014-01-01

Manufacturing enterprises have to organize themselves into effective system architectures forming different types of Networked Enterprises (NE) to match fast-changing market demands. Cloud Computing (CC) is an important up-to-date computing concept for NE, as it offers significant financial and technical advantages beside high-level collaboration possibilities. As cloud computing is a new concept, the solutions for handling interoperability, portability, security, privacy and standardization c...

  4. Semantic Interoperability Almost Without Using The Same Vocabulary: Is It Possible?

    Science.gov (United States)

    Krisnadhi, A. A.

    2016-12-01

Semantic interoperability, which is a key requirement in realizing cross-repository data integration, is often understood as using the same ontology or vocabulary. Consequently, within a particular domain, one can easily assume that there has to be one unifying domain ontology covering as many vocabulary terms in the domain as possible in order to realize any form of data integration across multiple data sources. Furthermore, the desire to provide very precise definitions of those many terms has led to the development of huge foundational and domain ontologies that are comprehensive, but too complicated, restrictive, monolithic, and difficult to use and reuse, which causes common data providers to avoid using them. This problem is especially true in a domain as diverse as the geosciences, as it is virtually impossible to reach agreement on the semantics of many terms (e.g., there are hundreds of definitions of forest used throughout the world). To overcome this challenge, a modular ontology architecture has emerged in recent years, fueled, among others, by advances in ontology design pattern research. Each ontology pattern models only one key notion. It can act as a small module of a larger ontology. Such a module is developed in such a way that it is largely independent of how other notions in the same domain are modeled. This leads to increased reusability. Furthermore, an ontology formed out of such modules has improved understandability over large, monolithic ontologies. Semantic interoperability in the aforementioned architecture is not achieved by enforcing the use of the same vocabulary, but rather by promoting alignment to the same ontology patterns. In this work, we elaborate how this architecture realizes the above idea. In particular, we describe how multiple data sources with differing perspectives and vocabularies can interoperate through this architecture. 
Building the solution upon semantic technologies such as Linked Data and the Web Ontology
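The pattern-alignment idea in this abstract can be sketched as follows: two providers keep their own vocabularies and publish only an alignment to a shared ontology pattern's slots, so their records interoperate without a single unified vocabulary. All names below (the pattern, slot names, provider fields) are hypothetical:

```python
# One small shared ontology pattern: an Observation with two slots.
PATTERN_SLOTS = {"observedProperty", "result"}

# Each provider keeps its local vocabulary and publishes only an alignment.
ALIGN_A = {"param": "observedProperty", "val": "result"}
ALIGN_B = {"species_measured": "observedProperty", "reading": "result"}

def to_pattern(record, alignment):
    """Rewrite a local record into the shared pattern's slots."""
    mapped = {alignment[k]: v for k, v in record.items() if k in alignment}
    assert set(mapped) <= PATTERN_SLOTS  # alignment must target pattern slots only
    return mapped

# Two records from differing vocabularies become queryable through the same slots.
a = to_pattern({"param": "forest_cover", "val": 0.62}, ALIGN_A)
b = to_pattern({"species_measured": "forest_cover", "reading": 0.58}, ALIGN_B)
```

Neither provider had to adopt the other's terms; agreement is confined to the small pattern, which is the modular-architecture point the abstract makes.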

  5. A Guide to Understanding Emerging Interoperability Technologies

    National Research Council Canada - National Science Library

    Bollinger, Terry

    2000-01-01

    .... Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem...

  6. Biodiversity information platforms: From standards to interoperability

    Directory of Open Access Journals (Sweden)

    Walter Berendsohn

    2011-11-01

Full Text Available One of the most serious bottlenecks in the scientific workflows of the biodiversity sciences is the need to integrate data from different sources, software applications, and services for analysis, visualisation and publication. For more than a quarter of a century the TDWG Biodiversity Information Standards organisation has played a central role in defining and promoting data standards and protocols supporting interoperability between disparate and locally distributed systems. Although often not sufficiently recognized, TDWG standards are the foundation of many popular Biodiversity Informatics applications and infrastructures ranging from small desktop software solutions to large-scale international data networks. However, individual scientists and groups of collaborating scientists have difficulties in fully exploiting the potential of standards that are often notoriously complex, lack non-technical documentation, and use different representations and underlying technologies. In the last few years, a series of initiatives such as Scratchpads, the EDIT Platform for Cybertaxonomy, and biowikifarm have started to implement and set up virtual work platforms for the biodiversity sciences which shield their users from the complexity of the underlying standards. Apart from being practical work-horses for numerous working processes related to the biodiversity sciences, they can be seen as information brokers mediating information between multiple data standards and protocols. The ViBRANT project will further strengthen the flexibility and power of virtual biodiversity working platforms by building software interfaces between them, thus facilitating the essential information flows needed for comprehensive data exchange, data indexing, web-publication, and versioning. This work will make an important contribution to the shaping of an international, interoperable, and user-oriented biodiversity information infrastructure.

  7. A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World

    Science.gov (United States)

    Wright, D. J.; Sankaran, S.

    2015-12-01

In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO-TC 211 have done yeoman's service to promote a standards-centric approach to managing the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used. And to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and therefore sometimes falls short of user expectations. It seems, therefore, that organizations are left with a series of perceived orthogonal requirements when they want to pursue interoperability. They want a robust but agile solution, a mature approach that also needs to satisfy the latest technology trends, and so on. Sustainable interoperability patterns need to be forward looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms that are built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability considerations, and appealing developer frameworks that can grow the audience. 
The path to tread is not new, and the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we all take for granted

  8. Interoperability between phenotype and anatomy ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Oellrich, Anika; Rebholz-Schuhmann, Dietrich

    2010-12-15

    Phenotypic information is important for the analysis of the molecular mechanisms underlying disease. A formal ontological representation of phenotypic information can help to identify, interpret and infer phenotypic traits based on experimental findings. The methods that are currently used to represent data and information about phenotypes fail to make the semantics of the phenotypic trait explicit and do not interoperate with ontologies of anatomy and other domains. Therefore, valuable resources for the analysis of phenotype studies remain unconnected and inaccessible to automated analysis and reasoning. We provide a framework to formalize phenotypic descriptions and make their semantics explicit. Based on this formalization, we provide the means to integrate phenotypic descriptions with ontologies of other domains, in particular anatomy and physiology. We demonstrate how our framework leads to the capability to represent disease phenotypes, perform powerful queries that were not possible before and infer additional knowledge. http://bioonto.de/pmwiki.php/Main/PheneOntology.
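The entity-quality style of phenotype formalization that such frameworks rely on can be illustrated with a minimal sketch. All ontology identifiers, the `PART_OF` table and the helper names below are hypothetical illustrations, not the framework from the paper:

```python
from dataclasses import dataclass

# Toy anatomy ontology: child -> parent via part_of (IDs are illustrative only).
PART_OF = {
    "UBERON:mitral_valve": "UBERON:heart",
    "UBERON:heart": "UBERON:thorax",
}

@dataclass(frozen=True)
class Phenotype:
    entity: str   # the anatomical entity the phenotype affects
    quality: str  # a PATO-style quality, e.g. "PATO:enlarged"

def ancestors(term: str) -> set:
    """Transitive part_of closure of an anatomy term."""
    out = set()
    while term in PART_OF:
        term = PART_OF[term]
        out.add(term)
    return out

def affects(phenotype: Phenotype, structure: str) -> bool:
    """True if the phenotype's entity is the structure or a part of it."""
    return phenotype.entity == structure or structure in ancestors(phenotype.entity)

p = Phenotype(entity="UBERON:mitral_valve", quality="PATO:enlarged")
print(affects(p, "UBERON:heart"))  # True: a valve phenotype is a heart phenotype
```

Making the entity term a first-class reference into the anatomy ontology is what lets a query such as "all phenotypes affecting the heart" retrieve records annotated only to parts of the heart.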

  9. Interoperability of Web Archives and Digital Libraries

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Pinsent, Edward

    2013-01-01

    The interoperability of web archives and digital libraries is crucial to avoid silos of preserved data and content. While various research efforts focus on specific facets of the interoperability challenge, there is a lack of empirical work about the overall situation of actual challenges. We conduct...

  10. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, António; den Hartog, Frank; Goncalves da Silva, Eduardo; Baken, Nico; Zhao, L.; Macaulay, L.

    2009-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  11. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Madureira, A.; Den Hartog, F.; Silva, E.; Baken, N.

    2010-01-01

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  12. Model for Trans-sector Digital Interoperability

    NARCIS (Netherlands)

    Popplewell, Keith; Madureira, António; Harding, Jenny; den Hartog, Frank; Goncalves da Silva, Eduardo; Poler, Raul; Chalmeta, Ricardo; Baken, Nico

    Interoperability refers to the ability of two or more systems or components to exchange information and to use the information that has been exchanged. The importance of interoperability has grown together with the adoption of Digital Information Networks (DINs). DINs refer to information networks

  13. Interoperability of Electronic Health Records: A Physician-Driven Redesign.

    Science.gov (United States)

    Miller, Holly; Johns, Lucy

    2018-01-01

    PURPOSE: Electronic health records (EHRs), now used by hundreds of thousands of providers and encouraged by federal policy, have the potential to improve quality and decrease costs in health care. But interoperability, although technically feasible among different EHR systems, is the weak link in the chain of logic. Interoperability is inhibited by poor understanding, by suboptimal implementation, and at times by a disinclination to dilute market share or patient base on the part of vendors or providers, respectively. The intent of this project has been to develop a series of practicable recommendations that, if followed by EHR vendors and users, can promote and enhance interoperability, helping EHRs reach their potential. METHODOLOGY: A group of 11 physicians, one nurse, and one health policy consultant, practicing from California to Massachusetts, has developed a document titled "Feature and Function Recommendations To Optimize Clinician Usability of Direct Interoperability To Enhance Patient Care" that offers recommendations from the clinician point of view. This report introduces some of these recommendations and suggests their implications for policy and the "virtualization" of EHRs. CONCLUSION: Widespread adoption of even a few of these recommendations by designers and vendors would enable a major advance toward the "Triple Aim" of improving the patient experience, improving the health of populations, and reducing per capita costs.

  14. The Role of Markup for Enabling Interoperability in Health Informatics

    Directory of Open Access Journals (Sweden)

    Steve McKeever

    2015-05-01

    Full Text Available Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realised on a variety of physiological and health care use cases. The last fifteen years have seen the rise of very cheap digital storage, both on and off site. With the advent of the 'Internet of Things' people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to 'in silico' modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realised. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  15. A development framework for semantically interoperable health information systems.

    Science.gov (United States)

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  16. The next generation of interoperability agents in healthcare.

    Science.gov (United States)

    Cardoso, Luciana; Marins, Fernando; Portela, Filipe; Santos, Manuel; Abelha, António; Machado, José

    2014-05-16

    Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.
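The monitoring idea described above, flagging interoperability agents before their failure propagates into the health information system, can be sketched as a simple heartbeat watchdog. The class, agent names and timeout below are illustrative assumptions, not the actual AIDA implementation:

```python
import time
from typing import Optional

class AgentMonitor:
    """Tracks heartbeats from platform agents and flags stalled ones."""

    def __init__(self, timeout_s: float):
        self.timeout_s = timeout_s
        self.last_seen = {}  # agent_id -> timestamp of last heartbeat

    def heartbeat(self, agent_id: str, now: Optional[float] = None) -> None:
        """Record that an agent reported in (agents call this periodically)."""
        self.last_seen[agent_id] = time.monotonic() if now is None else now

    def stalled(self, now: Optional[float] = None) -> list:
        """Agents whose last heartbeat is older than the timeout."""
        t = time.monotonic() if now is None else now
        return sorted(a for a, seen in self.last_seen.items()
                      if t - seen > self.timeout_s)

monitor = AgentMonitor(timeout_s=5.0)
monitor.heartbeat("hl7-inbound", now=0.0)
monitor.heartbeat("archive-agent", now=7.0)
print(monitor.stalled(now=10.0))  # ['hl7-inbound']
```

A real-time dashboard of this kind is what allows staff to anticipate problems rather than discover them after an interoperability function has already failed.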

  17. The role of markup for enabling interoperability in health informatics.

    Science.gov (United States)

    McKeever, Steve; Johnson, David

    2015-01-01

    Interoperability is the faculty of making information systems work together. In this paper we will distinguish a number of different forms that interoperability can take and show how they are realized on a variety of physiological and health care use cases. The last 15 years have seen the rise of very cheap digital storage both on and off site. With the advent of the Internet of Things people's expectations are for greater interconnectivity and seamless interoperability. The potential impact these technologies have on healthcare is dramatic: from improved diagnoses through immediate access to a patient's electronic health record, to in silico modeling of organs and early stage drug trials, to predictive medicine based on top-down modeling of disease progression and treatment. We will begin by looking at the underlying technology, classify the various kinds of interoperability that exist in the field, and discuss how they are realized. We conclude with a discussion on future possibilities that big data and further standardizations will enable.

  18. The Next Generation of Interoperability Agents in Healthcare

    Directory of Open Access Journals (Sweden)

    Luciana Cardoso

    2014-05-01

    Full Text Available Interoperability in health information systems is increasingly a requirement rather than an option. Standards and technologies, such as multi-agent systems, have proven to be powerful tools in interoperability issues. In the last few years, the authors have worked on developing the Agency for Integration, Diffusion and Archive of Medical Information (AIDA), which is an intelligent, agent-based platform to ensure interoperability in healthcare units. It is increasingly important to ensure the high availability and reliability of systems. The functions provided by the systems that treat interoperability cannot fail. This paper shows the importance of monitoring and controlling intelligent agents as a tool to anticipate problems in health information systems. The interaction between humans and agents through an interface that allows the user to create new agents easily and to monitor their activities in real time is also an important feature, as health systems evolve by adopting more features and solving new problems. A module was installed in Centro Hospitalar do Porto, increasing the functionality and the overall usability of AIDA.

  19. Interoperable mesh and geometry tools for advanced petascale simulations

    International Nuclear Information System (INIS)

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M S; Tautges, T; Trease, H

    2007-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications
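The "data-structure neutral interface" idea, where services program against a common mesh abstraction rather than any concrete storage layout, can be sketched as follows. The interface and method names are illustrative, not the actual ITAPS API:

```python
from abc import ABC, abstractmethod

class Mesh(ABC):
    """Minimal data-structure-neutral mesh interface."""

    @abstractmethod
    def num_vertices(self) -> int: ...

    @abstractmethod
    def vertex_coords(self, v: int) -> tuple: ...

class TriangleMesh(Mesh):
    """One concrete backend; services never see this layout directly."""

    def __init__(self, coords):
        self._coords = list(coords)

    def num_vertices(self) -> int:
        return len(self._coords)

    def vertex_coords(self, v: int) -> tuple:
        return self._coords[v]

def bounding_box(mesh: Mesh) -> tuple:
    """A 'service' written once against the interface, reusable on any backend."""
    xs, ys = zip(*(mesh.vertex_coords(v) for v in range(mesh.num_vertices())))
    return (min(xs), min(ys)), (max(xs), max(ys))

m = TriangleMesh([(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)])
print(bounding_box(m))  # ((0.0, 0.0), (1.0, 2.0))
```

Because `bounding_box` depends only on the abstract interface, a different mesh library can be swapped in with minimal intrusion into the application code, which is the premise stated above.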

  20. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles.

    Science.gov (United States)

    Rodríguez-Molina, Jesús; Bilbao, Sonia; Martínez, Belén; Frasheri, Mirgita; Cürüklü, Baran

    2017-08-05

    Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, unreliable transport medium, data representation and hardware high heterogeneity). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed as mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer) where other technologies are also interweaved with middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and a deployment scenario, have been provided as a way to assess the quality of the system and its satisfactory performance.
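The topic-based Publish/Subscribe pattern at the core of DDS can be reduced to a minimal in-process sketch; real DDS adds discovery and QoS policies (reliability, durability, deadlines) on top of this idea, and the topic and field names below are illustrative:

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based publish/subscribe core."""

    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic: str, handler) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, sample: dict) -> None:
        # Publishers and subscribers are decoupled: they share only the topic.
        for handler in self._subs[topic]:
            handler(sample)

received = []
broker = Broker()
# A control entity subscribes to vehicle position samples.
broker.subscribe("auv/position", received.append)
broker.publish("auv/position", {"vehicle": "auv-1", "depth_m": 42.0})
broker.publish("auv/status", {"vehicle": "auv-1", "battery": 0.8})  # no subscriber
print(received)  # [{'vehicle': 'auv-1', 'depth_m': 42.0}]
```

The decoupling shown here is what makes the pattern attractive underwater: a vehicle can keep publishing while peers appear, disappear, or lag on an unreliable acoustic link.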

  1. An Optimized, Data Distribution Service-Based Solution for Reliable Data Exchange Among Autonomous Underwater Vehicles

    Directory of Open Access Journals (Sweden)

    Jesús Rodríguez-Molina

    2017-08-01

    Full Text Available Major challenges are presented when managing a large number of heterogeneous vehicles that have to communicate underwater in order to complete a global mission in a cooperative manner. In this kind of application domain, sending data through the environment presents issues that surpass the ones found in other overwater, distributed, cyber-physical systems (i.e., low bandwidth, unreliable transport medium, data representation and hardware high heterogeneity). This manuscript presents a Publish/Subscribe-based semantic middleware solution for unreliable scenarios and vehicle interoperability across cooperative and heterogeneous autonomous vehicles. The middleware relies on different iterations of the Data Distribution Service (DDS) software standard and their combined work between autonomous maritime vehicles and a control entity. It also uses several components with different functionalities deemed as mandatory for a semantic middleware architecture oriented to maritime operations (device and service registration, context awareness, access to the application layer) where other technologies are also interweaved with middleware (wireless communications, acoustic networks). Implementation details and test results, both in a laboratory and a deployment scenario, have been provided as a way to assess the quality of the system and its satisfactory performance.

  2. Interoperability of Standards for Robotics in CIME

    DEFF Research Database (Denmark)

    Kroszynski, Uri; Sørensen, Torben; Ludwig, Arnold

    1997-01-01

    Esprit Project 6457 "Interoperability of Standards for Robotics in CIME (InterRob)" belongs to the Subprogramme "Integration in Manufacturing" of Esprit, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. The first main goal of InterRob was to close the information chain between product design, simulation, programming, and robot control by developing standardized interfaces and their software implementation for the standards STEP (International Standard for the Exchange of Product model data, ISO 10303) and IRL (Industrial Robot Language, DIN 66312). This is a continuation of the previous Esprit projects CAD*I and NIRO, which developed substantial basics of STEP. The InterRob approach is based on standardized models for product geometry, kinematics, robotics, dynamics and control, hence on a coherent neutral information model...

  3. An Integrated Framework to Achieve Interoperability in Person-Centric Health Management

    Directory of Open Access Journals (Sweden)

    Fabio Vergari

    2011-01-01

    Full Text Available The need for high-quality out-of-hospital healthcare is a known socioeconomic problem. Exploiting ICT's evolution, ad-hoc telemedicine solutions have been proposed in the past. Integrating such ad-hoc solutions in order to cost-effectively support the entire healthcare cycle is still a research challenge. In order to handle the heterogeneity of relevant information and to overcome the fragmentation of out-of-hospital instrumentation in person-centric healthcare systems, a shared and open source interoperability component can be adopted, which is ontology driven and based on the semantic web data model. The feasibility and the advantages of the proposed approach are demonstrated by presenting the use case of real-time monitoring of patients' health and their environmental context.
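The semantic web data model underlying such an interoperability component represents every fact as a subject-predicate-object triple, so patient data and environmental context integrate through shared identifiers. A minimal in-memory sketch (all terms and identifiers are illustrative, not the component's actual ontology):

```python
# Minimal triple store: each fact is a (subject, predicate, object) tuple.
triples = {
    ("patient:42", "hasObservation", "obs:hr1"),
    ("obs:hr1", "type", "HeartRate"),
    ("obs:hr1", "value", "88"),
    ("obs:hr1", "measuredIn", "room:kitchen"),
    ("room:kitchen", "temperature", "21"),
}

def query(s=None, p=None, o=None):
    """Pattern match over the store; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Join patient vitals with environmental context through shared identifiers.
(obs,) = [o for _, _, o in query("patient:42", "hasObservation")]
(room,) = [o for _, _, o in query(obs, "measuredIn")]
print(query(room, "temperature"))  # [('room:kitchen', 'temperature', '21')]
```

Because new sources only need to emit triples using shared vocabulary terms, heterogeneous out-of-hospital instruments can be added without changing the consumers, which is the integration property the abstract claims.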

  4. Basic semantic architecture of interoperability for the intelligent distribution in the CFE electrical system; Arquitectura base de interoperabilidad semantica para el sistema electrico de distribucion inteligente en la CFE

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa Reza, Alfredo; Garcia Mendoza, Raul; Borja Diaz, Jesus Fidel; Sierra Rodriguez, Benjamin [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)

    2010-07-01

    The physical and logical architecture of the interoperability platform defined for the distribution management systems (DMS) of the Distribution Subdivision of the Comision Federal de Electricidad (CFE) in Mexico is presented. The adopted architecture includes the definition of a technological platform to manage the exchange of information between systems and applications, based on the Common Information Model (CIM) established in the IEC 61968 and IEC 61970 standards. The architecture, built on SSOA (Semantic Services Oriented Architecture), an EIB (Enterprise Integration Bus) and the GID (Generic Interface Definition), is presented, together with the sequence followed to achieve interoperability of the systems related to the management of electrical energy distribution in Mexico. The paper also describes the process of establishing a semantic model of the electrical distribution system (SED) and creating CIM/XML instances, oriented towards the interoperability of the information systems in the DMS scope through the exchange of messages structured and validated according to the rules established by the CIM. In this way, the messages and information exchanged among systems remain compatible and correctly interpreted independently of the developer, brand or manufacturer of the source and destination systems. The primary objective is to establish the semantic interoperability base infrastructure, founded on standards, that underpins the strategic definition of an intelligent electrical distribution system (SEDI) in Mexico.
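The CIM/XML exchange described above serializes power-system objects against a shared model so that any conforming receiver can interpret them. A minimal sketch using the Python standard library follows; the namespace URI, element names and mRID are illustrative placeholders, since real payloads follow the IEC 61968/61970 CIM RDF profiles:

```python
import xml.etree.ElementTree as ET

# Illustrative namespaces; real profiles are defined by IEC 61968/61970.
CIM = "http://iec.ch/TC57/CIM#"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

def make_breaker_message(mrid: str, name: str) -> str:
    """Serialize a minimal CIM/XML-style instance for one piece of equipment."""
    root = ET.Element(f"{{{RDF}}}RDF")
    breaker = ET.SubElement(root, f"{{{CIM}}}Breaker", {f"{{{RDF}}}ID": mrid})
    ET.SubElement(breaker, f"{{{CIM}}}IdentifiedObject.name").text = name
    return ET.tostring(root, encoding="unicode")

def equipment_names(xml_text: str) -> list:
    """A receiving system reads the message by model semantics, not by vendor."""
    root = ET.fromstring(xml_text)
    return [e.text for e in root.iter(f"{{{CIM}}}IdentifiedObject.name")]

msg = make_breaker_message("_BRK-001", "Feeder 12 breaker")
print(equipment_names(msg))  # ['Feeder 12 breaker']
```

Because both sides agree on the CIM classes and properties rather than on a vendor schema, source and destination systems can be developed independently, which is the interoperability guarantee the record describes.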

  5. A generic architecture for an adaptive, interoperable and intelligent type 2 diabetes mellitus care system.

    Science.gov (United States)

    Uribe, Gustavo A; Blobel, Bernd; López, Diego M; Schulz, Stefan

    2015-01-01

    Chronic diseases such as Type 2 Diabetes Mellitus (T2DM) constitute a big burden to the global health economy. T2DM Care Management requires a multi-disciplinary and multi-organizational approach. Because of different languages and terminologies, education, experiences, skills, etc., such an approach poses a special interoperability challenge. The solution is a flexible, scalable, business-controlled, adaptive, knowledge-based, intelligent system following a systems-oriented, architecture-centric, ontology-based and policy-driven approach. The architecture of real systems is described using the basics and principles of the Generic Component Model (GCM). For representing the functional aspects of a system, the Business Process Modeling Notation (BPMN) is used. The resulting system architecture is presented using GCM graphical notation, class diagrams and BPMN diagrams. The architecture-centric approach considers the compositional nature of the real-world system and its functionalities, guarantees coherence, and supports correct inferences. The level of generality provided in this paper facilitates use-case-specific adaptations of the system. In that way, intelligent, adaptive and interoperable T2DM care systems can be derived from the presented model, as demonstrated in another publication.

  6. An E-government Interoperability Platform Supporting Personal Data Protection Regulations

    OpenAIRE

    González, Laura; Echevarría, Andrés; Morales, Dahiana; Ruggia, Raúl

    2016-01-01

    Public agencies are increasingly required to collaborate with each other in order to provide high-quality e-government services. This collaboration is usually based on the service-oriented approach and supported by interoperability platforms. Such platforms are specialized middleware-based infrastructures enabling the provision, discovery and invocation of interoperable software services. In turn, given that personal data handled by governments are often very sensitive, most governments have ...

  7. Improving conditions for reuse of design solutions - by means of a context based solution library

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Grothe-Møller, Thorkild; Andreasen, Mogens Myrup

    1997-01-01

    Among the most important reasoning mechanisms in design is reasoning by analogy. One precondition for being able to reason about the properties and functionalities of a product or subsystem is that the context of the solution is known. This paper presents a computer based solution library where...

  8. Impact of coalition interoperability on PKI

    Science.gov (United States)

    Krall, Edward J.

    2003-07-01

    This paper examines methods for providing PKI interoperability among units of a coalition of armed forces drawn from different nations. The area in question is tactical identity management, for the purposes of confidentiality, integrity and non-repudiation in such a dynamic coalition. The interoperating applications under consideration range from email and other forms of store-and-forward messaging to TLS and IPSEC-protected real-time communications. Six interoperability architectures are examined with advantages and disadvantages of each described in the paper.

  9. Implementing Interoperability in the Seafood Industry: Learning from Experiences in Other Sectors.

    Science.gov (United States)

    Bhatt, Tejas; Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert

    2017-08-01

    Interoperability of communication and information technologies within and between businesses operating along supply chains is being pursued and implemented in numerous industries worldwide to increase the efficiency and effectiveness of operations. The desire for greater interoperability is also driven by the need to reduce business risk through more informed management decisions. Interoperability is achieved by the development of a technology architecture that guides the design and implementation of communication systems existing within individual businesses and between businesses comprising the supply chain. Technology architectures are developed through a purposeful dialogue about why the architecture is required, the benefits and opportunities that the architecture offers the industry, and how the architecture will translate into practical results. An assessment of how the finance, travel, and health industries and a sector of the food industry-fresh produce-have implemented interoperability was conducted to identify lessons learned that can aid the development of interoperability in the seafood industry. The findings include identification of the need for strong, effective governance during the establishment and operation of an interoperability initiative to ensure the existence of common protocols and standards. The resulting insights were distilled into a series of principles for enabling syntactic and semantic interoperability in any industry, which we summarize in this article. Categorized as "structural," "operational," and "integrative," the principles describe requirements and solutions that are pivotal to enabling businesses to create and capture value from full chain interoperability. The principles are also fundamental to allowing governments and advocacy groups to use traceability for public good. © 2017 Institute of Food Technologists®.

  10. A buffer overflow detection based on inequalities solution

    International Nuclear Information System (INIS)

    Xu Guoai; Zhang Miao; Yang Yixian

    2007-01-01

    A new buffer overflow detection model based on Inequalities Solution was designed. It builds on an analysis of the disadvantages of earlier buffer overflow detection techniques and converts buffer overflow detection into an inequalities-solving problem. The new model overcomes the disadvantages of the earlier techniques and improves the efficiency of buffer overflow detection. (authors)
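The reduction of overflow detection to inequalities can be illustrated with a toy check: for every access into a buffer of `size` elements, safety requires 0 <= index < size, and the access is flagged when the derived index range makes a violation satisfiable. This interval formulation is an illustration of the general idea, not the paper's model:

```python
def access_may_overflow(buf_size: int, index_lo: int, index_hi: int) -> bool:
    """An access whose index ranges over [index_lo, index_hi] may overflow a
    buffer of buf_size elements iff the inequality system
        index >= buf_size  or  index < 0
    is satisfiable within that interval."""
    return index_hi >= buf_size or index_lo < 0

# char buf[10]; for (i = 0; i <= 10; i++) buf[i] = 0;  ->  i ranges over [0, 10]
print(access_may_overflow(10, 0, 10))  # True: i == 10 violates i < 10
print(access_may_overflow(10, 0, 9))   # False: all accesses stay in bounds
```

A static analyzer would derive the index bounds from the program text (loop conditions, arithmetic on the index) and then discharge each access with a check of this shape.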

  11. Balancing of Heterogeneity and Interoperability in E-Business Networks: The Role of Standards and Protocols

    OpenAIRE

    Frank-Dieter Dorloff; Ejub Kajan

    2012-01-01

    To reach this interoperability, visibility and common understanding must be ensured on all levels of the interoperability pyramid. This includes common agreements about the visions, political and legal restrictions, clear descriptions of the collaboration scenarios, the business processes and rules involved, the types and roles of the documents, a commonly understandable vocabulary, etc. To do this in an effective and automatable manner, ICT-based concepts, frameworks and models have to be defined...

  12. Risk Management Considerations for Interoperable Acquisition

    National Research Council Canada - National Science Library

    Meyers, B. C

    2006-01-01

    .... The state of risk management practice -- the specification of standards and the methodologies to implement them -- is addressed and examined with respect to the needs of system-of-systems interoperability...

  13. Interoperability for Entreprise Systems and Applications '12

    CERN Document Server

    Doumeingts, Guy; Katzy, Bernhard; Chalmeta, Ricardo

    2012-01-01

    Within a scenario of globalised markets, where the capacity to efficiently cooperate with other firms starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, it can be seen how the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected organisations or extended enterprises, as well as in mergers and acquisitions. Composed of over 40 papers, Enterprise Interoperability V ranges from academic research through case studies to industrial and administrative experience of interoperability. The international nature of the authorship continues to broaden. Many of the papers have examples and illustrations calculated to deepen understanding and generate new ideas. The I-ESA'12 Co...

  14. Interoperability for Enterprise Systems and Applications

    CERN Document Server

    Jardim-Gonçalves, Ricardo; Popplewell, Keith; Mendonça, João

    2016-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VII will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Furthermore, it shows how knowledge of the meaning within information and the use to which it will be put have to be held in common between enterprises for consistent and efficient inter-enterprise networks. Over 30 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other organizations efficiently is essential in order to remain economically, socially and environmentally cost-effective, the most innovative digitized and networked enterprises ensure that their systems and applications are able to interoperate across heterogeneous collabo...

  15. Epimenides: Interoperability Reasoning for Digital Preservation

    NARCIS (Netherlands)

    Kargakis, Yannis; Tzitzikas, Yannis; van Horik, M.P.M.

    2014-01-01

    This paper presents Epimenides, a system that implements a novel interoperability dependency reasoning approach for assisting digital preservation activities. A distinctive feature is that it can also model converters and emulators, and the adopted modelling approach enables the automatic reasoning

  16. Requirements for Interoperability in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Rita Noumeir

    2012-01-01

    Full Text Available Interoperability is a requirement for the successful deployment of Electronic Health Records (EHR). EHR improves the quality of healthcare by enabling access to all relevant information at the diagnostic decision moment, regardless of location. It is a system that results from the cooperation of several heterogeneous distributed subsystems that need to successfully exchange information relative to a specific healthcare process. This paper analyzes interoperability impediments in healthcare by first defining them and providing concrete healthcare examples, followed by discussion of how specifications can be defined and how verification can be conducted to eliminate those impediments and ensure interoperability in healthcare. This paper also analyzes how Integrating the Healthcare Enterprise (IHE) has been successful in enabling interoperability, and identifies some neglected aspects that need attention.

  17. Investigation of Automated Terminal Interoperability Test

    OpenAIRE

    Brammer, Niklas

    2008-01-01

    In order to develop and secure the functionality of its cellular communications systems, Ericsson deals with numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Therefore Ericsson co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability developmental testing (IoDT) laboratories in Linköping to test their developmental products and prototypes in o...

  18. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  19. Smart hospitality—Interconnectivity and interoperability towards an ecosystem

    OpenAIRE

    Buhalis, Dimitrios; Leung, Rosanna

    2018-01-01

    The Internet and cloud computing changed the way business operate. Standardised web-based applications simplify data interchange which allow internal applications and business partners systems to become interconnected and interoperable. This study conceptualises the smart and agile hospitality enterprises of the future, and proposes a smart hospitality ecosystem that adds value to all stakeholders. Internal data from applications among all stakeholders, consolidated with external environment ...

  20. Improving Patient Safety with X-Ray and Anesthesia Machine Ventilator Synchronization: A Medical Device Interoperability Case Study

    Science.gov (United States)

    Arney, David; Goldman, Julian M.; Whitehead, Susan F.; Lee, Insup

    When an x-ray image is needed during surgery, clinicians may stop the anesthesia machine ventilator while the exposure is made. If the ventilator is not restarted promptly, the patient may experience severe complications. This paper explores the interconnection of a ventilator and a simulated x-ray machine in a prototype plug-and-play medical device system. This work supports ongoing standards efforts on interoperability frameworks by helping to develop functional and non-functional requirements, and illustrates the potential patient safety benefits of interoperable medical device systems by implementing a solution to a clinical use case requiring interoperability.
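
    The pause-and-restart safety logic described in this record can be sketched roughly as follows. This is a hypothetical illustration, not the paper's actual implementation: the class name, method names, and the timeout value are all invented, and a real medical-device interlock would be governed by the relevant safety standards.

    ```python
    class VentilatorInterlock:
        """Toy sketch of a pause/resume interlock: the ventilator may be
        paused for an x-ray exposure, but a watchdog auto-resumes it if
        the clinician does not restart it within a safety window."""

        def __init__(self, max_pause_s=10.0):
            self.max_pause_s = max_pause_s  # illustrative timeout, not a standard value
            self.running = True
            self.paused_at = None

        def pause_for_exposure(self, now):
            # Called when the x-ray machine requests a motion-free exposure.
            if self.running:
                self.running = False
                self.paused_at = now

        def resume(self):
            self.running = True
            self.paused_at = None

        def tick(self, now):
            # Watchdog: auto-resume if the pause outlives the safety window.
            if not self.running and now - self.paused_at > self.max_pause_s:
                self.resume()
            return self.running
    ```

    The point of the sketch is the watchdog: interoperability lets the x-ray request the pause and the system, rather than human memory, guarantee the restart.
    
    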

  1. Grid interoperability: joining grid information systems

    International Nuclear Information System (INIS)

    Flechl, M; Field, L

    2008-01-01

    A grid is defined as 'coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations'. Over recent years a number of grid projects, many of which have a strong regional presence, have emerged to help coordinate institutions and enable grids. Today, we face a situation where a number of grid projects exist, most of which use slightly different middleware. Grid interoperation tries to bridge these differences and enable Virtual Organizations to access resources at the institutions independent of their grid project affiliation. Grid interoperation is usually a bilateral activity between two grid infrastructures. Recently, within the Open Grid Forum, the Grid Interoperability Now (GIN) Community Group has been building upon these bilateral activities. The GIN group is a focal point where all the infrastructures can come together to share ideas and experiences on grid interoperation. It is hoped that each bilateral activity will bring us one step closer to the overall goal of a uniform grid landscape. A fundamental aspect of a grid is the information system, which is used to find available grid services. As different grids use different information systems, interoperation between these systems is crucial for grid interoperability. This paper describes the work carried out to overcome these differences between a number of grid projects and the experiences gained. It focuses on the different techniques used and highlights the important areas for future standardization.
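
    The core of the information-system interoperation the record describes is schema translation: records published in each grid's native vocabulary are mapped onto a shared vocabulary so one query can span both infrastructures. A minimal sketch, with field names and grid identifiers invented purely for illustration:

    ```python
    # Map each grid's native attribute names onto a common vocabulary.
    FIELD_MAP = {
        "grid_a": {"SiteName": "site", "CEHost": "endpoint", "FreeCPUs": "free_slots"},
        "grid_b": {"name": "site", "service_url": "endpoint", "idle": "free_slots"},
    }

    def to_common(record, source):
        """Translate one published record into the shared vocabulary."""
        mapping = FIELD_MAP[source]
        return {common: record[native]
                for native, common in mapping.items() if native in record}

    def find_free_sites(records_by_source, min_slots=1):
        """One query over records gathered from several information systems."""
        unified = [to_common(r, src)
                   for src, records in records_by_source.items() for r in records]
        return sorted(r["site"] for r in unified
                      if r.get("free_slots", 0) >= min_slots)
    ```

    Real grid information systems (e.g. those based on the GLUE schema) are far richer, but the translate-then-query shape is the same.
    
    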

  2. ICD-11 (JLMMS) and SCT Inter-Operation.

    Science.gov (United States)

    Mamou, Marzouk; Rector, Alan; Schulz, Stefan; Campbell, James; Solbrig, Harold; Rodrigues, Jean-Marie

    2016-01-01

    The goal of this work is to contribute to a smooth and semantically sound inter-operability between the ICD-11 (International Classification of Diseases-11th revision Joint Linearization for Mortality, Morbidity and Statistics) and SNOMED CT (SCT). To guarantee such inter-operation between a classification, characterized by a single hierarchy of mutually exclusive and exhaustive classes, as is the JLMMS successor of ICD-10 on the one hand, and the multi-hierarchical, ontology-based clinical terminology SCT on the other hand, we use ontology axioms that logically express generalizable truths. This is expressed by the compositional grammar of SCT, together with queries on axioms of SCT. We test the feasibility of the method on the circulatory chapter of ICD-11 JLMMS and present limitations and results.
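
    The mutual-exclusivity requirement mentioned in this record can be illustrated with a toy axiom query. The tiny is-a hierarchy below stands in for SCT axioms and is entirely invented; the check asks whether any concept is subsumed by two candidate classification classes at once, which would violate the exclusivity a statistical classification requires.

    ```python
    # Toy is-a hierarchy (child -> parents), standing in for SCT axioms.
    IS_A = {
        "acute_mi": ["mi", "acute_heart_disease"],
        "mi": ["ischaemic_heart_disease"],
        "acute_heart_disease": ["heart_disease"],
        "ischaemic_heart_disease": ["heart_disease"],
        "pneumonia": ["lung_disease"],
    }

    def ancestors(concept):
        """All concepts that subsume the given concept (transitive is-a)."""
        seen, stack = set(), [concept]
        while stack:
            for parent in IS_A.get(stack.pop(), []):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

    def mutually_exclusive(class_a, class_b, concepts):
        """True if no concept falls under both candidate classes."""
        return not any(
            {class_a, class_b} <= ancestors(c) | {c} for c in concepts
        )
    ```

    In a real setting the subsumption test would be answered by a description-logic reasoner over SCT's axioms rather than a hand-built dictionary.
    
    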

  3. Interoperable Data Sharing for Diverse Scientific Disciplines

    Science.gov (United States)

    Hughes, John S.; Crichton, Daniel; Martinez, Santa; Law, Emily; Hardman, Sean

    2016-04-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework using ontologies and ISO level archive and metadata registry reference models. This framework provides multi-level governance, evolves independent of implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation framework is populated through knowledge acquisition from discipline experts. It is also extended to meet specific discipline requirements. The result is a formalized and rigorous knowledge base that addresses data representation, integrity, provenance, context, quantity, and their relationships within the community. The contents of the knowledge base are translated and written to files in appropriate formats to configure system software and services, provide user documentation, validate ingested data, and support data analytics. This presentation will provide an overview of the framework, present the Planetary Data System's PDS4 as a use case that has been adopted by the international planetary science community, describe how the framework is being applied to other disciplines, and share some important lessons learned.

  4. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Directory of Open Access Journals (Sweden)

    Miguel A. Ferrer

    2012-02-01

    Full Text Available Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
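
    The image-level smoothing the record proposes for reducing inter-device variability can be sketched with a plain mean filter: averaging over a small neighbourhood suppresses sensor-specific high-frequency detail so that features extracted from images captured by different devices become more comparable. The kernel size and the toy "image" below are illustrative assumptions, not the authors' actual method.

    ```python
    def mean_filter(image, k=3):
        """Smooth a 2-D grayscale image (list of lists) with a k x k mean kernel,
        clipping the window at the image borders."""
        h, w, r = len(image), len(image[0]), k // 2
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                window = [image[yy][xx]
                          for yy in range(max(0, y - r), min(h, y + r + 1))
                          for xx in range(max(0, x - r), min(w, x + r + 1))]
                out[y][x] = sum(window) / len(window)
        return out
    ```

    In practice one would apply such smoothing (or its feature-level analogue) before extracting hand-shape or palm-print features from each device's images.
    
    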

  5. On the feasibility of interoperable schemes in hand biometrics.

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.

  6. On the Feasibility of Interoperable Schemes in Hand Biometrics

    Science.gov (United States)

    Morales, Aythami; González, Ester; Ferrer, Miguel A.

    2012-01-01

    Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors. PMID:22438714

  7. ISAIA: Interoperable Systems for Archival Information Access

    Science.gov (United States)

    Hanisch, Robert J.

    2002-01-01

    The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more focused communities within space science could use the same technologies and standards to provide more specialized services.
A major challenge to searching...

  8. Semantic modeling and interoperability in product and process engineering a technology for engineering informatics

    CERN Document Server

    2013-01-01

    In the past decade, feature-based design and manufacturing has gained momentum in various engineering domains as a way to represent and reuse semantic patterns with effective applicability. However, the actual scope of feature application is still very limited. Semantic Modeling and Interoperability in Product and Process Engineering provides a systematic solution for the challenging engineering informatics field, aiming at the enhancement of sustainable knowledge representation, implementation and reuse at an open and yet practically manageable scale. This semantic modeling technology supports uniform, multi-facet and multi-level collaborative system engineering with heterogeneous computer-aided tools, such as CAD/CAM, CAE, and ERP. The presented unified feature model can be applied to product and process representation, development, implementation and management. Practical case studies and test samples are provided to illustrate applications which can be implemented by the readers in real-world scenarios. ...

  9. Solution-Processed Smart Window Platforms Based on Plasmonic Electrochromics

    KAUST Repository

    Abbas, Sara

    2018-01-01

    blocking. Despite this edge, this technology can benefit from important developments, including low-cost solution-based manufacturing on flexible substrates while maintaining durability and coloration efficiency, demonstration of independent control

  10. A variational solution of transport equation based on spherical geometry

    International Nuclear Information System (INIS)

    Liu Hui; Zhang Ben'ai

    2002-01-01

    A variational method using differential forms gives better precision for the numerical solution of the transport criticality problem in spherical geometry, and its computation is simpler than other approximate methods.

  11. Market based solutions for power pricing

    International Nuclear Information System (INIS)

    Wangensteen, Ivar

    2002-06-01

    The report examines how the prices for effect reserves, spot market power and regulated power are formed, provided ideal market conditions rule. It first examines the price-determining factors in a market for power reserves and the connection between this market and the energy market (the spot market). In a free market there would be a balance between what the actors may obtain by operating in the open market for power reserves/regulated power on the one hand and the market for spot power on the other. Initially the desired amount of power reserve is assumed to be known. The problem is then extended to cover the size of the effect reserves, i.e. optimising the requirement for power reserves. The optimal amount of power reserves is obtained when there is a balance between cost and benefit, which is achieved when the expected macroeconomic loss due to outages balances against the cost of maintaining larger reserves. Using a simple model it is demonstrated that if the system operator caps the price in the regulating market at the rationing price, the actors will offer sufficient reserves even if the reserve price is zero (provided risk neutrality). If the maximum price for regulated power is lower, the price of effect reserves will rise. Based on the same simple model, calculations are made of the short- and long-term market balance for increasing demand.

  12. Enterprise interoperability VI : Interoperability for Agility, Resilience and Plasticity of Collaboration

    CERN Document Server

    Bénaben, Frédérick; Poler, Raúl; Bourrières, Jean-Paul

    2014-01-01

    A concise reference to the state of the art in systems interoperability, Enterprise Interoperability VI will be of great value to engineers and computer scientists working in manufacturing and other process industries and to software engineers and electronic and manufacturing engineers working in the academic environment. Over 40 papers, ranging from academic research through case studies to industrial and administrative experience of interoperability show how, in a scenario of globalised markets, where the capacity to cooperate with other firms efficiently starts to become essential in order to remain in the market in an economically, socially and environmentally cost-effective manner, the most innovative enterprises are beginning to redesign their business model to become interoperable. This goal of interoperability is essential, not only from the perspective of the individual enterprise but also in the new business structures that are now emerging, such as supply chains, virtual enterprises, interconnected...

  13. Interoperability of Heliophysics Virtual Observatories

    Science.gov (United States)

    Thieman, J.; Roberts, A.; King, T.; King, J.; Harvey, C.

    2008-01-01

    If you'd like to find interrelated heliophysics (also known as space and solar physics) data for a research project that spans, for example, magnetic field data and charged particle data from multiple satellites located near a given place and at approximately the same time, how easy is this to do? There are probably hundreds of data sets scattered in archives around the world that might be relevant. Is there an optimal way to search these archives and find what you want? There are a number of virtual observatories (VOs) now in existence that maintain knowledge of the data available in subdisciplines of heliophysics. The data may be widely scattered among various data centers, but the VOs have knowledge of what is available and how to get to it. The problem is that research projects might require data from a number of subdisciplines. Is there a way to search multiple VOs at once and obtain what is needed quickly? To do this requires a common way of describing the data such that a search using a common term will find all data that relate to the common term. This common language is contained within a data model developed for all of heliophysics and known as the SPASE (Space Physics Archive Search and Extract) Data Model. NASA has funded the main part of the development of SPASE but other groups have put resources into it as well. How well is this working? We will review the use of SPASE and how well the goal of locating and retrieving data within the heliophysics community is being achieved. Can the VOs truly be made interoperable despite being developed by so many diverse groups?
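
    The common-language search the record describes can be sketched as a term-mapping fan-out: a shared data-model term (in the spirit of SPASE parameter names) is mapped to each virtual observatory's native keyword before the query goes out. The VO names, keywords and holdings below are invented for illustration and do not reflect the actual SPASE registries.

    ```python
    # One shared term, mapped to each VO's native vocabulary.
    TERM_MAP = {
        "MagneticField": {"vo_helio": "B-field", "vo_mag": "mag_vec"},
    }

    # Each VO's (invented) holdings, described in its own keywords.
    HOLDINGS = {
        "vo_helio": [{"dataset": "ACE_MFI", "keyword": "B-field"}],
        "vo_mag": [{"dataset": "Cluster_FGM", "keyword": "mag_vec"},
                   {"dataset": "Cluster_CIS", "keyword": "ion_flux"}],
    }

    def federated_search(term):
        """Fan a single common-term query out to every VO that maps the term."""
        hits = []
        for vo, native in TERM_MAP.get(term, {}).items():
            hits += [d["dataset"] for d in HOLDINGS[vo] if d["keyword"] == native]
        return sorted(hits)
    ```

    The SPASE Data Model plays the role of `TERM_MAP` here: one agreed vocabulary so a single search term reaches every participating VO.
    
    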

  14. Interoperable Cloud Networking for intelligent power supply; Interoperables Cloud Networking fuer intelligente Energieversorgung

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Dave [Invensys Operations Management, Foxboro, MA (United States)

    2010-09-15

    Intelligent power supply by a so-called Smart Grid will make it possible to control consumption by market-based pricing and signals for load reduction. This requires that both energy prices and energy information are distributed reliably and in real time to automation systems in domestic and other buildings and in industrial plants, over a wide geographic range and across the most varied grid infrastructures. Effective communication at this level of complexity requires computing and network resources that are normally only available in the computer centers of large industrial companies. Cloud computing technology, which is described here in some detail, has all the features needed to provide reliability, interoperability and efficiency for large-scale smart grid applications, at lower cost than traditional computer centers. (orig.)

  15. Forcing Interoperability: An Intentionally Fractured Approach

    Science.gov (United States)

    Gallaher, D. W.; Brodzik, M.; Scambos, T.; Stroeve, J.

    2008-12-01

    The NSIDC is attempting to rebuild a significant portion of its public-facing cyberinfrastructure to better meet the needs expressed by the cryospheric community. The project initially addresses a specific science need - understanding Greenland's contribution to global sea level rise through comparison and analysis of variables such as temperature, albedo, melt, ice velocity and surface elevation. This project will ultimately be expanded to cover most of NSIDC's cryospheric data. Like many organizations, we need to provide users with data discovery interfaces, collaboration tools and mapping services. Complicating this effort is the need to reduce the volume of raw data delivered to the user. Data growth, especially with time-series data, will overwhelm our software, processors and network like never before. We need to provide users the ability to perform first-level analysis directly on our site. In order to accomplish this, the users should be free to modify the behavior of these tools as well as incorporate their own tools and analysis to meet their needs. Rather than building one monolithic system, we have chosen to build three semi-independent systems. One team is building a data discovery and web-based distribution system, the second is building an advanced analysis and workflow system and the third is building a customized web mapping service. These systems will use the same underlying data structures and services but will employ different technologies and teams, with their own objectives, schedules and user interfaces. Obviously, we are adding complexity and risk to the overall project; however, this may be the best method to achieve interoperability, because the development teams will be required to build off each other's work. The teams will be forced to design with other users in mind, as opposed to treating interoperability as an afterthought, which is a tendency in monolithic systems.
All three teams will take advantage of preexisting

  16. On the formal definition of the systems' interoperability capability: an anthropomorphic approach

    Science.gov (United States)

    Zdravković, Milan; Luis-Ferreira, Fernando; Jardim-Goncalves, Ricardo; Trajanović, Miroslav

    2017-03-01

    The extended view of enterprise information systems in the Internet of Things (IoT) introduces additional complexity to the interoperability problems. In response to this, the problem of systems' interoperability is revisited by taking into account the different aspects of philosophy, psychology, linguistics and artificial intelligence, namely by analysing the potential analogies between the processes of human and system communication. Then the capability to interoperate, as a property of the system, is defined as a complex ability to seamlessly sense and perceive a stimulus from its environment (assumingly, a message from any other system), make an informed decision about this perception and consequently articulate a meaningful and useful action or response, based on this decision. Although this capability is defined on the basis of the existing interoperability theories, the proposed approach to its definition excludes the assumption on the awareness of co-existence of two interoperating systems. Thus, it establishes the links between the research of interoperability of systems and intelligent software agents, as one of the systems' digital identities.
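
    The sense-perceive-decide-act capability the record defines can be illustrated as a toy pipeline. Everything here is a hypothetical sketch: the message format, the rules, and the function names are assumptions invented to make the four stages concrete.

    ```python
    def sense(raw):
        # Receive a raw stimulus from the environment (here: strip transport noise).
        return raw.strip()

    def perceive(stimulus):
        # Turn the stimulus into a structured percept.
        verb, _, arg = stimulus.partition(" ")
        return {"verb": verb.lower(), "arg": arg}

    def decide(percept):
        # Make an informed decision about the percept.
        return "report" if percept["verb"] == "status" else "ignore"

    def act(decision, percept):
        # Articulate a meaningful response, or none at all.
        if decision == "report":
            return f"STATUS({percept['arg']}): OK"
        return None

    def interoperate(raw):
        """Run one stimulus through the whole sense -> perceive -> decide -> act chain."""
        percept = perceive(sense(raw))
        return act(decide(percept), percept)
    ```

    Note that, as in the paper's definition, nothing in the pipeline assumes the sender is aware of the receiver: the system simply responds to whatever stimulus arrives.
    
    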

  17. Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.

    Science.gov (United States)

    Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher

    2017-08-01

    Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed by engaging industry stakeholders in understanding why an architecture is required, the benefits provided to the industry and to individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability, and the reason for basing the architecture on a peer-to-peer networked database concept rather than more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry and that the proposed architecture uses is discussed, and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path. © 2017 Institute of Food Technologists®.
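
    The peer-to-peer verification idea behind the record can be sketched with content hashing: each business holds its own traceability events, and a partner verifies a chain of custody by recomputing hashes rather than trusting a central database. The record fields and the hashing scheme below are illustrative assumptions, not the article's specified architecture.

    ```python
    import hashlib
    import json

    def event_hash(event, prev_hash=""):
        """Hash one traceability event, chained to the previous event's hash."""
        payload = json.dumps(event, sort_keys=True) + prev_hash
        return hashlib.sha256(payload.encode()).hexdigest()

    def build_chain(events):
        """Compute the chained hash for each event in custody order."""
        h, hashes = "", []
        for event in events:
            h = event_hash(event, h)
            hashes.append(h)
        return hashes

    def verify_chain(events, hashes):
        """A peer recomputes the chain; any tampered event breaks every later hash."""
        return build_chain(events) == hashes
    ```

    Because each link depends on the previous one, a partner anywhere in the chain can detect a modified upstream record without any central authority.
    
    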

  18. A step-by-step methodology for enterprise interoperability projects

    Science.gov (United States)

    Chalmeta, Ricardo; Pazos, Verónica

    2015-05-01

    Enterprise interoperability is one of the key factors for enhancing enterprise competitiveness. Achieving enterprise interoperability is an extremely complex process which involves different technological, human and organisational elements. In this paper we present a framework to help enterprise interoperability. The framework has been developed taking into account the three domains of interoperability: Enterprise Modelling, Architecture and Platform and Ontologies. The main novelty of the framework in comparison to existing ones is that it includes a step-by-step methodology that explains how to carry out an enterprise interoperability project taking into account different interoperability views, like business, process, human resources, technology, knowledge and semantics.

  19. Photochemical properties of Yt base in aqueous solution

    International Nuclear Information System (INIS)

    Paszyc, S.; Rafalska, M.

    1979-01-01

    Photoreactivity of Yt base (I) has been studied in aqueous solution (pH 6) saturated with oxygen. Two photoproducts (II, III), resulting from irradiation at λ = 253.7 nm and λ ≥ 290 nm, were isolated and their structures determined. The quantum yield for Yt base disappearance (ρdis) is 0.002 (λ = 313 nm). It was shown that dye-sensitised photo-oxidation of Yt base in aqueous solution occurs according to a Type I mechanism as well as with the participation of singlet-state oxygen. Quantum yields, fluorescence decay times and phosphorescence of Yt base have also been determined. (author)

  20. An IMS-Based Middleware Solution for Energy-Efficient and Cost-Effective Mobile Multimedia Services

    Science.gov (United States)

    Bellavista, Paolo; Corradi, Antonio; Foschini, Luca

    Mobile multimedia services have recently become of extreme industrial relevance due to the advances in both wireless client devices and multimedia communications. That has motivated important standardization efforts, such as the IP Multimedia Subsystem (IMS) to support session control, mobility, and interoperability in all-IP next generation networks. Notwithstanding the central role of IMS in novel mobile multimedia, the potential of IMS-based service composition for the development of new classes of ready-to-use, energy-efficient, and cost-effective services is still widely unexplored. The paper proposes an original solution for the dynamic and standard-compliant redirection of incoming voice calls towards WiFi-equipped smart phones. The primary design guideline is to reduce energy consumption and service costs for the final user by automatically switching from the 3G to the WiFi infrastructure whenever possible. The proposal is fully compliant with the IMS standard and exploits the recently released IMS presence service to update device location and current communication opportunities. The reported experimental results point out that our solution, in a simple way and with full compliance with state-of-the-art industrially-accepted standards, can significantly increase battery lifetime without negative effects on call initiation delay.
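
    The redirection policy at the heart of this record can be sketched as a simple routing decision over presence information: when the presence service reports that the callee's device currently has WiFi connectivity, the incoming call is routed to the WiFi contact address instead of the cellular one. The presence fields and SIP/tel addresses below are invented for illustration; the actual IMS presence documents are richer XML structures.

    ```python
    def route_incoming_call(presence):
        """Pick a contact address from a (dict-sketch) presence document.

        Prefers the WiFi contact when the presence service says WiFi is
        available, falling back to the cellular contact otherwise - the
        energy/cost trade-off the paper exploits.
        """
        if presence.get("wifi_available") and presence.get("wifi_contact"):
            return presence["wifi_contact"]
        return presence["cell_contact"]
    ```

    In the IMS setting, the presence document would be refreshed as the device gains or loses WiFi coverage, so the routing decision tracks the current cheapest interface.
    
    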

  1. Redox flow batteries based on supporting solutions containing chloride

    Science.gov (United States)

    Li, Liyu; Kim, Soowhan; Yang, Zhenguo; Wang, Wei; Zhang, Jianlu; Chen, Baowei; Nie, Zimin; Xia, Guanguang

    2014-01-14

    Redox flow battery systems having a supporting solution that contains Cl⁻ ions can exhibit improved performance and characteristics. Furthermore, a supporting solution having mixed SO₄²⁻ and Cl⁻ ions can provide increased energy density and improved stability and solubility of one or more of the ionic species in the catholyte and/or anolyte. According to one example, a vanadium-based redox flow battery system is characterized by an anolyte having V²⁺ and V³⁺ in a supporting solution and a catholyte having V⁴⁺ and V⁵⁺ in a supporting solution. The supporting solution can contain Cl⁻ ions or a mixture of SO₄²⁻ and Cl⁻ ions.

  2. Redox flow batteries based on supporting solutions containing chloride

    Energy Technology Data Exchange (ETDEWEB)

    Li, Liyu; Kim, Soowhan; Yang, Zhenguo; Wang, Wei; Nie, Zimin; Chen, Baowei; Zhang, Jianlu; Xia, Guanguang

    2017-11-14

    Redox flow battery systems having a supporting solution that contains Cl⁻ ions can exhibit improved performance and characteristics. Furthermore, a supporting solution having mixed SO₄²⁻ and Cl⁻ ions can provide increased energy density and improved stability and solubility of one or more of the ionic species in the catholyte and/or anolyte. According to one example, a vanadium-based redox flow battery system is characterized by an anolyte having V²⁺ and V³⁺ in a supporting solution and a catholyte having V⁴⁺ and V⁵⁺ in a supporting solution. The supporting solution can contain Cl⁻ ions or a mixture of SO₄²⁻ and Cl⁻ ions.

  3. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited by many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  4. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    Science.gov (United States)

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipka

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data for allowing the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform accessing semantically equivalent data elements across 11 European participating EHR systems from 5 countries demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of developing cross border semantic integration of data. PMID:27570649

  5. Latest developments for the IAGOS database: Interoperability and metadata

    Science.gov (United States)

    Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Schultz, Martin; van Velthoven, Peter; Broetz, Bjoern; Rauthe-Schöch, Armin; Brissebrat, Guillaume

    2014-05-01

    In-service Aircraft for a Global Observing System (IAGOS, http://www.iagos.org) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft, and the IAGOS database is an essential part of the global atmospheric monitoring network. Data access is governed by an open access policy based on the submission of research requests, which are reviewed by the PIs. Users can access the data through the following web sites: http://www.iagos.fr or http://www.pole-ether.fr, as the IAGOS database is part of the French atmospheric chemistry data centre ETHER (CNES and CNRS). The database is under continuous development and improvement. In the framework of the IGAS project (IAGOS for GMES/COPERNICUS Atmospheric Service), major achievements will be reached, such as metadata and format standardisation in order to interoperate with international portals and other databases, QA/QC procedures and traceability, CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data integration within the central database, and real-time data transmission. IGAS work package 2 aims at providing the IAGOS data to users in a standardised format, including the necessary metadata and information on data processing, data quality and uncertainties. We are currently redefining and standardising the IAGOS metadata for interoperable use within GMES/Copernicus. The metadata are compliant with the ISO 19115, INSPIRE and NetCDF-CF conventions. IAGOS data will be provided to users in NetCDF or NASA Ames format. We are also implementing interoperability between all the involved IAGOS data services, including the central IAGOS database, the former MOZAIC and CARIBIC databases, the Aircraft Research DLR database and the Jülich WCS web application JOIN (Jülich OWS Interface), which combines model outputs with in situ data for

  6. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current state IS models. Building an interoperable IS, i. e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the healthcare enterprise (IHE) is an initiative which targets interoperability by using established standards. This work aims to link IHE concepts to 3LGM² concepts within the 3LGM² tool, to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models, and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE-master-model, i. e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE-master-model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE-master-model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE-master-model. Thus information managers and IHE

  7. Benefit quantification of interoperability in coordinate metrology

    DEFF Research Database (Denmark)

    Savio, E.; Carmignato, S.; De Chiffre, Leonardo

    2014-01-01

    One of the factors contributing to limited reproducibility of coordinate measurements is the use of different inspection software. Time-consuming efforts for translation of part programmes are sometimes needed, and interoperability of inspection equipment has the potential to reduce these inefficiencies. The paper presents a methodology for an economic evaluation of interoperability benefits with respect to the verification of geometrical product specifications. It requires input data from testing and inspection activities, as well as information on training of personnel and licensing of software.

  8. Toward semantic interoperability with linked foundational ontologies in ROMULUS

    CSIR Research Space (South Africa)

    Khan, ZC

    2013-06-01

    One purpose of a foundational ontology is to solve interoperability issues among ontologies. Many foundational ontologies have been developed, reintroducing the ontology interoperability problem. We address this with the new online foundational...

  9. Interoperability after deployment: persistent challenges and regional strategies in Denmark.

    Science.gov (United States)

    Kierkegaard, Patrick

    2015-04-01

    The European Union has identified Denmark as one of the countries that have the potential to provide leadership and inspiration for other countries in eHealth implementation and adoption. However, Denmark has historically struggled to facilitate data exchange between its public hospitals' electronic health records (EHRs). Furthermore, state-led projects failed to adequately address the challenges of interoperability after deployment. Changes in the organizational setup and division of responsibilities concerning the future of eHealth implementations in hospitals took place, which granted the Danish regions the full responsibility for all hospital systems, specifically the consolidation of EHRs to one system per region. The regions reduced the number of different EHRs to six systems by 2014. Additionally, the first version of the National Health Record was launched to provide health care practitioners with an overview of a patient's data stored in all EHRs across the regions and within the various health sectors. The governance of national eHealth implementation plays a crucial role in the development and diffusion of interoperable technologies. Changes in the organizational setup and redistribution of responsibilities between the Danish regions and the state play a pivotal role in producing viable and coherent solutions in a timely manner. Interoperability initiatives are best managed on a regional level or by the authorities responsible for the provision of local health care services. Cross-regional communication is essential during the initial phases of planning in order to set a common goal for countrywide harmonization, coherence and collaboration. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  10. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, quality-controlled processed data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and management of errors. Testing these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. Following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses.
This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC Testing web site and
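    The CTL suites themselves are elaborate, but the flavour of the assertions described above can be sketched in a few lines of plain Python. This is an illustration only, not TEAM Engine or CTL: the capabilities document is a canned, hypothetical WFS response, and the checks mimic the kind of schema-style assertions a test engine automates.

```python
# Illustrative only: schema-style assertions of the kind a test engine such
# as TEAM Engine automates, applied to a canned (hypothetical) WFS
# GetCapabilities response. The document content here is an assumption.
import xml.etree.ElementTree as ET

CAPS = """<?xml version="1.0"?>
<WFS_Capabilities xmlns="http://www.opengis.net/wfs" version="1.0.0">
  <Service><Name>WFS</Name><Title>Demo server</Title></Service>
  <FeatureTypeList>
    <FeatureType><Name>rivers</Name></FeatureType>
  </FeatureTypeList>
</WFS_Capabilities>"""

NS = {"wfs": "http://www.opengis.net/wfs"}

def check_capabilities(xml_text):
    """Return a list of (assertion, passed) pairs."""
    root = ET.fromstring(xml_text)
    results = []
    results.append(("root is WFS_Capabilities",
                    root.tag.endswith("WFS_Capabilities")))
    results.append(("version attribute is 1.0.0",
                    root.get("version") == "1.0.0"))
    results.append(("service name is WFS",
                    root.findtext("wfs:Service/wfs:Name", namespaces=NS) == "WFS"))
    results.append(("at least one feature type advertised",
                    len(root.findall(".//wfs:FeatureType", NS)) >= 1))
    return results

for name, ok in check_capabilities(CAPS):
    print(f"{'PASS' if ok else 'FAIL'}: {name}")
```

A real suite adds many more checks (HTTP status codes, Schematron rules, error responses), but each reduces to the same pattern: an assertion evaluated against a captured response.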

  11. Connectivity, interoperability and manageability challenges in internet of things

    Science.gov (United States)

    Haseeb, Shariq; Hashim, Aisha Hassan A.; Khalifa, Othman O.; Ismail, Ahmad Faris

    2017-09-01

    The vision of the Internet of Things (IoT) is interconnectivity between sensors, actuators, people and processes. IoT exploits connectivity between physical objects such as fridges, cars, utilities, buildings and cities to enhance people's lives through automation and data analytics. However, this sudden increase in connected heterogeneous IoT devices takes a huge toll on the existing Internet infrastructure and introduces new challenges for researchers. This paper discusses the effects of heterogeneity on connectivity, interoperability and manageability in greater detail. It also surveys some of the existing solutions adopted in the core network to address the challenges of massive IoT deployment. The paper concludes that the IoT architecture and network infrastructure need to be re-engineered from the ground up, so that IoT solutions can be deployed safely and efficiently.

  12. Equipping the Enterprise Interoperability Problem Solver

    NARCIS (Netherlands)

    Oude Luttighuis, Paul; Folmer, Erwin Johan Albert; Charalabidis, Yannis

    2010-01-01

    The maturity of the enterprise interoperability field does not match the importance attached to it by many, in both the public and the private community. A host of models, paradigms, designs, standards, methods, and instruments seems to be available, but many of them are only used in rather

  13. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  14. An interoperable security framework for connected healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, Changjie

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks to the security and privacy of personal health

  15. An Interoperable Security Framework for Connected Healthcare

    NARCIS (Netherlands)

    Asim, M.; Petkovic, M.; Qu, M.; Wang, C.

    2011-01-01

    A connected and interoperable healthcare system promises to reduce the cost of healthcare delivery, increase its efficiency and enable consumers to better engage with clinicians and manage their care. However, at the same time it introduces new risks to the security and privacy of personal health

  16. Aragonite coating solutions (ACS) based on artificial seawater

    Science.gov (United States)

    Tas, A. Cuneyt

    2015-03-01

    Aragonite (CaCO3, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca10(PO4)6(OH)2), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  17. Aragonite coating solutions (ACS) based on artificial seawater

    International Nuclear Information System (INIS)

    Tas, A. Cuneyt

    2015-01-01

    Highlights: • Developed completely inorganic solutions for the deposition of monolayers of aragonite spherules (or ooids). • Solutions mimicked the artificial seawater. • Biomimetic crystallization was performed at the tropical sea surface temperature of 30 °C. Abstract: Aragonite (CaCO3, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca10(PO4)6(OH)2), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high-solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  18. Aragonite coating solutions (ACS) based on artificial seawater

    Energy Technology Data Exchange (ETDEWEB)

    Tas, A. Cuneyt, E-mail: c_tas@hotmail.com

    2015-03-01

    Highlights: • Developed completely inorganic solutions for the deposition of monolayers of aragonite spherules (or ooids). • Solutions mimicked the artificial seawater. • Biomimetic crystallization was performed at the tropical sea surface temperature of 30 °C. Abstract: Aragonite (CaCO3, calcium carbonate) is an abundant biomaterial of marine life. It is the dominant inorganic phase of coral reefs, mollusc bivalve shells and the stalactites or stalagmites of geological sediments. Inorganic and initially precipitate-free aragonite coating solutions (ACS) of pH 7.4 were developed in this study to deposit monolayers of aragonite spherules or ooids on biomaterial (e.g., UHMWPE, ultrahigh molecular weight polyethylene) surfaces soaked in ACS at 30 °C. The ACS solutions of this study have been developed for the surface engineering of synthetic biomaterials. The abiotic ACS solutions, enriched with calcium and bicarbonate ions at different concentrations, essentially mimicked the artificial seawater composition and started to deposit aragonite after a long (4 h) incubation period at the tropical sea surface temperature of 30 °C. While numerous techniques for the solution deposition of calcium hydroxyapatite (Ca10(PO4)6(OH)2), of low thermodynamic solubility, on synthetic biomaterials have been demonstrated, procedures related to the solution-based surface deposition of high-solubility aragonite remained uncommon. Monolayers of aragonite ooids deposited at 30 °C on UHMWPE substrates soaked in organic-free ACS solutions were found to possess nano-structures similar to the mortar-and-brick-type botryoids observed in biogenic marine shells. Samples were characterized using SEM, XRD, FTIR, ICP-AES and contact angle goniometry.

  19. Solution of partial differential equations by agent-based simulation

    International Nuclear Information System (INIS)

    Szilagyi, Miklos N

    2014-01-01

    The purpose of this short note is to demonstrate that partial differential equations can be quickly solved by agent-based simulation with high accuracy. There is no need for the solution of large systems of algebraic equations. This method is especially useful for quick determination of potential distributions and demonstration purposes in teaching electromagnetism. (letters and comments)
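    The note above does not spell out the agent scheme, so the following is a minimal sketch in that spirit, under a simple assumption: each interior point of a grid is an "agent" that repeatedly replaces its value with the mean of its four neighbours, which relaxes to a solution of Laplace's equation for the given boundary potentials.

```python
# Minimal sketch (the note does not give the exact scheme): each interior
# grid point is an agent that repeatedly replaces its value with the mean
# of its four neighbours. This relaxation converges to the solution of
# Laplace's equation for the fixed boundary potentials.
def solve_laplace(n=20, top=1.0, other=0.0, sweeps=2000):
    # n x n grid; boundary rows/columns hold fixed potentials
    grid = [[other] * n for _ in range(n)]
    grid[0] = [top] * n  # top edge held at 'top' volts
    for _ in range(sweeps):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                grid[i][j] = 0.25 * (grid[i-1][j] + grid[i+1][j]
                                     + grid[i][j-1] + grid[i][j+1])
    return grid

potential = solve_laplace()
# potential decreases smoothly from the top edge towards the grounded edges
print(round(potential[1][10], 3), round(potential[10][10], 3))
```

For teaching purposes this is attractive precisely because no system of algebraic equations is ever assembled; the agents' local rule does all the work.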

  20. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study regarding a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends the current knowledge on business interoperability and an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the network of companies that the two companies belong to—network effect. In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and understand better how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.

  1. Memristor-based memory: The sneak paths problem and solutions

    KAUST Repository

    Zidan, Mohammed A.

    2012-10-29

    In this paper, we investigate the read operation of memristor-based memories. We analyze the sneak paths problem and provide a noise margin metric to compare the various solutions proposed in the literature. We also analyze the power consumption associated with these solutions. Moreover, we study the effect of the aspect ratio of the memory array on the sneak paths. Finally, we introduce a new technique for solving the sneak paths problem by gating the memory cell using a three-terminal memistor device.
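    The paper's analysis is more detailed, but the basic effect can be reproduced with a common first-order worst-case model (an assumption here, not the authors' exact formulation): all unselected cells sit in the low-resistance state, and the sneak network around the selected cell is approximated as three series banks of parallel resistors.

```python
# First-order worst-case sneak-path model for an m x n selector-less
# crossbar (a textbook approximation, not the paper's exact analysis):
# all unselected cells are in the low-resistance state (LRS), and the
# sneak network is three series banks of parallel resistors.
def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

def read_resistance(r_sel, r_lrs, m, n):
    r_sneak = (r_lrs / (n - 1)
               + r_lrs / ((m - 1) * (n - 1))
               + r_lrs / (m - 1))
    return parallel(r_sel, r_sneak)

def noise_margin(r_lrs, r_hrs, m, n):
    """Normalised separation between reading a '1' (LRS) and a '0' (HRS)."""
    r1 = read_resistance(r_lrs, r_lrs, m, n)
    r0 = read_resistance(r_hrs, r_lrs, m, n)
    return (r0 - r1) / r0

# The margin collapses as the array grows: sneak paths dominate the read.
for size in (4, 32, 256):
    print(size, round(noise_margin(1e3, 1e6, size, size), 3))
```

The shrinking margin with array size is exactly why mitigations such as the three-terminal gated cell proposed in the paper are needed.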

  2. Memristor-based memory: The sneak paths problem and solutions

    KAUST Repository

    Zidan, Mohammed A.; Fahmy, Hossam A.H.; Hussain, Muhammad Mustafa; Salama, Khaled N.

    2012-01-01

    In this paper, we investigate the read operation of memristor-based memories. We analyze the sneak paths problem and provide a noise margin metric to compare the various solutions proposed in the literature. We also analyze the power consumption associated with these solutions. Moreover, we study the effect of the aspect ratio of the memory array on the sneak paths. Finally, we introduce a new technique for solving the sneak paths problem by gating the memory cell using a three-terminal memistor device.

  3. Designing learning management system interoperability in semantic web

    Science.gov (United States)

    Anistyasari, Y.; Sarno, R.; Rochmawati, N.

    2018-01-01

    The extensive adoption of learning management systems (LMS) has put the focus on the interoperability requirement. Interoperability is the ability of different computer systems, applications or services to communicate, share and exchange data, information and knowledge in a precise, effective and consistent way. Semantic web technology and the use of ontologies are able to provide the required computational semantics and interoperability for the automation of tasks in an LMS. The purpose of this study is to design learning management system interoperability in the semantic web, which so far has not been investigated deeply. Moodle is utilized to design the interoperability: several database tables of Moodle are enhanced and some features are added. The semantic web interoperability is provided by an ontology over the content materials. The ontology is further utilized as a search tool to match users' queries to available courses. It is concluded that LMS interoperability in the semantic web is feasible.
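    The abstract does not give the matching algorithm, so here is a toy, dependency-free sketch of the idea (the course codes and the miniature ontology are invented): a query term is expanded through the ontology's is-a hierarchy before being matched against course topic annotations.

```python
# Toy sketch of ontology-backed course search (names and the tiny ontology
# are invented for illustration): a user query term is expanded with all of
# its subclasses before matching course topic annotations.
SUBCLASSES = {  # is-a relations: child -> parent
    "linear_algebra": "mathematics",
    "calculus": "mathematics",
    "databases": "computer_science",
    "mathematics": "science",
}

COURSES = {
    "MATH101": {"calculus"},
    "MATH201": {"linear_algebra"},
    "CS140": {"databases"},
}

def descendants(term):
    """All terms whose is-a chain leads to `term`, plus `term` itself."""
    found = {term}
    changed = True
    while changed:
        changed = False
        for child, parent in SUBCLASSES.items():
            if parent in found and child not in found:
                found.add(child)
                changed = True
    return found

def search(query_term):
    wanted = descendants(query_term)
    return sorted(c for c, topics in COURSES.items() if topics & wanted)

print(search("mathematics"))  # matches both calculus and linear algebra
print(search("databases"))
```

In a full semantic-web setting the dictionary would be replaced by an RDF/OWL ontology and a SPARQL query, but the inference pattern is the same.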

  4. Stability of subsystem solutions in agent-based models

    Science.gov (United States)

    Perc, Matjaž

    2018-01-01

    The fact that relatively simple entities, such as particles or neurons, or even ants or bees or humans, give rise to fascinatingly complex behaviour when interacting in large numbers is the hallmark of complex systems science. Agent-based models are frequently employed for modelling and obtaining a predictive understanding of complex systems. Since the sheer number of equations that describe the behaviour of an entire agent-based model often makes it impossible to solve such models exactly, Monte Carlo simulation methods must be used for the analysis. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among agents that describe systems in biology, sociology or the humanities often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. This begets the question: when can we be certain that an observed simulation outcome of an agent-based model is actually stable and valid in the large system-size limit? The latter is key for the correct determination of phase transitions between different stable solutions, and for the understanding of the underlying microscopic processes that led to these phase transitions. We show that a satisfactory answer can only be obtained by means of a complete stability analysis of subsystem solutions. A subsystem solution can be formed by any subset of all possible agent states. The winner between two subsystem solutions can be determined by the average moving direction of the invasion front that separates them, yet it is crucial that the competing subsystem solutions are characterised by a proper composition and spatiotemporal structure before the competition starts. We use the spatial public goods game with diverse tolerance as an example, but the approach has relevance for a wide variety of agent-based models.
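    The competition protocol described above can be illustrated with a deliberately tiny toy model (the game, update rule and parameters are invented for illustration; the paper itself uses the spatial public goods game with diverse tolerance): two homogeneous "subsystem solutions", all-cooperators and all-defectors of a one-dimensional donation game, are prepared in separate halves of a lattice, and the winner is read off from the direction in which the interface between them moves.

```python
# Toy version of the subsystem-competition protocol (game, update rule and
# parameters are invented for illustration): each half of a 1-D lattice is
# prepared in one homogeneous solution, and the winner is determined by the
# average moving direction of the invasion front separating them.
def payoff(s, t, b=1.5, c=1.0):
    # donation game: a cooperator pays c to give b to its partner
    return (b if t == "C" else 0.0) - (c if s == "C" else 0.0)

def score(state, i):
    n = len(state)
    return sum(payoff(state[i], state[j]) for j in (i - 1, i + 1) if 0 <= j < n)

def sweep(state):
    # synchronous imitation: copy the strategy of the best-scoring
    # neighbour (self included), scores taken from the old state
    n = len(state)
    new = []
    for i in range(n):
        cands = [j for j in (i - 1, i, i + 1) if 0 <= j < n]
        best = max(cands, key=lambda j: score(state, j))
        new.append(state[best])
    return new

def front_position(state):
    return state.index("D")  # first defector = interface location

state = ["C"] * 30 + ["D"] * 30
start = front_position(state)
for _ in range(10):
    state = sweep(state)
print(start, "->", front_position(state))  # the front moves into the C domain
```

With these parameters (b < 2c) defection invades, so the all-D solution wins; the point of the paper is that such verdicts are only trustworthy once both competing solutions have reached their proper composition and spatiotemporal structure before the fronts are released.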

  5. An interoperable standard system for the automatic generation and publication of the fire risk maps based on Fire Weather Index (FWI)

    Science.gov (United States)

    Julià Selvas, Núria; Ninyerola Casals, Miquel

    2015-04-01

    An automatic system has been implemented to predict the fire risk in the Principality of Andorra, a small country located in the eastern Pyrenees mountain range, bordered by Catalonia and France; its landscape is a set of rugged mountains with an average elevation around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of different components, each measuring a different aspect of the fire danger calculated from the values of the weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) has a network of around 10 automatic meteorological stations, located in different places, peaks and valleys, that measure weather data such as relative humidity, wind direction and speed, surface temperature, rainfall and snow cover every ten minutes; these data are sent daily and automatically to the system, where they are processed to filter out incorrect measurements and to homogenize measurement units. The data are then used to calculate all components of the FWI at midday at the location of each station, creating a database with the values of the homogenized measurements and the FWI components for each weather station. In order to extend and model this data over the whole Andorran territory and to obtain a continuous map, an interpolation method based on a multiple regression with spline residual interpolation has been implemented. This interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation and sea distance. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) standard. Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
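    The mapping step can be sketched as follows. Station records and coordinates are invented, and the paper's spline residual interpolation is replaced here by simple inverse-distance weighting so the sketch stays dependency-free; the structure, a regression trend plus an interpolated residual correction, is the point.

```python
# Sketch of the two-step mapping scheme (station data are invented; the
# paper's spline residual interpolation is swapped for inverse-distance
# weighting to avoid dependencies): 1) regress station FWI on a predictor
# (altitude), 2) interpolate the regression residuals, 3) add both parts.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def idw(points, values, p, power=2.0):
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - p[0]) ** 2 + (y - p[1]) ** 2
        if d2 == 0.0:
            return v  # exact hit on a station
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# hypothetical stations: (x, y) position, altitude (m), observed FWI
stations = [((0, 0), 1100, 18.0), ((5, 1), 1600, 14.5),
            ((2, 6), 2100, 10.0), ((7, 7), 2500, 7.0)]
alts = [a for _, a, _ in stations]
fwis = [f for _, _, f in stations]
slope, intercept = fit_line(alts, fwis)
residuals = [f - (slope * a + intercept) for a, f in zip(alts, fwis)]

def predict(p, altitude):
    trend = slope * altitude + intercept          # regression part
    return trend + idw([xy for xy, _, _ in stations], residuals, p)

print(round(predict((3, 3), 1800), 2))  # FWI estimate at an unsampled point
```

The real system uses several predictors (latitude, altitude, solar radiation, sea distance) and validates the surface by leave-one-out cross-validation, but the trend-plus-residual decomposition is the same.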

  6. Surface phase transitions in Cu-based solid solutions

    Science.gov (United States)

    Zhevnenko, S. N.; Chernyshikhin, S. V.

    2017-11-01

    We have measured the surface energy of two-component Cu-based systems in an H2 + Ar gas atmosphere. The experiments on solid Cu [Ag] and Cu [Co] solutions show the presence of phase transitions on the surfaces. Isotherms of the surface energy have singularities (a minimum in the case of copper solid solutions with silver and a maximum in the case of solid solutions with cobalt). In both cases, the surface phase transitions cause a deficiency of surface miscibility: formation of a monolayer (multilayer) (Cu-Ag) or of nanoscale particles (Cu-Co). At the same time, according to the bulk phase diagrams, the concentration and temperature of the surface phase transitions correspond to the solid solution in the bulk. The method permits determining the rate of diffusional creep in addition to the surface energy. The temperature and concentration dependence of the solid solutions' viscosity coefficient supports the occurrence of the surface phase transitions and provides insights into the diffusion properties of the transforming surfaces.

  7. The eXtensible ontology development (XOD) principles and tool implementation to support ontology interoperability.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; Overton, James A; Ong, Edison

    2018-01-12

    Ontologies are critical to data/metadata and knowledge standardization, sharing, and analysis. With hundreds of biological and biomedical ontologies developed, it has become critical to ensure ontology interoperability and the usage of interoperable ontologies for standardized data representation and integration. The suite of web-based Ontoanimal tools (e.g., Ontofox, Ontorat, and Ontobee) support different aspects of extensible ontology development. By summarizing the common features of Ontoanimal and other similar tools, we identified and proposed an "eXtensible Ontology Development" (XOD) strategy and its associated four principles. These XOD principles reuse existing terms and semantic relations from reliable ontologies, develop and apply well-established ontology design patterns (ODPs), and involve community efforts to support new ontology development, promoting standardized and interoperable data and knowledge representation and integration. The adoption of the XOD strategy, together with robust XOD tool development, will greatly support ontology interoperability and robust ontology applications to support data to be Findable, Accessible, Interoperable and Reusable (i.e., FAIR).

  8. Enabling Interoperable and Selective Data Sharing among Social Networking Sites

    Science.gov (United States)

    Shin, Dongwan; Lopes, Rodrigo

    With the widespread use of social networking (SN) sites and the introduction of a social component even in non-social-oriented services, there is growing concern over user privacy in general, and over how to handle and share user profiles across SN sites in particular. Although there have been several proprietary or open-source approaches to unifying the creation of third-party applications, the availability and retrieval of user profile information are still limited to the site where the third-party application runs, mostly devoid of support for data interoperability. In this paper we propose an approach to enabling interoperable and selective data sharing among SN sites. To support selective data sharing, we discuss an authenticated dictionary (ADT)-based credential which enables a user to share only a subset of her information, certified by external SN sites, with applications running on an SN site. For interoperable data sharing, we propose an extension to the OpenSocial API so that it provides an open-source framework allowing the ADT-based credential to be used seamlessly among different SN sites.

  9. Spectral radiative property control method based on filling solution

    International Nuclear Information System (INIS)

    Jiao, Y.; Liu, L.H.; Hsu, P.-F.

    2014-01-01

    Controlling thermal radiation by tailoring the spectral properties of microstructures is a promising method that can be applied in many industrial systems and has been widely researched recently. Among various property-tailoring schemes, geometry design of microstructures is commonly used. However, existing radiation property tailoring is limited by the adjustability of processed microstructures; in other words, the spectral radiative properties of microscale structures cannot be changed after the gratings are fabricated. In this paper, we propose a method that adjusts the grating spectral properties by injecting a filling solution, which modifies the thermal radiation of a fabricated microstructure and thus overcomes the limitation mentioned above. Both mercury and water are adopted as filling solutions in this study. Aluminum and silver are selected as the grating materials to investigate the generality and limitations of this control method. Rigorous coupled-wave analysis is used to investigate the spectral radiative properties of these filling-solution grating structures. A magnetic polariton mechanism identification method based on the LC-circuit-model principle is proposed. It is found that this control method can be used with different grating materials, and that different filling solutions shift the high-absorption peak toward longer or shorter wavelength bands. The results show that filling-solution grating structures are promising for active control of spectral radiative properties. -- Highlights: • A filling solution grating structure is designed to adjust spectral radiative properties. • The mechanism of radiative property control is studied for engineering utilization. • Different grating materials are studied to find multi-functions for grating

  10. Model-based dispersive wave processing: A recursive Bayesian solution

    International Nuclear Information System (INIS)

    Candy, J.V.; Chambers, D.H.

    1999-01-01

    Wave propagation through dispersive media represents a significant problem in many acoustic applications, especially in ocean acoustics, seismology, and nondestructive evaluation. In this paper we propose a propagation model that can easily represent many classes of dispersive waves and proceed to develop the model-based solution to the wave processing problem. It is shown that the underlying wave system is nonlinear and time-variable, requiring a recursive processor. Thus the general solution to the model-based dispersive wave enhancement problem is developed using a Bayesian maximum a posteriori (MAP) approach and shown to lead to the recursive, nonlinear extended Kalman filter (EKF) processor. The problem of internal wave estimation is cast within this framework. The specific processor is developed and applied to data synthesized by a sophisticated simulator, demonstrating the feasibility of this approach. copyright 1999 Acoustical Society of America.
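    The predict/update recursion of an EKF processor like the one described can be sketched generically. The scalar dynamics, noise levels, and function names below are invented for illustration; this is not the paper's internal-wave processor:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter."""
    # Predict: propagate state and covariance through the nonlinear model
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement z
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1-D system: mildly nonlinear transition, direct (linear) measurement
f = lambda x: np.array([x[0] + 0.1 * np.sin(x[0])])
F_jac = lambda x: np.array([[1.0 + 0.1 * np.cos(x[0])]])
h = lambda x: x
H_jac = lambda x: np.eye(1)
Q, R = np.array([[1e-3]]), np.array([[1e-2]])

x, P = np.array([0.5]), np.eye(1)
x, P = ekf_step(x, P, np.array([0.6]), f, F_jac, h, H_jac, Q, R)
```

    The estimate moves from the predicted state toward the measurement, and the posterior covariance shrinks, which is the qualitative behavior the recursive MAP solution relies on.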

  11. Comparison of Ring-Buffer-Based Packet Capture Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Steven Andrew [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-10-01

    Traditional packet-capture solutions using commodity hardware incur a large amount of overhead as packets are copied multiple times by the operating system. This overhead slows sensor systems to a point where they are unable to keep up with high bandwidth traffic, resulting in dropped packets. Incomplete packet capture files hinder network monitoring and incident response efforts. While costly commercial hardware exists to capture high bandwidth traffic, several software-based approaches exist to improve packet capture performance using commodity hardware.

  12. Land based use of natural gas - distribution solutions

    International Nuclear Information System (INIS)

    Jordanger, Einar; Moelnvik, Mona J.; Owren, Geir; Einang, Per Magne; Grinden, Bjoern; Tangen, Grethe

    2002-05-01

    The report presents results from the project ''Landbasert bruk av naturgass - distribusjonsloesninger'' (Land-based use of natural gas - distribution solutions). It describes the aims of the project, the political framework conditions for the use of natural gas, some environmental benefits of switching from petroleum and coal to natural gas, the Norwegian infrastructure, the optimisation of energy transport, strategic consequences of the introduction of LNG, and the practical consequences of the Enova strategy.

  13. Focus for 3D city models should be on interoperability

    DEFF Research Database (Denmark)

    Bodum, Lars; Kjems, Erik; Jaegly, Marie Michele Helena

    2006-01-01

    that would make it useful for other purposes than visualisation. Time has come to try to change this trend and to convince the municipalities that interoperability and semantics are important issues for the future. It is important for them to see that 3D modelling, mapping and geographic information... are subjects on the same agenda towards an integrated solution for an object-oriented mapping of multidimensional geographic objects in the urban environment. Many relevant subjects could be discussed regarding these matters, but in this paper we will narrow the discussion down to the ideas behind... developments in Geographical Exploration Systems. Centralized and proprietary Geographical Exploration Systems only give us their own perspective on the world. On the contrary, GRIFINOR is decentralized and available for everyone to use, empowering people to promote their own world vision.

  14. Internet of Things Heterogeneous Interoperable Network Architecture Design

    DEFF Research Database (Denmark)

    Bhalerao, Dipashree M.

    2014-01-01

    The Internet of Things (IoT) state of the art indicates that no mature Internet of Things architecture is yet available. The thesis contributes an abstract, generic IoT system reference architecture with specifications; its novelties are the proposed solutions and implementations... It is proved that reduction of data at the source results in large vertical scalability and, indirectly, horizontal scalability as well. A second non-functional feature contributes a heterogeneous interoperable network architecture for constrained Things. To eliminate the increasing number of gateways, a Wi-Fi access point... combined with Bluetooth and Zigbee (the new access point is called BZ-Fi) is proposed. Co-existence of the Wi-Fi, Bluetooth, and Zigbee network technologies results in interference. To reduce the interference, orthogonal frequency division multiplexing (OFDM) is proposed to be implemented in Bluetooth and Zigbee. The proposed...

  15. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

    In order to maintain a competitive edge in a very active banking market, a company requires the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for ...

  16. Regulatory Barriers Blocking Standardization of Interoperability

    OpenAIRE

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-01-01

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. Standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and the Continua Health Alliance, are striving toward this purpose. However, factors such as medical device regulation, health policy, and market reality have placed non-technical barriers over the ad...

  17. UGV Control Interoperability Profile (IOP), Version 0

    Science.gov (United States)

    2011-12-21

    ...a tracked vehicle to climb stairs, traverse ditches/ruts, etc. The operator should be able to control the position of the flippers via the OCU and... Performing organization: Robotic Systems, Joint Project Office (RS JPO), SFAE-GCS-UGV MS 266, 6501 East 11 Mile Road

  18. Interoperability in the e-Government Context

    Science.gov (United States)

    2012-01-01

    ...e-government systems focus primarily on these technical challenges [UNDP 2007a, p. 10; CS Transform 2009, p. 3]. More recently... Thailand’s government hits its own wall. Responding agencies and non-governmental groups are unable to share information vital to the rescue effort... “Interoperability and Open Standards for e-Governance.” egov (Sep. 1, 2007): 17–19. [Secretary General, United Nations 2010] Secretary General, United

  19. Solution-based targeted genomic enrichment for precious DNA samples

    Directory of Open Access Journals (Sweden)

    Shearer Aiden

    2012-05-01

    Full Text Available Abstract Background Solution-based targeted genomic enrichment (TGE) protocols permit selective sequencing of genomic regions of interest on a massively parallel scale. These protocols could be improved by: (1) modifying or eliminating time-consuming steps; (2) increasing yield to reduce input DNA and excessive PCR cycling; and (3) enhancing reproducibility. Results We developed a solution-based TGE method for downstream Illumina sequencing in a non-automated workflow, adding standard Illumina barcode indexes during the post-hybridization amplification to allow for sample pooling prior to sequencing. The method utilizes Agilent SureSelect baits, primers and hybridization reagents for the capture; off-the-shelf reagents for the library preparation steps; and adaptor oligonucleotides for Illumina paired-end sequencing purchased directly from an oligonucleotide manufacturer. Conclusions This solution-based TGE method for Illumina sequencing is optimized for small- or medium-sized laboratories and addresses the weaknesses of standard protocols by reducing the amount of input DNA required, increasing capture yield, improving efficiency, and improving reproducibility.

  20. A Game Theory Based Solution for Security Challenges in CRNs

    Science.gov (United States)

    Poonam; Nagpal, Chander Kumar

    2018-03-01

    Cognitive radio networks (CRNs) are envisioned to drive the next generation of ad hoc wireless networks through their ability to provide communications resilience in continuously changing environments via dynamic spectrum access. Conventionally, CRNs depend on information gathered by other secondary users to ensure the accuracy of spectrum sensing, which makes them vulnerable to security attacks and creates the need for security mechanisms such as cryptography and trust. However, a typical cryptography-based solution is not viable for CRNs owing to their limited resources, and the effectiveness of trust-based approaches has always been in question due to the credibility of secondary trust resources. Game theory, with its ability to optimize in an environment of conflicting interests, is a suitable tool for managing an ad hoc network in the presence of autonomous selfish, malevolent, malicious, and attacker nodes. The literature contains several theoretical proposals for applying game theory in ad hoc networks without explicit or detailed implementation. This paper implements a game-theory-based solution in MATLAB-2015 to secure the CRN environment and compares the obtained results with the traditional approaches of trust and cryptography. The simulation results indicate that, as time progresses, the game-theoretic approach performs much better, with higher throughput, lower jitter, and better identification of selfish/malicious nodes.
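    The core idea of penalizing nodes whose spectrum-sensing reports diverge from the collective outcome can be illustrated with a toy repeated game. The node counts, report accuracies, and payoff-style reputation updates below are invented for illustration and do not reproduce the paper's MATLAB implementation:

```python
import random

def simulate(rounds=200, n_honest=9, seed=1):
    """Repeated spectrum-sensing game: honest nodes report the true channel
    state with high probability; one malicious node always inverts its
    report. A node's reputation rises when it agrees with the majority vote
    and falls otherwise, so the liar is isolated over time."""
    rng = random.Random(seed)
    rep = [1.0] * (n_honest + 1)          # last index = malicious node
    for _ in range(rounds):
        truth = rng.random() < 0.5        # true channel occupancy
        reports = [truth if rng.random() < 0.9 else not truth
                   for _ in range(n_honest)]
        reports.append(not truth)         # malicious node always lies
        majority = sum(reports) > len(reports) / 2
        for i, r in enumerate(reports):
            # payoff-style update: reward agreement, punish deviation
            rep[i] = max(rep[i] + (0.1 if r == majority else -0.2), 0.0)
    return rep

rep = simulate()
suspect = min(range(len(rep)), key=lambda i: rep[i])
```

    After enough rounds the malicious node's reputation collapses while the honest nodes' reputations grow, which is the qualitative mechanism by which game-theoretic schemes identify selfish/malicious participants.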

  1. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    International Nuclear Information System (INIS)

    Tuominen, Janne; Rasi, Teemu; Mattila, Jouni; Siuko, Mikko; Esque, Salvador; Hamilton, David

    2013-01-01

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single point of failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming-language-independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.

  2. Interoperability of remote handling control system software modules at Divertor Test Platform 2 using middleware

    Energy Technology Data Exchange (ETDEWEB)

    Tuominen, Janne, E-mail: janne.m.tuominen@tut.fi [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Rasi, Teemu; Mattila, Jouni [Tampere University of Technology, Department of Intelligent Hydraulics and Automation, Tampere (Finland); Siuko, Mikko [VTT, Technical Research Centre of Finland, Tampere (Finland); Esque, Salvador [F4E, Fusion for Energy, Torres Diagonal Litoral B3, Josep Pla2, 08019, Barcelona (Spain); Hamilton, David [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: ► The prototype DTP2 remote handling control system is a heterogeneous collection of subsystems, each realizing a functional area of responsibility. ► Middleware provides well-known, reusable solutions to problems, such as heterogeneity, interoperability, security and dependability. ► A middleware solution was selected and integrated with the DTP2 RH control system. The middleware was successfully used to integrate all relevant subsystems and functionality was demonstrated. -- Abstract: This paper focuses on the inter-subsystem communication channels in a prototype distributed remote handling control system at Divertor Test Platform 2 (DTP2). The subsystems are responsible for specific tasks and, over the years, their development has been carried out using various platforms and programming languages. The communication channels between subsystems have different priorities, e.g. very high messaging rate and deterministic timing or high reliability in terms of individual messages. Generally, a control system's communication infrastructure should provide interoperability, scalability, performance and maintainability. An attractive approach to accomplish this is to use a standardized and proven middleware implementation. The selection of a middleware can have a major cost impact in future integration efforts. In this paper we present development done at DTP2 using the Object Management Group's (OMG) standard specification for Data Distribution Service (DDS) for ensuring communications interoperability. DDS has gained a stable foothold especially in the military field. It lacks a centralized broker, thereby avoiding a single-point-of-failure. It also includes an extensive set of Quality of Service (QoS) policies. The standard defines a platform- and programming language independent model and an interoperability wire protocol that enables DDS vendor interoperability, allowing software developers to avoid vendor lock-in situations.

  3. Conceptual Model Formalization in a Semantic Interoperability Service Framework: Transforming Relational Database Schemas to OWL.

    Science.gov (United States)

    Bravo, Carlos; Suarez, Carlos; González, Carolina; López, Diego; Blobel, Bernd

    2014-01-01

    Healthcare information is distributed across multiple heterogeneous and autonomous systems, and access to, and sharing of, distributed information sources are challenging tasks. To contribute to meeting this challenge, this paper presents a formal, complete and semi-automatic transformation service from relational databases to the Web Ontology Language (OWL). The proposed service makes use of an algorithm that can transform data models from different domains by deploying mainly inheritance rules. The paper emphasizes the relevance of integrating the proposed approach into an ontology-based interoperability service to achieve semantic interoperability.
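    The basic shape of such a transformation can be sketched with a toy schema-to-OWL mapping: tables become classes, plain columns become datatype properties, and foreign keys become object properties. The schema, naming scheme, and simplifications below are illustrative assumptions, not the paper's actual algorithm (which also handles inheritance rules):

```python
def schema_to_owl(schema, base="http://example.org/onto#"):
    """Translate a toy relational schema into OWL axioms in Turtle syntax."""
    lines = [
        f"@prefix : <{base}> .",
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
    ]
    for table, spec in schema.items():
        lines.append(f":{table} a owl:Class .")           # table -> class
        for col in spec.get("columns", []):               # column -> datatype property
            lines.append(f":{table}_{col} a owl:DatatypeProperty ; "
                         f"rdfs:domain :{table} .")
        for col, target in spec.get("fks", {}).items():   # FK -> object property
            lines.append(f":{table}_{col} a owl:ObjectProperty ; "
                         f"rdfs:domain :{table} ; rdfs:range :{target} .")
    return "\n".join(lines)

# Hypothetical two-table healthcare schema
schema = {
    "Patient": {"columns": ["name", "birthDate"]},
    "Encounter": {"columns": ["date"], "fks": {"patient_id": "Patient"}},
}
ttl = schema_to_owl(schema)
```

    The foreign key from Encounter to Patient becomes an object property whose domain and range tie the two generated classes together, which is the structural core of RDB-to-OWL mappings.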

  4. caCORE version 3: Implementation of a model driven, service-oriented architecture for semantic interoperability.

    Science.gov (United States)

    Komatsoulis, George A; Warzel, Denise B; Hartel, Francis W; Shanbhag, Krishnakant; Chilukuri, Ram; Fragoso, Gilberto; Coronado, Sherri de; Reeves, Dianne M; Hadfield, Jillaine B; Ludet, Christophe; Covitz, Peter A

    2008-02-01

    One of the requirements for a federated information system is interoperability, the ability of one computer system to access and use the resources of another system. This feature is particularly important in biomedical research systems, which need to coordinate a variety of disparate types of data. In order to meet this need, the National Cancer Institute Center for Bioinformatics (NCICB) has created the cancer Common Ontologic Representation Environment (caCORE), an interoperability infrastructure based on Model Driven Architecture. The caCORE infrastructure provides a mechanism to create interoperable biomedical information systems. Systems built using the caCORE paradigm address both aspects of interoperability: the ability to access data (syntactic interoperability) and understand the data once retrieved (semantic interoperability). This infrastructure consists of an integrated set of three major components: a controlled terminology service (Enterprise Vocabulary Services), a standards-based metadata repository (the cancer Data Standards Repository) and an information system with an Application Programming Interface (API) based on Domain Model Driven Architecture. This infrastructure is being leveraged to create a Semantic Service-Oriented Architecture (SSOA) for cancer research by the National Cancer Institute's cancer Biomedical Informatics Grid (caBIG).

  5. Sustainable Power Supply Solutions for Off-Grid Base Stations

    Directory of Open Access Journals (Sweden)

    Asma Mohamad Aris

    2015-09-01

    Full Text Available The telecommunication sector plays a significant role in shaping the global economy and the way people share information and knowledge. At present, the sector is accountable for its energy consumption and the amount of emissions it releases into the environment. In the context of off-grid telecommunication applications, off-grid base stations (BSs) are commonly used due to their ability to provide radio coverage over a wide geographic area. In the past, however, off-grid BSs usually relied on emission-intensive power supply solutions such as diesel generators. In this review paper, various types of solutions (including, in particular, sustainable solutions) for powering BSs are discussed. The key aspects in designing an ideal power supply solution are reviewed; these mainly include the pre-feasibility study and the thermal management of BSs, which comprises heating and cooling of the BS shelter/cabinets, electronic equipment, and power supply components. The sizing and optimization approaches used to design the BSs’ power supply systems, as well as the operational and control strategies adopted to manage them, are also reviewed in this paper.

  6. An airport surface surveillance solution based on fusion algorithm

    Science.gov (United States)

    Liu, Jianliang; Xu, Yang; Liang, Xuelin; Yang, Yihuang

    2017-01-01

    In this paper, we propose an airport surface surveillance solution that combines Multilateration (MLAT) and Automatic Dependent Surveillance-Broadcast (ADS-B). The moving target to be monitored is regarded as a linear stochastic hybrid system moving freely, and each surveillance technology is simplified as a sensor with white Gaussian noise. The dynamic model of the target and the observation model of each sensor are established. The sensor measurements are filtered by estimators to obtain state estimates for the current time. We then analyze the characteristics of the two proposed fusion schemes and adopt the one based on sensor estimation fusion for our surveillance solution. In the proposed fusion algorithm, the estimation error is quantified from the output of the estimators, and a fusion weight is calculated for each sensor. The two estimation results are fused with these weights, yielding an accurate position estimate of the target. Finally, the proposed solution and algorithm are validated by an illustrative target-tracking simulation.
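    The weighted-fusion step can be illustrated with the standard inverse-variance combination, a common way to weight sensors by their quantified estimation error (the paper's exact weighting may differ; the numeric values below are invented):

```python
def fuse(estimates, variances):
    """Fuse independent sensor estimates with inverse-variance weights,
    the minimum-variance combination for uncorrelated Gaussian errors."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    x = sum(w * e for w, e in zip(weights, estimates)) / total
    var = 1.0 / total  # fused variance is below every input variance
    return x, var

# Hypothetical 1-D position estimates: MLAT (noisier) and ADS-B (cleaner)
x, var = fuse([102.0, 100.0], [4.0, 1.0])
```

    Here the fused estimate (100.4) sits closer to the lower-noise ADS-B value, and the fused variance (0.8) is smaller than either sensor's alone, which is the point of combining the two technologies.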

  7. Environmental Models as a Service: Enabling Interoperability ...

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
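    The verb-noun pattern described above can be sketched without any web stack: the HTTP method (verb) selects the action and the URL path (noun) names the resource. The resource paths and the in-memory store below are hypothetical, not part of any actual modeling API:

```python
# Minimal in-memory sketch of RESTful verb-noun dispatch.
store = {}

def handle(method, path, body=None):
    """Dispatch on the HTTP verb; the path identifies the resource."""
    if method == "PUT":                  # create/replace the resource
        store[path] = body
        return 200, body
    if method == "GET":                  # read the resource
        return (200, store[path]) if path in store else (404, None)
    if method == "DELETE":               # remove the resource
        return (200, store.pop(path)) if path in store else (404, None)
    return 405, None                     # verb not allowed on this resource

status, _ = handle("PUT", "/models/runoff/run/42", {"status": "queued"})
status2, body = handle("GET", "/models/runoff/run/42")
```

    A real deployment would back the same dispatch with an HTTP server and self-documenting Open API descriptions, but the interface contract is exactly this verb-noun mapping.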

  8. Sociotechnical Challenges of Developing an Interoperable Personal Health Record

    Science.gov (United States)

    Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.

    2011-01-01

    Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, as gathered over a 28 week period of developing and adapting a vendor-based PHR at Lucile Packard Children’s Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during the data collection process which were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between hospital and software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, the process of designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373

  9. Interoperability in planetary research for geospatial data analysis

    Science.gov (United States)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  10. Ocean Data Interoperability Platform: developing a common global framework for marine data management

    Science.gov (United States)

    Glaves, Helen; Schaap, Dick

    2017-04-01

    elsewhere. To add a further layer of complexity there are also global initiatives providing marine data infrastructures (e.g., IOC-IODE, POGO) as well as those with a wider remit that includes environmental data (e.g., GEOSS, Copernicus). Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures, along with other relevant experts, in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures so that interoperability can be established between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.

  11. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployment of Electronic Health Records (EHRs), interoperability testing in healthcare is becoming crucial. The EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems, so interoperability between peers is essential. Achieving interoperability requires various types of testing: implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results In this paper we describe software used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross-Enterprise Document Sharing for Imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions The EHR is being deployed in several countries, and its infrastructure will continuously evolve to embrace advances in information technology. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations, by site integrators to verify and test the interoperability of systems, and by developers to understand specification ambiguities or to resolve implementation difficulties.

  12. Solution NMR Spectroscopy in Target-Based Drug Discovery.

    Science.gov (United States)

    Li, Yan; Kang, Congbao

    2017-08-23

    Solution NMR spectroscopy is a powerful tool to study protein structures and dynamics under physiological conditions. This technique is particularly useful in target-based drug discovery projects as it provides protein-ligand binding information in solution. Accumulated studies have shown that NMR will play an increasingly important role in multiple steps of the drug discovery process. In a fragment-based drug discovery process, ligand-observed and protein-observed NMR spectroscopy can be applied to screen fragments with low binding affinities. The screened fragments can be further optimized into drug-like molecules. In combination with other biophysical techniques, NMR can guide structure-based drug discovery. In this review, we describe the possible roles of NMR spectroscopy in drug discovery and the challenges encountered in the process. We include several examples demonstrating the roles of NMR in target-based drug discovery, such as hit identification, ranking ligand binding affinities, and mapping the ligand binding site. We also speculate on the possible roles of NMR in target engagement based on recent progress in in-cell NMR spectroscopy.

  13. Hybrid Fundamental Solution Based Finite Element Method: Theory and Applications

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2015-01-01

    Full Text Available An overview of the development of the hybrid fundamental solution based finite element method (HFS-FEM) and its application to engineering problems is presented in this paper. The framework and formulations of HFS-FEM for the potential problem, plane elasticity, three-dimensional elasticity, thermoelasticity, anisotropic elasticity, and plane piezoelectricity are presented. In this method, two independent assumed fields (intraelement field and auxiliary frame field) are employed. The formulations for all cases are derived from the modified variational functionals and the fundamental solutions to a given problem. Generation of elemental stiffness equations from the modified variational principle is also described. Typical numerical examples are given to demonstrate the validity and performance of the HFS-FEM. Finally, a brief summary of the approach is provided and future trends in this field are identified.

  14. Thermodynamics of carbon in nickel-based multicomponent solid solutions

    International Nuclear Information System (INIS)

    Bradley, D.J.

    1978-04-01

    The activity coefficient of carbon in nickel, nickel-titanium, nickel-titanium-chromium, nickel-titanium-molybdenum and nickel-titanium-molybdenum-chromium alloys has been measured at 900, 1100 and 1215 °C. The results indicate that carbon obeys Henry's Law over the range studied (0 to 2 at. percent). The literature for the nickel-carbon and iron-carbon systems is reviewed and corrected. For the activity of carbon in iron as a function of composition, a new relationship based on a re-evaluation of the thermodynamics of the CO/CO2 equilibrium is proposed. Calculations using this relationship reproduce the data to within 2.5 percent, but the accuracy of the calibrating standards used by many investigators to analyze for carbon is at best 5 percent, which explains the lack of agreement between the many precise sets of data. The values of the activity coefficient of carbon in the various solid solutions are used to calculate a set of parameters for the Kohler-Kaufman equation. The calculations indicate that binary interaction energies are not sufficient to describe the thermodynamics of carbon in some of the nickel-based solid solutions. The results of previous workers for carbon in nickel-iron alloys are completely described by inclusion of ternary terms in the Kohler-Kaufman equation. Most of the carbon in solid solution at high temperatures in nickel and nickel-titanium alloys precipitates from solution on quenching in water. The precipitate is composed of very small particles (greater than 2.5 nm) of elemental carbon. The results of some preliminary thermomigration experiments are discussed and recommendations for further work are presented.

  15. Interoperability architecture for electric mobility

    NARCIS (Netherlands)

    Brand, Allard; Iacob, Maria Eugenia; van Sinderen, Marten J.; Chapurlat, V.

    2015-01-01

    The current architecture for electric mobility provides insufficient integration with the electricity system, since at this moment there is no possibility for influencing the charge process based on information from market parties such as the distribution system operator. Charging can neither be

  16. Advancing Smart Grid Interoperability and Implementing NIST's Interoperability Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.; DeBlasio, R.

    2010-04-01

    The IEEE American National Standards project P2030™ addressing smart grid interoperability and the IEEE 1547 series of standards addressing distributed resources interconnection with the grid have been identified in priority action plans in the Report to NIST on the Smart Grid Interoperability Standards Roadmap. This paper presents the status of the IEEE P2030 development, the IEEE 1547 series of standards publications and drafts, and provides insight on systems integration and grid infrastructure. The P2030 and 1547 series of standards are sponsored by IEEE Standards Coordinating Committee 21.

  17. PyMOOSE: interoperable scripting in Python for MOOSE

    Directory of Open Access Journals (Sweden)

    Subhasis Ray

    2008-12-01

    Full Text Available Python is emerging as a common scripting language for simulators. This opens up many possibilities for interoperability in the form of analysis, interfaces, and communications between simulators. We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. We illustrate this by using Python numerical libraries to analyze MOOSE output online, and by developing a GUI in Python/Qt for a MOOSE simulation. Finally, we build and run a composite neuronal/signaling model that uses both the NEURON and MOOSE numerical engines, with Python as a bridge between the two. Thus PyMOOSE has a high degree of interoperability with analysis routines, with graphical toolkits, and with other simulators.
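    The composite-model scenario described above can be caricatured in a few lines: Python steps two independent numerical engines in lockstep and ferries a value between them. The engine classes below are simplified stand-ins, not the real MOOSE or NEURON APIs.

```python
class KineticsEngine:
    """Stand-in for a chemical-kinetics engine (e.g. MOOSE). Not the real API."""
    def __init__(self, conc):
        self.conc = conc

    def step(self, dt):
        self.conc *= 1.0 - 0.1 * dt  # first-order decay of a signaling species


class MembraneEngine:
    """Stand-in for a compartmental neuronal engine (e.g. NEURON). Not the real API."""
    def __init__(self, v):
        self.v = v

    def step(self, dt, drive):
        self.v += dt * (drive - self.v)  # relax toward the chemical drive


def run_coupled(steps, dt=0.1):
    """Python acts as the bridge: it advances both engines and passes data across."""
    kin = KineticsEngine(conc=1.0)
    mem = MembraneEngine(v=0.0)
    for _ in range(steps):
        kin.step(dt)
        mem.step(dt, drive=kin.conc)  # hand the kinetics output to the membrane model
    return kin.conc, mem.v


conc, v = run_coupled(100)
```

    The point of the sketch is the control flow, not the physics: neither engine knows about the other, and the scripting layer owns the coupling, which is what makes mixed-simulator models practical.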

  18. Language interoperability for high-performance parallel scientific components

    International Nuclear Information System (INIS)

    Elliot, N; Kohn, S; Smolinski, B

    1999-01-01

    With the increasing complexity and interdisciplinary nature of scientific applications, code reuse is becoming increasingly important in scientific computing. One method for facilitating code reuse is the use of component technologies, which have been used widely in industry. However, components have only recently worked their way into scientific computing. Language interoperability is an important underlying technology for these component architectures. In this paper, we present an approach to language interoperability for the high-performance parallel component architecture being developed by the Common Component Architecture (CCA) group. Our approach is based on Interface Definition Language (IDL) techniques. We have developed a Scientific Interface Definition Language (SIDL), as well as bindings to C and Fortran. We have also developed a SIDL compiler and run-time library support for reference counting, reflection, object management, and exception handling (Babel). Results from using Babel to call a standard numerical solver library (written in C) from C and Fortran show that the cost of using Babel is minimal, whereas the savings in development time and the benefits of object-oriented development support for C and Fortran far outweigh the costs.
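    The reference-counting support mentioned above can be sketched generically: a handle keeps a shared object alive until every holder, possibly in a different language, releases it. This is an illustration of the idea only; the class and method names are invented and are not the actual Babel runtime API.

```python
class Handle:
    """Toy reference-counted handle, illustrating the runtime support a
    language-interoperability layer provides. Names are invented, not Babel's."""

    def __init__(self, resource, on_destroy):
        self._resource = resource
        self._on_destroy = on_destroy  # destructor callback, run exactly once
        self._refcount = 1

    def add_ref(self):
        """A new holder (e.g. a Fortran caller) takes a reference."""
        self._refcount += 1
        return self

    def release(self):
        """A holder is done; destroy the resource when the last one releases."""
        self._refcount -= 1
        if self._refcount == 0:
            self._on_destroy(self._resource)
            self._resource = None


destroyed = []
h = Handle("solver-instance", destroyed.append)
alias = h.add_ref()   # second holder appears
h.release()           # first holder done; object must survive
alias.release()       # last holder done; destructor runs exactly once
```

    The design choice worth noting is that ownership is explicit: with callers in C and Fortran, no single language's garbage collector can decide the object's lifetime, so the runtime counts references instead.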

  19. Governance of Interoperability in Intergovernmental Services - Towards an Empirical Taxonomy

    Directory of Open Access Journals (Sweden)

    Herbert Kubicek

    2008-12-01

    Full Text Available High-quality and convenient online delivery of governmental services often requires the seamless exchange of data between two or more government agencies. Smooth data exchange, in turn, requires interoperability of the databases and workflows in the agencies involved. Interoperability (IOP) is a complex issue covering purely technical aspects such as transmission protocols and data exchange formats, content-related semantic aspects such as identifiers and the meaning of codes, and organizational, contractual or legal issues. Starting from IOP frameworks which provide classifications of what has to be standardized, this paper, based on an ongoing research project, adopts a political and managerial view and tries to clarify the governance of achieving IOP, i.e. where and by whom IOP standards are developed and established and how they are put into operation. By analyzing 32 cases of successful implementation of IOP in e-government services within the European Union, empirical indicators for different aspects of governance are proposed and applied to develop an empirical taxonomy of different types of IOP governance, which can be used for future comparative research regarding success factors, barriers, etc.

  20. Telemedicine system interoperability architecture: concept description and architecture overview.

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  1. SHIWA workflow interoperability solutions for neuroimaging data analysis

    NARCIS (Netherlands)

    Korkhov, Vladimir; Krefting, Dagmar; Montagnat, Johan; Truong Huu, Tram; Kukla, Tamas; Terstyanszky, Gabor; Manset, David; Caan, Matthan; Olabarriaga, Silvia

    2012-01-01

    Neuroimaging is a field that benefits from distributed computing infrastructures (DCIs) to perform data- and compute-intensive processing and analysis. Using grid workflow systems not only automates the processing pipelines, but also enables domain researchers to implement their expertise on how to

  2. EV integration in smart grids through interoperability solutions

    OpenAIRE

    Rodríguez-Sánchez, Raúl; Madina, Carlos; Zabala, Eduardo

    2015-01-01

    The high total cost of ownership and the uncertainties surrounding battery reliability are still the main barriers for electric vehicle (EV) market take off in Europe. Storage evolution, leading to both price reduction and performance improvement, is a huge technical challenge in the medium-long term. In the meantime, new business models and market niche developments might play a facilitator role for EV deployment by tackling the economic gap between conventional ICE and electromobility (e-mo...

  3. A web services choreography scenario for interoperating bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Cheung David W

    2004-03-01

    Full Text Available Abstract Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH keywords that correlates to the input, grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard-coded Java application, the Collaxa BPEL Server and the Taverna Workbench.
The Java program functions as a web services engine and interoperates
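    A document-style message of the kind described above, machine-readable XML rather than human-oriented HTML, can be sketched with Python's standard library. The element names below are invented for illustration and are not the actual HAPI service schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical document-style request for a HAPI-like service:
# microarray spot IDs in, MeSH keywords out.
request = ET.Element("AnalysisRequest")
for spot in ("SPOT001", "SPOT002"):
    ET.SubElement(request, "SpotID").text = spot

# The serialized XML document is the message body exchanged between services.
payload = ET.tostring(request, encoding="unicode")

# The next service in the workflow parses the document, not a web page.
parsed = ET.fromstring(payload)
spot_ids = [e.text for e in parsed.findall("SpotID")]
```

    Because the whole message is a self-describing XML document, any of the three engines compared in the abstract (a hand-coded program, a BPEL engine, or Taverna) can route and transform it without screen-scraping.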

  4. PACS/information systems interoperability using Enterprise Communication Framework.

    Science.gov (United States)

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  5. Robotics Systems Joint Project Office (RSJPO) Interoperability Profiles (IOPS) 101

    Science.gov (United States)

    2012-07-01

    Some capabilities are not themselves the subject of interoperability, although they are supported by interoperability attributes; stair climbing is one example. Stair climbing is not something that the IOPs need to specify; however, the mobility- and actuation-related interoperable messages can be used to provide stair climbing. In addition, interoperability can enable management of different poses or modes, one of which may be stair climbing.

  6. Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools

    National Research Council Canada - National Science Library

    Puett, Joseph

    2003-01-01

    This dissertation presents a Holistic Framework for Software Engineering (HFSE) that establishes collaborative mechanisms by which existing heterogeneous software development tools and models will interoperate...

  7. A Survey on Smartphone-Based Crowdsensing Solutions

    Directory of Open Access Journals (Sweden)

    Willian Zamora

    2016-01-01

    Full Text Available In recent years, the widespread adoption of mobile phones, combined with the ever-increasing number of sensors that smartphones are equipped with, has greatly simplified the generalized adoption of crowdsensing solutions by reducing hardware requirements and costs to a minimum. These factors have led to an outstanding growth of crowdsensing proposals from both academia and industry. In this paper, we provide a survey of smartphone-based crowdsensing solutions that have emerged in the past few years, focusing on 64 works published in top-ranked journals and conferences. To properly analyze these previous works, we first define a reference framework based on how we classify the different proposals under study. The results of our survey evidence that there is still much heterogeneity in terms of technologies adopted and deployment approaches, although modular designs at both client and server elements seem to be dominant. Also, the preferred client platform is Android, while server platforms are typically web-based, and client-server communications mostly rely on XML or JSON over HTTP. The main pitfall detected concerns the performance evaluation of the different proposals, which typically fail to make a scalability analysis despite this being a critical issue when targeting very large communities of users.
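    The JSON-over-HTTP pattern the survey identifies as dominant can be illustrated with a minimal, hypothetical sensing report; the field names below are invented, not taken from any surveyed system.

```python
import json

# Hypothetical crowdsensing report as a smartphone client might POST it to a
# web server. All field names are illustrative only.
reading = {
    "device_id": "phone-42",
    "timestamp": 1700000000,
    "sensor": "noise_db",
    "value": 61.5,
    "location": {"lat": 39.48, "lon": -0.34},
}

body = json.dumps(reading)    # what the client sends as the HTTP request body
received = json.loads(body)   # what the server decodes on arrival
```

    The appeal of this design, and presumably why it dominates the surveyed works, is that both ends need only a JSON codec and an HTTP stack, both of which ship with every mainstream client and server platform.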

  8. Investigation of samarium solubility in the magnesium based solid solution

    International Nuclear Information System (INIS)

    Rokhlin, L.L.; Padezhnova, E.M.; Guzej, L.S.

    1976-01-01

    Electric resistance measurements and microscopic analysis were used to investigate the solubility of samarium in a magnesium-based solid solution. The constitutional diagram of Mg-Sm on the magnesium side is of a eutectic type, with a eutectic transformation temperature of 542 °C. Samarium is partly soluble in solid magnesium, with the solubility decreasing as the temperature falls. The maximum solubility of samarium in magnesium (at the eutectic transformation point) is 5.8% by mass (0.99 at. %). At 200 °C, the solubility of samarium in magnesium is 0.4% by mass (0.063 at. %).

  9. Professional SharePoint 2010 Cloud-Based Solutions

    CERN Document Server

    Fox, Steve; Stubbs, Paul; Follette, Donovan

    2011-01-01

    An authoritative guide to extending SharePoint's power with cloud-based services If you want to be part of the next major shift in the IT industry, you'll want this book. Melding two of the hottest trends in the industry—the widespread popularity of the SharePoint collaboration platform and the rapid rise of cloud computing—this practical guide shows developers how to extend their SharePoint solutions with the cloud's almost limitless capabilities. See how to get started, discover smart ways to leverage cloud data and services through Azure, start incorporating Twitter or LinkedIn

  10. Meeting People’s Needs in a Fully Interoperable Domotic Environment

    Directory of Open Access Journals (Sweden)

    Vittorio Miori

    2012-05-01

    Full Text Available The key idea underlying many Ambient Intelligence (AmI) projects and applications is context awareness, which is based mainly on their capacity to identify users and their locations. The actual computing capacity should remain in the background, in the periphery of our awareness, and should only move to the center if and when necessary. Computing thus becomes ‘invisible’, as it is embedded in the environment and everyday objects. The research project described herein aims to realize an Ambient Intelligence-based environment able to improve users’ quality of life by learning their habits and anticipating their needs. This environment is part of an adaptive, context-aware framework designed to make today’s incompatible heterogeneous domotic systems fully interoperable, not only for connecting sensors and actuators, but for providing comprehensive connections of devices to users. The solution is a middleware architecture based on open and widely recognized standards capable of abstracting the peculiarities of the underlying heterogeneous technologies and enabling them to co-exist and interwork, without however eliminating their differences. At the highest level of this infrastructure, the Ambient Intelligence framework, integrated with the domotic sensors, can enable the system to recognize any unusual or dangerous situations and anticipate health problems or special user needs in a technological living environment, such as a house or a public space.

  11. Technical Interoperability for Machine Connectivity on the Shop Floor

    Directory of Open Access Journals (Sweden)

    Magnus Åkerman

    2018-06-01

    Full Text Available This paper presents a generic technical solution that can increase Industry 4.0 maturity by collecting data from sensors and control systems on the shop floor. Within the research project “5G-Enabled Manufacturing”, an LTE (Long-Term Evolution) network with 5G technologies was deployed on the shop floor to enable fast and scalable connectivity. This network was used to connect a grinding machine to a remote private cloud where data was stored and streamed to a data analytics center. This enabled visibility and transparency of the production data, which is the basis for Industry 4.0 and smart manufacturing. The solution is described with a focus on high-level communication technologies above the wireless communication standards. These technologies are discussed with regard to technical interoperability, focusing on the system layout, communication standards, and open systems. From the discussion, it can be derived that generic solutions such as this are possible, but manufacturing end-users must expand and further internalize knowledge of future information and communication technologies to reduce their dependency on equipment and technology providers.

  12. Enterprise Interoperability - Proceedings of the 5th International IFIP Working Conference on Enterprise Interoperability, IWEI 2013

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.; Unknown, [Unknown

    IWEI is an International IFIP Working Conference covering all aspects of enterprise interoperability with the purpose of achieving flexible cross-organizational collaboration through integrated support at business and technical levels. It provides a forum for discussing ideas and results among both

  13. 77 FR 19575 - Promoting Interoperability in the 700 MHz Commercial Spectrum; Interoperability of Mobile User...

    Science.gov (United States)

    2012-04-02

    ... efforts and/or through modifications to the Commission's technical rules or other regulatory measures. The... regulatory measures. \\1\\ The Commission has a longstanding interest in promoting the interoperability of... standards for Long-Term Evolution (LTE) wireless broadband technology are developed by the 3rd Generation...

  14. Robust Adaptive LCMV Beamformer Based On An Iterative Suboptimal Solution

    Directory of Open Access Journals (Sweden)

    Xiansheng Guo

    2015-06-01

    Full Text Available The main drawback of the closed-form solution of the linearly constrained minimum variance (CF-LCMV) beamformer is the dilemma between acquiring a long observation time for stable covariance matrix estimates and a short observation time to track the dynamic behavior of targets, leading to poor performance at low signal-to-noise ratio (SNR), low jammer-to-noise ratios (JNRs) and small numbers of snapshots. Additionally, CF-LCMV suffers from a heavy computational burden, which mainly comes from the two matrix inverse operations needed to compute the optimal weight vector. In this paper, we derive a low-complexity Robust Adaptive LCMV beamformer based on an Iterative Suboptimal solution (RAIS-LCMV) using the conjugate gradient (CG) optimization method. The merit of our proposed method is threefold. Firstly, the RAIS-LCMV beamformer reduces the complexity of CF-LCMV remarkably. Secondly, the RAIS-LCMV beamformer can adjust its output adaptively based on the measurements, and its convergence speed is comparable. Finally, the RAIS-LCMV algorithm has robust performance against low SNR, low JNRs, and small numbers of snapshots. Simulation results demonstrate the superiority of our proposed algorithms.
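    The key idea above is replacing explicit matrix inversion with a conjugate-gradient iteration. The sketch below is generic textbook CG for a small symmetric positive-definite system Ax = b, the kind of solve that stands in for an inverse; it is an illustration of CG itself, not the paper's exact RAIS-LCMV weight update.

```python
def mat_vec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * v for a, v in zip(row, x)) for row in A]


def dot(u, v):
    return sum(a * b for a, b in zip(u, v))


def conjugate_gradient(A, b, iters=50, tol=1e-12):
    """Solve Ax = b for symmetric positive-definite A without forming A's inverse."""
    x = [0.0] * len(b)
    r = b[:]          # residual b - A*x with x = 0
    p = r[:]          # initial search direction
    rs = dot(r, r)
    for _ in range(iters):
        Ap = mat_vec(A, p)
        alpha = rs / dot(p, Ap)                 # exact step along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol:                        # residual small enough: done
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x


# SPD stand-in for a covariance-like matrix; exact solution is [1/11, 7/11].
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

    Each iteration costs one matrix-vector product and a few vector updates, which is the source of the complexity reduction over the two explicit inversions in CF-LCMV; for an n-by-n system, CG also terminates in at most n steps in exact arithmetic.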

  15. RFID in libraries a step toward interoperability

    CERN Document Server

    Ayre, Lori Bowen

    2012-01-01

    The approval by the National Information Standards Organization (NISO) of a new standard for RFID in libraries is a big step toward interoperability among libraries and vendors. By following this set of practices and procedures, libraries can ensure that an RFID tag in one library can be used seamlessly by another, assuming both comply, even if they have different suppliers for tags, hardware, and software. In this issue of Library Technology Reports, Lori Bowen Ayre, an experienced implementer of automated materials handling systems, provides background on the evolution of the standard

  16. Web services for distributed and interoperable hydro-information systems

    Science.gov (United States)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, a prototype customised for web-based hydro-information systems. T-DSS provides mapping services, database-related services and access to remote components, with special emphasis placed on output flexibility (e.g. multilingualism); SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions) and by modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.

  17. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Directory of Open Access Journals (Sweden)

    Katayama Toshiaki

    2011-08-01

    Full Text Available Abstract Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of

  18. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    Science.gov (United States)

    2011-01-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP

  19. Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things

    Science.gov (United States)

    Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan

    2015-01-01

    Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their proprietary protocols based on their targeted applications. Consequently, IoT becomes heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step to communicate with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standard-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
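    One of the efficiency metrics above, message size, is easy to illustrate: encode the same observation in two wire formats and compare the byte counts. This is a toy sketch of the kind of measurement involved, not the paper's actual benchmark harness, and the field names are invented.

```python
import json
import xml.etree.ElementTree as ET

# One hypothetical sensor observation, to be encoded two different ways.
obs = {"sensor": "temp-1", "value": 21.7, "unit": "Cel"}

# Compact JSON encoding (no whitespace), as a CoAP/REST-style service might use.
json_msg = json.dumps(obs, separators=(",", ":")).encode()

# Element-per-field XML encoding, as a SOS/SOAP-style service might use.
root = ET.Element("Observation")
for key, value in obs.items():
    ET.SubElement(root, key).text = str(value)
xml_msg = ET.tostring(root)

# The benchmark metric: bytes on the wire per message.
sizes = {"json": len(json_msg), "xml": len(xml_msg)}
```

    On resource-constrained devices this difference matters twice over: smaller messages cost less radio time and fit more easily into the small packet sizes that constrained-network protocols assume.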

  1. Design and study of geosciences data share platform :platform framework, data interoperability, share approach

    Science.gov (United States)

    Lu, H.; Yi, D.

    2010-12-01

    Deep exploration is one of the important approaches to Geoscience research. Since the 1980s we have carried out deep exploration and accumulated a large amount of data. Researchers usually integrate data from both space exploration and deep exploration to study geological structures and represent the Earth's subsurface, and they analyze and interpret on the basis of the integrated data. The different exploration approaches result in heterogeneous data, so data access has always been an issue that confuses researchers. The problem of data sharing and interaction has to be solved during the development of the SinoProbe research project. Through a survey of well-known domestic and overseas exploration projects and geoscience data platforms, this study explores a solution for data sharing and interaction. Based on SOA, we present a deep exploration data sharing framework comprising three levels: the data level handles data storage and the integration of heterogeneous data; the middle level provides data services for geophysics, geochemistry, etc. by means of Web services, and carries out various application combinations using GIS middleware and the Eclipse RCP; the interaction level provides professional and non-professional users access to data of different accuracy. The framework adopts the GeoSciML data interaction approach. GeoSciML is a geoscience information markup language, an application of the OpenGIS Consortium's (OGC) Geography Markup Language (GML). It transfers heterogeneous data into one earth frame and implements interoperation. In this article we discuss how to integrate heterogeneous data and share data in the SinoProbe project.
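
    The core of the GeoSciML approach is mapping heterogeneous markup records into one common frame. A minimal sketch of that idea, using Python's standard XML parser: the element names and namespace URIs below are illustrative placeholders, not the real GeoSciML schema.

```python
import xml.etree.ElementTree as ET

# A minimal GeoSciML-like fragment (element names and namespaces are
# illustrative only, not the actual GeoSciML vocabulary).
DOC = """
<gsml:GeologicUnit xmlns:gsml="urn:x-example:gsml" xmlns:gml="urn:x-example:gml">
  <gml:name>Granite body A</gml:name>
  <gsml:observationMethod>deep seismic reflection</gsml:observationMethod>
</gsml:GeologicUnit>
"""

NS = {"gsml": "urn:x-example:gsml", "gml": "urn:x-example:gml"}

def to_common_record(xml_text: str) -> dict:
    """Map one markup-language record into a neutral dict that
    heterogeneous data stores could exchange at the middle level."""
    root = ET.fromstring(xml_text)
    return {
        "name": root.findtext("gml:name", namespaces=NS),
        "method": root.findtext("gsml:observationMethod", namespaces=NS),
    }

record = to_common_record(DOC)
print(record)
```

    A real share platform would expose such records through Web services rather than parsing strings locally, but the interoperation step is the same: schema-specific markup in, common-frame record out.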

  2. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Web Ontology Language). Moreover, OWL will also be used to create the semantic web service specifications.
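
    Automated service composition of the kind described can be sketched as forward chaining over service signatures: each service advertises the concepts it consumes and produces, and a planner picks services until the goal concept is reachable. The service names and concepts below are invented for illustration; a real system would match OWL concepts with a reasoner rather than exact set membership.

```python
# Hypothetical service descriptions, loosely mirroring semantically
# annotated interfaces: inputs and outputs are ontology concepts.
SERVICES = {
    "FetchOrder":   {"in": {"OrderId"},      "out": {"Order"}},
    "CheckStock":   {"in": {"Order"},        "out": {"StockReport"}},
    "IssueInvoice": {"in": {"StockReport"},  "out": {"Invoice"}},
}

def compose(available: set, goal: str) -> list:
    """Greedy forward chaining: repeatedly pick a service whose inputs
    are already satisfied until the goal concept is produced."""
    plan, known = [], set(available)
    while goal not in known:
        progressed = False
        for name, sig in SERVICES.items():
            if name not in plan and sig["in"] <= known:
                plan.append(name)
                known |= sig["out"]
                progressed = True
                break
        if not progressed:
            raise ValueError("goal unreachable from available concepts")
    return plan

print(compose({"OrderId"}, "Invoice"))
```

    Each step of the resulting plan corresponds to one element of the BPMN flow diagram calling a semantic web service.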

  3. MPEG-4 solutions for virtualizing RDP-based applications

    Science.gov (United States)

    Joveski, Bojan; Mitrea, Mihai; Ganji, Rama-Rao

    2014-02-01

    The present paper provides the proof-of-concept for the use of the MPEG-4 multimedia scene representations (BiFS and LASeR) as a virtualization tool for RDP-based applications (e.g. MS Windows applications). Two main applicative benefits are thus granted. First, any legacy application can be virtualized without additional programming effort. Second, heterogeneous mobile devices (different manufacturers, OS) can collaboratively enjoy full multimedia experiences. From the methodological point of view, the main novelty consists in (1) designing an architecture allowing the conversion of the RDP content into a semantic multimedia scene-graph and its subsequent rendering on the client and (2) providing the underlying scene-graph management and interactivity tools. Experiments consider 5 users and two RDP applications (MS Word and Internet Explorer), and benchmark our solution against two state-of-the-art technologies (VNC and FreeRDP). The visual quality is evaluated by six objective measures (e.g. PSNR). The results show that: (1) for text editing, the MPEG solutions outperform VNC by a factor of 1.8 while being 2 times heavier than FreeRDP; (2) for Internet browsing, the MPEG solutions outperform both VNC and FreeRDP by factors of 1.9 and 1.5, respectively. The average round-trip times (less than 40 ms) cope with real-time application constraints.
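
    PSNR, one of the objective visual-quality measures mentioned, compares a rendered frame against a reference frame via the mean squared error. A minimal sketch for 8-bit images given as flat pixel sequences (the sample pixel values are invented, not data from the paper):

```python
import math

def psnr(reference, rendered, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized
    8-bit images; higher means the rendered frame is closer to the
    reference. Identical frames give infinity."""
    mse = sum((a - b) ** 2 for a, b in zip(reference, rendered)) / len(reference)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

ref = [52, 55, 61, 66, 70, 61, 64, 73]
out = [52, 54, 61, 66, 70, 62, 64, 73]
print(round(psnr(ref, out), 2))
```

    In a remote-display benchmark like this one, the reference frame comes from the server-side framebuffer and the rendered frame from the client after transport and decoding.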

  4. Towards E-Society Policy Interoperability

    Science.gov (United States)

    Iannella, Renato

    The move towards the Policy-Oriented Web is destined to provide support for policy expression and management in the core web layers. One of the most promising areas that can drive this new technology adoption is e-Society communities. With so much user-generated content being shared by these social networks, there is the real danger that the implicit sharing rules that communities have developed over time will be lost in translation in the new digital communities. This will lead to a corresponding loss in confidence in e-Society sites. The Policy-Oriented Web attempts to turn the implicit into the explicit with a common framework for policy language interoperability and awareness. This paper reports on the policy driving factors from the Social Networks experiences using real-world use cases and scenarios. In particular, the key functions of policy-awareness - for privacy, rights, and identity - will be the driving force that enables the e-Society to appreciate new interoperable policy regimes.

  5. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  6. Reference architecture for interoperability testing of Electric Vehicle charging

    NARCIS (Netherlands)

    Lehfuss, F.; Nohrer, M.; Werkmany, E.; Lopezz, J.A.; Zabalaz, E.

    2015-01-01

    This paper presents a reference architecture for interoperability testing of electric vehicles as well as their support equipment with the smart grid and the e-Mobility environment. Pan-European Electric Vehicle (EV)-charging is currently problematic as there are compliance and interoperability

  7. Interoperability of Demand Response Resources Demonstration in NY

    Energy Technology Data Exchange (ETDEWEB)

    Wellington, Andre

    2014-03-31

    The Interoperability of Demand Response Resources Demonstration in NY (Interoperability Project) was awarded to Con Edison in 2009. The objective of the project was to develop and demonstrate methodologies to enhance the ability of customer-sited Demand Response resources to integrate more effectively with electric delivery companies and regional transmission organizations.

  8. Promoting Interoperability: The Case for Discipline-Specific PSAPS

    Science.gov (United States)

    2014-12-01

    multijurisdictional, interoperability is a key factor for success. Responses to 9/11, the Oso mudslides in Washington, the Boston Marathon bombing... Functional Interoperability: As demonstrated by the 9/11 attacks, the Oso mudslide in Washington, the Boston Marathon bombing, and other large

  9. On the applicability of schema integration techniques to database interoperation

    NARCIS (Netherlands)

    Vermeer, Mark W.W.; Apers, Peter M.G.

    1996-01-01

    We discuss the applicability of schema integration techniques developed for tightly-coupled database interoperation to interoperation of databases stemming from different modelling contexts. We illustrate that in such an environment, it is typically quite difficult to infer the real-world semantics

  10. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  11. Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gaidon, Clement [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Poplawski, Michael [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-10-31

    First in a series of studies that focuses on interoperability as realized by the use of Application Programming Interfaces (APIs), this report explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.

  12. Establishing Interoperability of a Blog Archive through Linked Open Data

    DEFF Research Database (Denmark)

    Kalb, Hendrik; Lazaridou, Paraskevi; Trier, Matthias

    2013-01-01

    on archived data. However, interoperability among BlogForever archives, as well as with other digital libraries, is necessary in order to avoid silos of data. In this paper, we reveal some of our efforts to establish interoperability through the application of Linked Open Data.
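
    Exposing an archived post as Linked Open Data amounts to publishing RDF triples about it under a dereferenceable URI. A minimal sketch, serializing one post to Turtle; the archive URI is a placeholder and the vocabulary choice (SIOC types plus Dublin Core terms) is illustrative, not necessarily what BlogForever uses.

```python
# A single archived blog post as a plain record (values invented).
POST = {
    "uri": "http://archive.example/post/42",
    "title": "On web archiving",
    "created": "2013-04-01",
}

def to_turtle(post: dict) -> str:
    """Serialize one post as RDF triples in Turtle syntax."""
    lines = [
        f'<{post["uri"]}> a <http://rdfs.org/sioc/types#BlogPost> ;',
        f'    <http://purl.org/dc/terms/title> "{post["title"]}" ;',
        f'    <http://purl.org/dc/terms/created> "{post["created"]}" .',
    ]
    return "\n".join(lines)

print(to_turtle(POST))
```

    Because the predicates come from shared vocabularies, any Linked Data client can consume these triples without knowing the archive's internal schema, which is what breaks the data silo.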

  13. Trust and Privacy Solutions Based on Holistic Service Requirements

    Science.gov (United States)

    Sánchez Alcón, José Antonio; López, Lourdes; Martínez, José-Fernán; Rubio Cifuentes, Gregorio

    2015-01-01

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens’ information about their activity, preferences, habits, etc., opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one of the possible solutions to manage this heterogeneity, bearing in mind that these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing. PMID:26712752

  15. Upon a Home Assistant Solution Based on Raspberry Pi Platform

    Directory of Open Access Journals (Sweden)

    Alexandru Florentin IFTIMIE

    2017-01-01

    Full Text Available Our ongoing research on the Internet of Things (IoT) has been focused on a project aiming to create a proof of concept for a distributed system capable of controlling common devices found in a house, such as TVs, air-conditioning units, and other electrical devices. In order to automate these devices, the system integrates various sensors and actuators and, depending on the user’s needs and creativity in conceiving and implementing new commands, the system executes those commands in a safe and secure manner. This paper presents our current research results on a personal home assistant solution designed and built around the Raspberry Pi V3 platform. The distributed client-server approach enables users to control home electric and electronic devices from an Android-based mobile application.

  16. Algorithms for synthesizing management solutions based on OLAP-technologies

    Science.gov (United States)

    Pishchukhin, A. M.; Akhmedyanova, G. F.

    2018-05-01

    OLAP technologies are a convenient means of analyzing large amounts of information. In this work, an attempt was made to improve the synthesis of optimal management decisions. The developed algorithms forecast the needs and the management decisions for the main types of enterprise resources. Their advantage is efficiency, stemming from the simplicity of quadratic functions and first-order differential equations. At the same time, resources are optimally redistributed between the different product types in the enterprise's assortment, and the allocated resources are optimally distributed over time. The proposed solutions can be placed on additional, specially introduced coordinates of the hypercube representing the data warehouse.
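
    With quadratic returns the redistribution step has a closed form. A minimal sketch under assumed per-product returns of the form a*x - b*x**2 (coefficients invented, not from the paper): the first-order Lagrange condition a_i - 2*b_i*x_i = λ, with the budget constraint, gives λ and hence each allocation directly. Nonnegativity of allocations is not enforced here.

```python
def allocate(products, total):
    """Split a resource budget across products with quadratic returns
    a*x - b*x**2 (each product given as a pair (a, b), b > 0).
    Solves the first-order condition a_i - 2*b_i*x_i = lam together
    with sum(x_i) = total; nonnegativity is not enforced."""
    inv = sum(1 / (2 * b) for a, b in products)
    lam = (sum(a / (2 * b) for a, b in products) - total) / inv
    return [(a - lam) / (2 * b) for a, b in products]

# Two products from a hypothetical assortment, budget of 4 units.
shares = allocate([(10.0, 1.0), (6.0, 1.0)], total=4.0)
print(shares)
```

    The same closed form, evaluated per time period, also covers the distribution of an allocated resource over time.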

  17. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance’s approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, alongside the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance’s full spectrum of missions.

  18. Common business objects: Demonstrating interoperability in the oil and gas industry

    International Nuclear Information System (INIS)

    McLellan, S.G.; Abusalbi, N.; Brown, J.; Quinlivan, W.F.

    1997-01-01

    The PetroTechnical Open Software Corp. (POSC) was organized in 1990 to define technical methods to make it easier to design interoperable data solutions for oil and gas companies. When POSC rolls out seed implementations, oilfield service members must validate them, correct any errors or ambiguities, and champion these corrections into the original specifications before full integration into POSC-compliant, commercial products. Organizations like POSC are assuming a new role of promoting formation of projects where E and P companies and vendors jointly test their pieces of the migration puzzle on small subsets of the whole problem. The authors describe three such joint projects. While confirming the value of such open cross-company cooperation, these cases also help to redefine interoperability in terms of business objects that will be common across oilfield companies, their applications, access software, data, or data stores

  19. Gastric Outlet Obstruction Palliation: A Novel Stent-Based Solution

    Directory of Open Access Journals (Sweden)

    Natasha M. Rueth

    2010-06-01

    Full Text Available Gastric outlet obstruction (GOO after esophagectomy is a morbid outcome and significantly hinders quality of life for end-stage esophageal cancer patients. In the pre-stent era, palliation consisted of chemotherapy, radiation, tumor ablation, or stricture dilation. In the current era, palliative stenting has emerged as an additional tool; however, migration and tumor ingrowth are ongoing challenges. To mitigate these challenges, we developed a novel, hybrid, stent-based approach for the palliative management of GOO. We present a patient with esophageal cancer diagnosed with recurrent, metastatic disease 1 year after esophagectomy. She developed dehydration and intractable emesis, which significantly interfered with her quality of life. For palliation, we dilated the stenosis and proceeded with our stent-based solution. Using a combined endoscopic and fluoroscopic approach, we placed a 12-mm silicone salivary bypass tube across the pylorus, where it kinked slightly because of local tumor biology. To bridge this defect and ensure luminal patency, we placed a nitinol tracheobronchial stent through the silicone stent. Clinically, the patient had immediate relief from her pre-operative symptoms and was discharged home on a liquid diet. In conclusion, GOO and malignant dysphagia after esophagectomy are significant challenges for patients with end-stage disease. Palliative stenting is a viable option, but migration and tumor ingrowth are common complications. The hybrid approach presented here provides a unique solution to these potential pitfalls. The flared silicone tube minimized the chance of migration and impaired tumor ingrowth. The nitinol stent aided with patency and overcame the challenges of the soft tube. This novel strategy achieved palliation, describing another endoscopic option in the treatment of malignant GOO.

  20. Numerical solution of modified differential equations based on symmetry preservation.

    Science.gov (United States)

    Ozbenli, Ersin; Vedula, Prakash

    2017-12-01

    In this paper, we propose a method to construct invariant finite-difference schemes for solution of partial differential equations (PDEs) via consideration of modified forms of the underlying PDEs. The invariant schemes, which preserve Lie symmetries, are obtained based on the method of equivariant moving frames. While it is often difficult to construct invariant numerical schemes for PDEs due to complicated symmetry groups associated with cumbersome discrete variable transformations, we note that symmetries associated with more convenient transformations can often be obtained by appropriately modifying the original PDEs. In some cases, modifications to the original PDEs are also found to be useful in order to avoid trivial solutions that might arise from particular selections of moving frames. In our proposed method, modified forms of PDEs can be obtained either by addition of perturbation terms to the original PDEs or through defect correction procedures. These additional terms, whose primary purpose is to enable symmetries with more convenient transformations, are then removed from the system by considering moving frames for which these specific terms go to zero. Further, we explore selection of appropriate moving frames that result in improvement in accuracy of invariant numerical schemes based on modified PDEs. The proposed method is tested using the linear advection equation (in one and two dimensions) and the inviscid Burgers' equation. Results obtained for these test cases indicate that numerical schemes derived from the proposed method perform significantly better than existing schemes not only by virtue of improvement in numerical accuracy but also due to preservation of qualitative properties or symmetries of the underlying differential equations.
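
    For the linear advection test equation u_t + c*u_x = 0, even the plain first-order upwind scheme (a standard baseline, not the paper's invariant scheme) preserves one of the equation's symmetries exactly on a periodic grid: space translation. The sketch below advances a pulse and checks that shifting the initial data commutes with advancing the solution; all grid parameters are illustrative.

```python
def upwind_step(u, c, dt, dx):
    """One explicit upwind step for u_t + c*u_x = 0 with c > 0 on a
    periodic grid (u[-1] wraps around as the left neighbour)."""
    r = c * dt / dx  # CFL number; stable for 0 < r <= 1
    return [u[i] - r * (u[i] - u[i - 1]) for i in range(len(u))]

n, c, dx, dt = 8, 1.0, 1.0, 0.5  # CFL number 0.5
u0 = [1.0 if i == 2 else 0.0 for i in range(n)]

u = u0
for _ in range(4):
    u = upwind_step(u, c, dt, dx)

# Space-translation symmetry: advancing shifted initial data must equal
# shifting the advanced solution (exactly, since the float operations
# are identical up to relabelling of indices).
shifted = u0[-1:] + u0[:-1]
v = shifted
for _ in range(4):
    v = upwind_step(v, c, dt, dx)
print(v == u[-1:] + u[:-1])
```

    The harder symmetries addressed in the paper (e.g. Galilean transformations for Burgers' equation) are precisely the ones such standard schemes fail to preserve, which motivates the moving-frame construction.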

  1. Thermodynamics of dilute aqueous solutions of imidazolium based ionic liquids

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Tejwant [Salt and Marine Chemicals Division, Central Salt and Marine Chemicals Research Institute, Council of Scientific and Industrial Research (CSIR), G.B. Marg, Bhavnagar 364002 (India); Kumar, Arvind, E-mail: arvind@csmcri.or [Salt and Marine Chemicals Division, Central Salt and Marine Chemicals Research Institute, Council of Scientific and Industrial Research (CSIR), G.B. Marg, Bhavnagar 364002 (India)

    2011-06-15

    Research highlights: The thermodynamic behaviour of aqueous imidazolium ILs has been investigated. Volumetric and ultrasonic results indicated the hydrophobic hydration of ILs. Viscometric studies revealed the studied ionic liquids to be water-structure makers. Hydration number increased with increasing alkyl chain length of the cation. - Abstract: Experimental measurements of density ρ, speed of sound u, and viscosity η of aqueous solutions of various 1-alkyl-3-methylimidazolium based ionic liquid (IL) solutions have been performed in the dilute concentration regime at 298.15 K to gain insight into the hydration behaviour of ILs. The investigated ILs are based on the 1-alkyl-3-methylimidazolium cation, [Cnmim], having [BF4]−, [Cl]−, [C1OSO3]−, and [C8OSO3]− as anions, where n = 4 or 8. Several thermodynamic parameters such as apparent molar volume φV, isentropic compressibility βs, and viscosity B-coefficients have been derived from the experimental data. The limiting value of the apparent molar volume has been discussed in terms of intrinsic molar volume (Vint), molar electrostriction volume (Velec), molar disordered volume (Vdis), and cage volume (Vcage). Viscosity B-coefficients have been used to quantify the kosmotropic or chaotropic nature of the ILs. Hydration numbers of the ILs obtained using electrostriction volume, isentropic compressibility, viscosity, and differential scanning calorimetry have been found to be comparable within experimental error. Hydrophobic hydration has been found to play an important role in the hydration of ILs as compared to hydration due to hydrogen bonding and electrostriction. Limiting molar properties, hydration numbers, and B-coefficients have been discussed in terms of the alkyl chain length of the cation or the nature of the anion.
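
    The two derived quantities named in the abstract follow from textbook relations, not from code in the paper: the apparent molar volume φV from the measured densities, and the isentropic compressibility βs from the Newton-Laplace equation. The numbers below are illustrative placeholders, not data from the study.

```python
def apparent_molar_volume(m, rho, rho0, M):
    """phi_V in cm^3/mol from molality m (mol/kg), solution density rho
    and solvent density rho0 (both g/cm^3), and solute molar mass M
    (g/mol): phi_V = M/rho - 1000*(rho - rho0)/(m*rho*rho0)."""
    return M / rho - 1000.0 * (rho - rho0) / (m * rho * rho0)

def isentropic_compressibility(rho, u):
    """beta_s = 1/(rho*u**2) via the Newton-Laplace equation, with
    rho in kg/m^3 and speed of sound u in m/s; result in 1/Pa."""
    return 1.0 / (rho * u * u)

# Illustrative values only (a dilute aqueous solution near 298.15 K).
phi_v = apparent_molar_volume(m=0.05, rho=0.99750, rho0=0.99705, M=226.02)
beta_s = isentropic_compressibility(rho=997.0, u=1497.0)
print(round(phi_v, 2), beta_s)
```

    Extrapolating φV to zero concentration gives the limiting apparent molar volume discussed in the abstract, and the concentration dependence of βs feeds the compressibility-based hydration numbers.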

  3. Trust Model to Enhance Security and Interoperability of Cloud Environment

    Science.gov (United States)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability among current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose different providers' services and resources in heterogeneous domains can cooperate. The model is domain-based: it divides one cloud provider's resource nodes into the same domain and sets a trust agent. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.

  4. Operational Plan Ontology Model for Interconnection and Interoperability

    Science.gov (United States)

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    Aiming at the assistant decision-making system's bottleneck in processing operational plan data and information, this paper starts from an analysis of the problems of traditional representations and the technical advantages of ontologies. It then defines the elements of the operational plan ontology model and determines the basis of its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines the expression of operational plans based on the ontology model and its use in application software. Thus, this paper has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  5. Adaptation of interoperability standards for cross domain usage

    Science.gov (United States)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to retrieve relevant, readable information applicable to a specific problem out of a large data stock at the right time. These challenges of enabling data sharing across national, organizational and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions such as domain-specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is making civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  6. Optimisation-Based Solution Methods for Set Partitioning Models

    DEFF Research Database (Denmark)

    Rasmussen, Matias Sevel

    The scheduling of crew, i.e. the construction of work schedules for crew members, is often not a trivial task, but a complex puzzle. The task is complicated by rules, restrictions, and preferences. Therefore, manual solutions as well as solutions from standard software packages are not always sufficient with respect to solution quality and solution time. Enhancement of the overall solution quality as well as the solution time can be of vital importance to many organisations. The fields of operations research and mathematical optimisation deal with mathematical modelling of difficult scheduling problems (among other topics). The fields also deal with the development of sophisticated solution methods for these mathematical models. This thesis describes the set partitioning model which has been widely used for modelling crew scheduling problems. Integer properties for the set partitioning model are shown
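
    The set partitioning model selects a minimum-cost collection of candidate duties so that every task is covered exactly once. A brute-force sketch on a toy instance makes the model concrete (the duties and costs below are invented; real instances need the sophisticated methods the thesis develops, not enumeration):

```python
from itertools import combinations

def set_partition(universe, subsets, costs):
    """Brute-force the set partitioning model: choose subsets covering
    every element of the universe exactly once at minimum total cost.
    Only viable for tiny instances; returns (cost, chosen indices)."""
    best = (float("inf"), None)
    for r in range(1, len(subsets) + 1):
        for pick in combinations(range(len(subsets)), r):
            covered = [e for i in pick for e in subsets[i]]
            # Exact cover: each element appears exactly once.
            if sorted(covered) == sorted(universe):
                cost = sum(costs[i] for i in pick)
                best = min(best, (cost, pick))
    return best

# Toy crew-scheduling instance: tasks 1..4, candidate duties with costs.
subsets = [{1, 2}, {3, 4}, {1, 3}, {2, 4}, {1, 2, 3, 4}]
costs = [3, 3, 4, 4, 7]
print(set_partition([1, 2, 3, 4], subsets, costs))
```

    In the integer-programming formulation the same model reads: minimize c·x subject to A·x = 1 and x binary, where column j of A encodes which tasks duty j covers.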

  7. Providing interoperability of eHealth communities through peer-to-peer networks.

    Science.gov (United States)

    Kilic, Ozgur; Dogac, Asuman; Eichelberg, Marco

    2010-05-01

    Providing an interoperability infrastructure for Electronic Healthcare Records (EHRs) is on the agenda of many national and regional eHealth initiatives. Two important integration profiles have been specified for this purpose, namely, the "Integrating the Healthcare Enterprise (IHE) Cross-enterprise Document Sharing (XDS)" and the "IHE Cross Community Access (XCA)." IHE XDS describes how to share EHRs in a community of healthcare enterprises and IHE XCA describes how EHRs are shared across communities. However, the current version of the IHE XCA integration profile does not address some of the important challenges of cross-community exchange environments. The first challenge is scalability. If every community that joins the network needs to connect to every other community, i.e., a pure peer-to-peer network, this solution will not scale. Furthermore, each community may use a different coding vocabulary for the same metadata attribute, in which case, the target community cannot interpret the query involving such an attribute. Yet another important challenge is that each community may (and typically will) have a different patient identifier domain. Querying for the patient identifiers in the target community using patient demographic data may create patient privacy concerns. In this paper, we address each of these challenges and show how they can be handled effectively in a superpeer-based peer-to-peer architecture.
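The scalability argument above can be sketched in a few lines: instead of every community connecting to every other (quadratic growth in links), each community registers only with a superpeer, which translates metadata codes through a canonical vocabulary and fans queries out. All class, code and document names below are hypothetical stand-ins, not the IHE XCA API:

```python
class Community:
    """A healthcare community with its own metadata vocabulary."""
    def __init__(self, name, vocab, documents):
        self.name = name
        self.vocab = vocab          # local code -> canonical concept
        self.documents = documents  # local code -> list of document ids

    def query(self, local_code):
        return self.documents.get(local_code, [])

class SuperPeer:
    """Routes cross-community queries and maps metadata codes through
    a canonical vocabulary, so communities never connect pairwise:
    the number of links grows linearly with the number of communities."""
    def __init__(self):
        self.communities = []

    def register(self, community):
        self.communities.append(community)

    def federated_query(self, source, local_code):
        concept = source.vocab[local_code]   # lift to canonical concept
        results = []
        for target in self.communities:
            if target is source:
                continue
            # reverse-map the concept into the target's own vocabulary
            for code, c in target.vocab.items():
                if c == concept:
                    results.extend(target.query(code))
        return results

peer = SuperPeer()
a = Community("A", {"LAB-1": "hemoglobin"}, {"LAB-1": ["docA1"]})
b = Community("B", {"HGB": "hemoglobin"}, {"HGB": ["docB1", "docB2"]})
peer.register(a); peer.register(b)
hits = peer.federated_query(a, "LAB-1")  # B's documents for the same concept
```

The patient-identifier mapping challenge the abstract mentions would sit at the same layer, replacing the vocabulary lookup with an identifier cross-reference.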

  8. Agile Management and Interoperability Testing of SDN/NFV‐Enriched 5G Core Networks

    Directory of Open Access Journals (Sweden)

    Taesang Choi

    2018-02-01

    In the fifth generation (5G) era, the radio internet protocol capacity is expected to reach 20 Gb/s per sector, and ultralarge content traffic will travel across a faster wireless/wireline access network and packet core network. Moreover, the massive and mission-critical Internet of Things is the main differentiator of 5G services. These types of real-time and large-bandwidth-consuming services require a radio latency of less than 1 ms and an end-to-end latency of less than a few milliseconds. By distributing 5G core nodes closer to cell sites, the backhaul traffic volume and latency can be significantly reduced by having mobile devices download content immediately from a closer content server. In this paper, we propose a novel solution based on software-defined network and network function virtualization technologies in order to achieve agile management of 5G core network functionalities, with a proof-of-concept implementation targeted for the PyeongChang Winter Olympics, and describe the results of interoperability testing between two core networks.

  9. Secure Interoperable Open Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Magee, Thoman [Consolidated Edison Company Of New York, Inc., NY (United States)

    2014-12-28

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems, as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison’s grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison’s long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. For example, Con Edison is expecting significant long-term growth of DG

  10. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  11. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  12. Analysis of Android Device-Based Solutions for Fall Detection

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928
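A minimal example of the kind of threshold-based detection algorithm this literature surveys: an impact spike in accelerometer magnitude followed by a near-rest window. The thresholds, window length and data below are illustrative assumptions, not values from any particular paper:

```python
import math

def detect_fall(samples, impact_g=2.5, rest_g=0.4, window=5):
    """Flag a fall when a high-magnitude impact is followed by a
    low-magnitude (lying still) window. samples: (x, y, z) readings
    in g. Thresholds here are illustrative only."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    for i, m in enumerate(mags):
        if m >= impact_g:
            after = mags[i + 1:i + 1 + window]
            # post-impact inactivity suggests the subject did not get up
            if after and max(after) <= rest_g:
                return True
    return False

# Synthetic trace: normal motion (~1 g), one impact spike, then stillness.
trace = [(0, 0, 1.0)] * 10 + [(2.0, 1.5, 1.0)] + [(0.1, 0.1, 0.2)] * 6
```

The review's point about evaluation stands out here: without a shared reference dataset, the accuracy of such a detector cannot be compared meaningfully across papers.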

  13. Middleware Interoperability for Robotics: A ROS-YARP Framework

    Directory of Open Access Journals (Sweden)

    Plinio Moreno

    2016-10-01

    Middlewares are fundamental tools for progress in research and applications in robotics. They enable the integration of multiple heterogeneous sensing and actuation devices, as well as providing general purpose modules for key robotics functions (kinematics, navigation, planning). However, no existing middleware yet provides a complete set of functionalities for all robotics applications, and many robots may need to rely on more than one framework. This paper focuses on the interoperability between two of the most prevalent middlewares in robotics: YARP and ROS. Interoperability between middlewares should ideally allow users to execute existing software without the necessity of: (i) changing the existing code, and (ii) writing hand-coded "bridges" for each use-case. We propose a framework enabling the communication between existing YARP modules and ROS nodes for robotics applications in an automated way. Our approach generates the "bridging gap" code from a configuration file, connecting YARP ports and ROS topics through code-generated YARP Bottles. The configuration file must describe: (i) the sender entities, (ii) the way to group and convert the information read from the sender, (iii) the structure of the output message and (iv) the receiving entity. Our choice of many inputs to one output covers the most common use-case in robotics applications, where examples include filtering, decision making and visualization. We support YARP/ROS and ROS/YARP sender/receiver configurations, which are demonstrated on a humanoid-on-wheels robot that uses YARP for upper body motor control and visual perception, and ROS for mobile base control and navigation algorithms.
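The config-driven, many-inputs-to-one-output idea can be reduced to plain Python. The sketch below uses hypothetical stand-ins (plain dicts and callables) for YARP ports and ROS topics; it mirrors the four elements the configuration file is said to describe (senders, per-sender conversion, output structure, receiver), not the actual generated code of the paper:

```python
def make_bridge(config):
    """Build a many-to-one bridge function from a declarative config.
    All names are hypothetical stand-ins for middleware channels."""
    senders = config["senders"]      # (i) input channel names
    convert = config["convert"]      # (ii) per-channel value converter
    receiver = config["receiver"]    # (iv) callable consuming the bundle

    def bridge(readings):
        # (iii) group one converted reading per sender into one message
        bundle = {ch: convert[ch](readings[ch]) for ch in senders}
        receiver(bundle)
        return bundle
    return bridge

out = []
config = {
    "senders": ["camera", "encoder"],
    "convert": {"camera": lambda v: v.upper(), "encoder": lambda v: v * 2},
    "receiver": out.append,
}
bridge = make_bridge(config)
msg = bridge({"camera": "frame", "encoder": 21})
```

In the actual framework this mapping is emitted as generated code marshalling YARP Bottles rather than interpreted at runtime, but the structure of the configuration is the same.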

  14. Managing interoperability and complexity in health systems.

    Science.gov (United States)

    Bouamrane, M-M; Tao, C; Sarkar, I N

    2015-01-01

    In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis and improve the management of health organisations' resources. However, the vision of fully integrated health information eco-systems, which provide relevant information and useful knowledge at the point-of-care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches are proposed in order to address, harness and resolve some of the many remaining issues towards a greater integration of health information systems and extraction of useful or new knowledge from heterogeneous electronic data repositories.

  15. Interoperability science cases with the CDPP tools

    Science.gov (United States)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible to the end user, who is then able to discover data, cross-compare observations and modeled data, and finally perform in-depth analysis. Over the years these protocols, including SAMP from IVOA and EPN-TAP from the Europlanet 2020 RI community, backed by standard web services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP), including AMDA, the Propagation Tool and 3DView, among others. This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  16. Solving Interoperability in Translational Health. Perspectives of Students from the International Partnership in Health Informatics Education (IPHIE) 2016 Master Class.

    Science.gov (United States)

    Turner, Anne M; Facelli, Julio C; Jaspers, Monique; Wetter, Thomas; Pfeifer, Daniel; Gatewood, Laël Cranmer; Adam, Terry; Li, Yu-Chuan; Lin, Ming-Chin; Evans, R Scott; Beukenhorst, Anna; van Mens, Hugo Johan Theodoore; Tensen, Esmee; Bock, Christian; Fendrich, Laura; Seitz, Peter; Suleder, Julian; Aldelkhyyel, Ranyah; Bridgeman, Kent; Hu, Zhen; Sattler, Aaron; Guo, Shin-Yi; Mohaimenul, Islam Md Mohaimenul; Anggraini Ningrum, Dina Nur; Tung, Hsin-Ru; Bian, Jiantano; Plasek, Joseph M; Rommel, Casey; Burke, Juandalyn; Sohih, Harkirat

    2017-06-20

    In the summer of 2016 an international group of biomedical and health informatics faculty and graduate students gathered for the 16th meeting of the International Partnership in Health Informatics Education (IPHIE) masterclass at the University of Utah campus in Salt Lake City, Utah. This international biomedical and health informatics workshop was created to share knowledge and explore issues in biomedical health informatics (BHI). The goal of this paper is to summarize the discussions of biomedical and health informatics graduate students who were asked to define interoperability, and make critical observations to gather insight on how to improve biomedical education. Students were assigned to one of four groups and asked to define interoperability and explore potential solutions to current problems of interoperability in health care. We summarize here the student reports on the importance and possible solutions to the "interoperability problem" in biomedical informatics. Reports are provided from each of the four groups of highly qualified graduate students from leading BHI programs in the US, Europe and Asia. International workshops such as IPHIE provide a unique opportunity for graduate student learning and knowledge sharing. BHI faculty are encouraged to incorporate into their curriculum opportunities to exercise and strengthen student critical thinking to prepare our students for solving health informatics problems in the future.

  17. Managing Interoperability for GEOSS - A Report from the SIF

    Science.gov (United States)

    Khalsa, S. J.; Actur, D.; Nativi, S.; Browdy, S.; Eglitis, P.

    2009-04-01

    The Global Earth Observation System of Systems (GEOSS) is a coordinating and integrating framework for Earth observing and information systems, which are contributed on a voluntary basis by Members and Participating Organizations of the intergovernmental Group on Earth Observations (GEO). GEOSS exists to support informed decision making for the benefit of society, including the implementation of international environmental treaty obligations. GEO Members and Participating organizations use the GEOSS Common Infrastructure (GCI) to register their Earth observation resources, thereby making them discoverable and consumable by both humans and client applications. Essential to meeting GEO user needs is a process for supporting interoperability of observing, processing, modeling and dissemination capabilities. The GEO Standards and Interoperability Forum (SIF) was created to develop, implement and oversee this process. The SIF supports GEO organizations contributing resources to the GEOSS by helping them understand and work with the GEOSS interoperability guidelines and encouraging them to register their "interoperability arrangements" (standards or other ad hoc arrangements for interoperability) in the GEOSS standards registry, which is part of the GCI. These registered interoperability arrangements support the actual services used to achieve interoperability of systems. By making information about these interoperability arrangements available to users of the GEOSS the SIF enhances the understanding and utility of contributed resources. We describe the procedures that the SIF has enacted to carry out its work. To operate effectively the SIF uses a workflow system and is establishing a set of regional teams and domain experts. In the near term our work has focused on population and review of the GEOSS Standards Registry, but we are also developing approaches to achieving progressive convergence on, and uptake of, an optimal set of interoperability arrangements for all of

  18. A Gradient Based Iterative Solutions for Sylvester Tensor Equations

    Directory of Open Access Journals (Sweden)

    Zhen Chen

    2013-01-01

    proposed by Ding and Chen, 2005, and by using tensor arithmetic concepts, an iterative algorithm and its modification are established to solve the Sylvester tensor equation. Convergence analysis indicates that the iterative solutions always converge to the exact solution for arbitrary initial value. Finally, some examples are provided to show that the proposed algorithms are effective.
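A minimal NumPy sketch of a gradient-based iteration for the matrix Sylvester equation AX + XB = C, in the spirit of the Ding and Chen approach the abstract builds on (the tensor case generalizes this). The step size chosen here is one conservative possibility, not necessarily the paper's:

```python
import numpy as np

def sylvester_gradient(A, B, C, iters=200):
    """Gradient iteration for AX + XB = C:
    X <- X + mu * (A^T R + R B^T), with residual R = C - A X - X B.
    The step size mu below is a conservative illustrative choice."""
    mu = 1.0 / (np.linalg.norm(A) ** 2 + np.linalg.norm(B) ** 2)
    X = np.zeros_like(C, dtype=float)  # arbitrary initial value works
    for _ in range(iters):
        R = C - A @ X - X @ B
        X = X + mu * (A.T @ R + R @ B.T)
    return X

# Build a problem with a known solution, then recover it.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])
X_true = np.array([[1.0, 2.0], [3.0, 4.0]])
C = A @ X_true + X_true @ B
X = sylvester_gradient(A, B, C)
```

For this well-conditioned example the iterates converge to `X_true` to machine precision; the convergence analysis cited in the abstract establishes when such convergence holds for arbitrary initial values.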

  19. Exact angular momentum projection based on cranked HFB solution

    Energy Technology Data Exchange (ETDEWEB)

    Enami, Kenichi; Tanabe, Kosai; Yosinaga, Naotaka [Saitama Univ., Urawa (Japan). Dept. of Physics

    1998-03-01

    Exact angular momentum projection of cranked HFB solutions is carried out. It is reconfirmed from this calculation that cranked HFB solutions reproduce the intrinsic structure of deformed nucleus. The result also indicates that the energy correction from projection is important for further investigation of nuclear structure. (author)

  20. Hydrocarbon-based solution for drilling and damping wells

    Energy Technology Data Exchange (ETDEWEB)

    Orlov, G A; Davydova, A I; Dobroskok, B Ye; Kendis, M Sh; Salimov, M Kh; Zvagil' skiy, G Ye

    1982-01-01

    The proportions are, %: oil product 23-74.4; emulsifier 0.5-1.2; monoethanolamine 0.1-0.2; the rest mineral water. The solution is prepared as follows: the oil product (a 1:1 mixture of Romashkinskiy oilfield oil and bituminous distillate) is mixed with emulsifier (85%) and stabilizer (15%). Mineral water is gradually added to a density of 1.18 g/cm³. Mixing stops upon reaching the desired breakdown voltage, which characterizes a stable solution. This solution has a higher overall stability (electrostability 1.8-3.1 times higher) than the usual solution, as well as better structural-mechanical properties at lower viscosity. The solution remains rather stable even when clay powder is added at 700 g/L at temperatures up to 95°. It breaks down at a clay powder content of 350 g/L and a temperature of 70°. The solution can be used for opening layers and damping wells with temperatures of 95°. It is useful for drilling horizons with unstable rock. The solution currently in use is limited to wells with temperatures of 60° and to horizons without unstable rock. Due to cheaper additives, the new solution is 6.2 times cheaper per m³ than the one currently used.

  1. Trust and Privacy Solutions Based on Holistic Service Requirements

    Directory of Open Access Journals (Sweden)

    José Antonio Sánchez Alcón

    2015-12-01

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens' information about their activity, preferences, habits, etc., opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one possible solution to manage this heterogeneity, bearing in mind that these types of networks, such as Wireless Sensor Networks, have significant resource limitations. A knowledge and ontology management system is proposed to facilitate collaboration between the business, legal and technological areas. This will ease the implementation of adequate, specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert system processing.

  2. Radiation effects on viscosimetry of protein based solutions

    International Nuclear Information System (INIS)

    Sabato, S.F.; Lacroix, M.

    2002-01-01

    Due to their good functional properties allied to their excellent nutritional value, milk protein isolates and soy protein concentrates have gained increasing interest. These proteins can have their structural properties improved when certain treatments are applied, such as gamma irradiation, alone or in the presence of other compounds such as a plasticizer. In this work, solutions of those proteins were mixed with a generally recognized as safe plasticizer, glycerol. These mixtures (8% protein (w/v) base) at two ratios, 1:1 and 2:1 (protein:glycerol), were submitted to a gamma irradiation treatment (⁶⁰Co) at doses of 0, 5, 15 and 25 kGy, and their rheological performance was studied. As the irradiation dose increased, viscosity measurements decreased significantly (p<0.05) for the soy/glycerol and calcium caseinate/glycerol mixtures. The sodium caseinate/glycerol mixture showed a trend towards aggregation of macromolecules at a dose of 5 kGy, while the apparent viscosity of dispersions containing whey/glycerol remained almost constant as the irradiation dose increased. In the case of soy protein isolate and sodium caseinate, the 2:1 mixture showed a significantly higher viscosity (p<0.05) than the 1:1 mixture.

  3. Radiation effects on viscosimetry of protein based solutions

    Energy Technology Data Exchange (ETDEWEB)

    Sabato, S.F.; Lacroix, M. E-mail: monique.lacroix@inrs-iaf.uquebec.ca

    2002-03-01

    Due to their good functional properties allied to their excellent nutritional value, milk protein isolates and soy protein concentrates have gained increasing interest. These proteins can have their structural properties improved when certain treatments are applied, such as gamma irradiation, alone or in the presence of other compounds such as a plasticizer. In this work, solutions of those proteins were mixed with a generally recognized as safe plasticizer, glycerol. These mixtures (8% protein (w/v) base) at two ratios, 1:1 and 2:1 (protein:glycerol), were submitted to a gamma irradiation treatment (⁶⁰Co) at doses of 0, 5, 15 and 25 kGy, and their rheological performance was studied. As the irradiation dose increased, viscosity measurements decreased significantly (p<0.05) for the soy/glycerol and calcium caseinate/glycerol mixtures. The sodium caseinate/glycerol mixture showed a trend towards aggregation of macromolecules at a dose of 5 kGy, while the apparent viscosity of dispersions containing whey/glycerol remained almost constant as the irradiation dose increased. In the case of soy protein isolate and sodium caseinate, the 2:1 mixture showed a significantly higher viscosity (p<0.05) than the 1:1 mixture.

  4. Maritime Activities: Requirements for Improving Space Based Solutions

    Science.gov (United States)

    Cragnolini, A.; Miguel-Lago, M.

    2005-03-01

    Maritime initiatives cannot be pursued only within their own perimeter. Sector endeavours and the policies that govern them have wide-ranging implications and several links with other sectors of activity. A well-balanced relationship between sea exploitation, maritime transportation, environmental protection and security, ruled by national or international laws, will be a main issue for the future of all kinds of maritime activities. Scientific research and technology development, along with enlightened and appropriate institutional regulations, are relevant to ensure maritime sustainability. The use of satellite technology for monitoring international agreements should be closely coordinated and based on institutional consensus. Frequently, rules and new regulations set by policy makers are not demanding enough, due to a lack of knowledge about the possibilities offered by available technologies. Law enforcement actions could bring space technology new opportunities to offer solutions for monitoring and verification. Operators should aim at offering space data in a more operational and user-friendly way, providing users with useful and timely information. This paper analyses the contribution of satellite technology to the specific needs of the maritime sector, stressing the conditions for both adequate technology improvement and effective policy implementation. After analysing the links between maritime activities, space technologies and the institutional environment, the paper identifies some boundary conditions for future developments. The conclusions are essentially a checklist for improving the present situation, and a road map is suggested as a way to proceed.

  5. A School with Solutions: Implementing a Solution-Focused/Adlerian-Based Comprehensive School Counseling Program.

    Science.gov (United States)

    LaFountain, Rebecca M.; Garner, Nadine E.

    This book explains how counselors can integrate the theories of solution focused and Adlerian counseling into a comprehensive developmental counseling curriculum. Following an introduction in Chapter 1, Chapter 2 explains how support needs to be developed among the staff to implement a comprehensive school program. The comprehensive developmental…

  6. A logical approach to semantic interoperability in healthcare.

    Science.gov (United States)

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  7. Clinical data integration model. Core interoperability ontology for research using primary care data.

    Science.gov (United States)

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural / terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, and thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a

  8. Solution-Processed Smart Window Platforms Based on Plasmonic Electrochromics

    KAUST Repository

    Abbas, Sara

    2018-04-30

    Electrochromic smart windows offer a viable route to reducing buildings' energy consumption, which represents about 30% of worldwide energy consumption. Smart windows are far more compelling than current static windows in that they can dynamically modulate the solar spectrum depending on climate and lighting conditions or simply to meet personal preferences. The latest generation of smart windows relies on nominally transparent metal oxide nanocrystal materials whose chromism can be electrochemically controlled using the plasmonic effect. Plasmonic electrochromic materials selectively control the near infrared (NIR) region of the solar spectrum, responsible for solar heat, without affecting the visible transparency. This is in contrast to conventional electrochromic materials, which block both the visible and NIR, and it enables electrochromic devices to reduce the energy consumption of a building or a greenhouse in warm climate regions through enhancements of both visible lighting and heat blocking. Despite this edge, this technology can benefit from important developments, including low-cost solution-based manufacturing on flexible substrates while maintaining durability and coloration efficiency, demonstration of independent control in the NIR and visible spectra, and demonstration of self-powering capabilities. This thesis is focused on developing low-temperature, all-solution-processed plasmonic electrochromic devices and dual-band electrochromic devices. We demonstrate new device fabrication approaches in terms of materials and processes which enhance electrochromic performance all the while maintaining low processing temperatures. Scalable fabrication methods are used to highlight compatibility with high-throughput, continuous roll-to-roll fabrication on flexible substrates. In addition, a dual-band plasmonic electrochromic device was developed by combining the plasmonic layer with a conventional electrochromic ion storage layer. This enables

  9. Bases for DOT exemption uranyl nitrate solution shipments

    International Nuclear Information System (INIS)

    Moyer, R.A.

    1982-07-01

    Uranyl nitrate solutions from a Savannah River Plant reprocessing facility have been transported in cargo tank trailers for more than 20 years without incident during transit. The solution is shipped to Oak Ridge for further processing and returned to SRP in a solid metal form for recycle. This solution, called uranyl nitrate hexahydrate (UNH) solution in Department of Transportation (DOT) regulations, is currently diluted about 2-fold to comply with DOT concentration limits (10% of low specific activity levels) specified for bulk low specific activity (LSA) liquid shipments. Dilution of the process solution increases the number of shipments, the cost of transportation, the cost of shipper preparations, the cost of further reprocessing in the receiving facility to first evaporate the added water, and the total risk to the population along the route of travel. However, the radiological risk remains about the same. Therefore, obtaining an exemption from DOT regulations to permit shipment of undiluted UNH solution, which is normally about two times the present limit, is prudent and more economical. The radiological and nonradiological risks from shipping a unit load of undiluted solution are summarized for the probable route. Data and calculations are presented on a per load or per shipment basis throughout this memorandum to keep it unclassified

  10. Measuring interoperable EHR adoption and maturity: a Canadian example.

    Science.gov (United States)

    Gheorghiu, Bobby; Hagens, Simon

    2016-01-25

    An interoperable electronic health record is a secure consolidated record of an individual's health history and care, designed to facilitate authorized information sharing across the care continuum.  Each Canadian province and territory has implemented such a system, and for all of them, measuring adoption is essential to understanding progress and optimizing use in order to realize intended benefits. About 250,000 health professionals (approximately half of Canada's anticipated potential physician, nurse, pharmacist, and administrative users) indicated that they electronically accessed data, such as those found in provincial/territorial lab or drug information systems, in 2015.  Trends suggest further growth as maturity of use increases. There is strong interest in health information exchange through the iEHR in Canada, and continued growth in adoption is expected. Central to managing the evolution of digital health is access to robust data about who is using solutions, and how, where, and when they are used.  Stakeholders such as government, program leads, and health system administrators must critically assess progress and the achievement of benefits in order to inform future strategic and operational decisions.

  11. Radiolysis of nucleosides in aqueous solutions: base liberation by the base attack mechanism

    International Nuclear Information System (INIS)

    Fujita, S.

    1984-01-01

    On the radiolysis of uridine and some other nucleosides in aqueous solution, a pH-dependent liberation of uracil or the corresponding base was found. e⁻(aq) and HO₂•⁻ gave no freed bases, although many oxidizing radicals, including •OH, Cl₂•⁻, Br₂•⁻, (CNS)₂•⁻ and SO₄•⁻, did cause the release of unaltered bases, depending on the pH of the solutions. The base yields were generally high at pH >= 11, with the exception of SO₄•⁻, which gave a rather high yield of uracil (from uridine) even in the pH region of ... O•⁻, present at high pH as the dissociated form of •OH, may act partly as an oxidizing radical. A plausible mechanism of 3′-radical formation is discussed. (author)

  12. CCSDS SM and C Mission Operations Interoperability Prototype

    Science.gov (United States)

    Lucord, Steven A.

    2010-01-01

    This slide presentation reviews the prototype of the Spacecraft Monitor and Control (SM&C) Operations for interoperability among other space agencies. This particular prototype uses the German Space Agency (DLR) to test the ideas for interagency coordination.

  13. Interoperable Multimedia Annotation and Retrieval for the Tourism Sector

    NARCIS (Netherlands)

    Chatzitoulousis, Antonios; Efraimidis, Pavlos S.; Athanasiadis, I.N.

    2015-01-01

    The Atlas Metadata System (AMS) employs semantic web annotation techniques in order to create an interoperable information annotation and retrieval platform for the tourism sector. AMS adopts state-of-the-art metadata vocabularies, annotation techniques and semantic web technologies.

  14. Patterns in Standards and Technologies for Economic Information Systems Interoperability

    Directory of Open Access Journals (Sweden)

    Vasile Irimia

    2012-06-01

    Full Text Available This paper presents results from a review of the current standards used for collaboration between economic information systems, including web services and service-oriented architecture, EDI, the ebXML framework, the RosettaNet framework, cXML, xCBL, UBL, BPMN, BPEL, WS-CDL, ASN.1, and others. Standards have a key role in promoting economic information system interoperability, and thus enable collaboration. Analyzing the current standards, technologies and applications used for economic information systems interoperability has revealed a common pattern that runs through all of them. From this pattern we construct a basic model of interoperability around which we relate and judge all standards, technologies and applications for economic information systems interoperability.

  15. Radio Interoperability: There Is More to It Than Hardware

    National Research Council Canada - National Science Library

    Hutchins, Susan G; Timmons, Ronald P

    2007-01-01

    Radio Interoperability: The Problem. Superfluous radio transmissions contribute to auditory overload of first responders and obscure the development of an accurate operational picture for all involved; radio spectrum is a limited commodity once...

  16. A Cultural Framework for the Interoperability of C2 Systems

    National Research Council Canada - National Science Library

    Slay, Jill

    2002-01-01

    In considering some of the difficulties experienced in coalition operations, it becomes apparent that attention is needed in establishing a cultural framework for the interoperability of personnel (the human agents...

  17. GEOSS interoperability for Weather, Ocean and Water

    Science.gov (United States)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

    "Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate improvements in the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. In general the analysis has been favourable in terms of

  18. Segment-based Eyring-Wilson viscosity model for polymer solutions

    International Nuclear Information System (INIS)

    Sadeghi, Rahmat

    2005-01-01

    A theory-based model is presented for correlating the viscosity of polymer solutions, built on the segment-based Eyring mixture viscosity model together with the segment-based Wilson model for describing deviations from ideality. The model has been applied to several polymer solutions, and the results show that it is reliable for both correlation and prediction of the viscosity of polymer solutions at different polymer molar masses and temperatures
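As a hedged sketch of the general shape of such models, the classical mole-fraction form of the Eyring mixture viscosity relation is shown below; the paper's segment-based variant substitutes segment fractions and supplies the excess term from a segment-based Wilson model, so this is an illustrative standard form rather than the authors' exact equation.

```latex
% Classical Eyring mixture viscosity relation (mole-fraction form)
\ln(\eta V) = \sum_i x_i \ln(\eta_i V_i) + \frac{\Delta G^{*E}}{RT}
```

Here $\eta$ and $V$ are the mixture viscosity and molar volume, $\eta_i$ and $V_i$ the pure-component values, and $\Delta G^{*E}$ the excess activation energy of flow, which the nonideality model (Wilson, in this case) provides.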

  19. Interoperability, Enterprise Architectures, and IT Governance in Government

    OpenAIRE

    Scholl , Hans ,; Kubicek , Herbert; Cimander , Ralf

    2011-01-01

    Part 4: Architecture, Security and Interoperability; International audience; Government represents a unique, and also uniquely complex, environment for interoperation of information systems as well as for integration of workflows and processes across governmental levels and branches. While private-sector organizations by and large have the capacity to implement “enterprise architectures” in a relatively straightforward fashion, for notable reasons governments do not enjoy such luxury. For thi...

  20. A novel wound rinsing solution based on nano colloidal silver

    Directory of Open Access Journals (Sweden)

    Soheila Kordestani

    2014-10-01

    Full Text Available Objective(s): The present study aimed to investigate the antiseptic properties of a colloidal nano silver wound rinsing solution to inhibit a wide range of pathogens, including bacteria, viruses and fungi, present in chronic and acute wounds. Materials and Methods: The wound rinsing solution, named SilvoSept®, was prepared using a colloidal nano silver suspension. Physicochemical properties, effectiveness against microorganisms including Staphylococcus aureus ATCC 6538P, Pseudomonas aeruginosa ATCC 9027, Escherichia coli ATCC 8739, Candida albicans ATCC 10231, Aspergillus niger ATCC 16404, MRSA, Mycobacterium spp., HSV-1 and H1N1, and biocompatibility tests were carried out according to relevant standards. Results: An X-ray diffraction (XRD) scan was performed on the sample and verified a single phase of silver particles in the compound. The size of the silver particles in the solution, measured by the dynamic light scattering (DLS) technique, ranged from 80 to 90 nm. Transmission electron microscopy (TEM) revealed the spherical shape and smooth surface of the silver nanoparticles. SilvoSept® reduced the initial count of 10⁷ CFU/mL by 5 log for Staphylococcus aureus ATCC 6538P, Pseudomonas aeruginosa ATCC 9027, Escherichia coli ATCC 8739, Candida albicans ATCC 10231, Aspergillus niger ATCC 16404, MRSA, and Mycobacterium spp. Further assessments of the SilvoSept solution exhibited a significant inhibition of the replication of HSV-1 and H1N1. The biocompatibility studies showed that the solution was non-allergic, non-irritant and non-cytotoxic. Conclusion: Findings of the present study showed that SilvoSept® wound rinsing solution containing nano silver particles is an effective antiseptic solution against a wide spectrum of microorganisms. This compound can be a suitable candidate for wound irrigation.
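The reported efficacy can be read as simple log-reduction arithmetic; the sketch below uses the counts stated in the abstract.

```python
# A "5 log reduction" divides the viable count by 10**5.
initial = 10**7                         # CFU/mL, initial count reported above
log_reduction = 5
surviving = initial / 10**log_reduction
print(surviving)                        # 100.0 CFU/mL remaining
```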

  1. Regulatory barriers blocking standardization of interoperability.

    Science.gov (United States)

    Zhong, Daidi; Kirwan, Michael J; Duan, Xiaolian

    2013-07-12

    Developing and implementing a set of personal health device interoperability standards is key to cultivating a healthy global industry ecosystem. Standardization organizations, including the Institute of Electrical and Electronics Engineers 11073 Personal Health Device Workgroup (IEEE 11073-PHD WG) and the Continua Health Alliance, are striving toward this purpose. However, factors like medical device regulation, health policy, and market reality have placed non-technical barriers over the adoption of technical standards throughout the industry. These barriers have significantly impaired the motivation of consumer device vendors who desire to enter the personal health market, and the overall success of the personal health industry ecosystem. In this paper, we present the effect that these barriers have had on the health ecosystem. This requires immediate action from policy makers and other stakeholders. The current regulatory policy needs to be updated to reflect the reality and demands of the consumer health industry. Our hope is that this paper will draw wide consensus amongst its readers, policy makers, and other stakeholders.

  2. The advanced microgrid. Integration and interoperability

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Ward Isaac [Ward Bower Innovations, LLC, Albuquerque, NM (United States); Ton, Dan T. [U.S. Dept. of Energy, Washington, DC (United States); Guttromson, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Glover, Steven F [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stamp, Jason Edwin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bhatnagar, Dhruv [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reilly, Jim [Reily Associates, Pittston, PA (United States)

    2014-02-01

    This white paper focuses on "advanced microgrids," but sections do, out of necessity, reference today's commercially available systems and installations in order to clearly distinguish the differences and advances. Advanced microgrids have been identified as a necessary part of the modern electrical grid through two DOE microgrid workshops, the National Institute of Standards and Technology Smart Grid Interoperability Panel, and other related sources. With their grid-interconnectivity advantages, advanced microgrids will improve system energy efficiency and reliability and provide enabling technologies for grid-independence to end-user sites. One popular definition that has evolved and is used in multiple references is that a microgrid is a group of interconnected loads and distributed-energy resources within clearly defined electrical boundaries that acts as a single controllable entity with respect to the grid. A microgrid can connect to and disconnect from the grid, enabling it to operate in either grid-connected or island mode. Further, an advanced microgrid can then be loosely defined as a dynamic microgrid.

  3. AliEn - EDG Interoperability in ALICE

    CERN Document Server

    Bagnasco, S; Buncic, P; Carminati, F; Cerello, P G; Saiz, P

    2003-01-01

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Storage Element and Computing Element), and act as interface nodes between the systems. An EDG Resource Broker is seen by the AliEn server as a single Computing Element, while the EDG storage is seen by AliEn as a single, large Storage Element; files produced in EDG sites are registered in both the EDG Replica Catalogue and in the AliEn Data Catalogue, thus ensuring accessibility from both worlds. In fact, both registrations are required: the AliEn one is used for the data management, the EDG one to guarantee the integrity and...

  4. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    Science.gov (United States)

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, like the dual model methodology, as both are based on the precise modeling of clinical information. In this paper, we demonstrate how we can apply the dual model methodology to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.

  5. Ocean Data Interoperability Platform (ODIP): using regional data systems for global ocean research

    Science.gov (United States)

    Schaap, D.; Thijsse, P.; Glaves, H.

    2017-12-01

    Ocean acidification, loss of coral reefs, and sustainable exploitation of the marine environment are just a few of the challenges researchers around the world are currently attempting to understand and address. However, studies of these ecosystem-level challenges are impossible unless researchers can discover and re-use the large volumes of interoperable multidisciplinary data that are currently only accessible through regional and global data systems that serve discrete, and often discipline-specific, user communities. The plethora of marine data systems currently in existence also use different standards, technologies and best practices, making re-use of the data problematic for those engaged in interdisciplinary marine research. The Ocean Data Interoperability Platform (ODIP) is responding to this growing demand for discoverable, accessible and reusable data by establishing the foundations for a common global framework for marine data management. But creation of such an infrastructure is a major undertaking, and one that needs to be achieved in part by establishing different levels of interoperability across existing regional and global marine e-infrastructures. Workshops organised by ODIP II facilitate dialogue between selected regional and global marine data systems in an effort to identify potential solutions that integrate these marine e-infrastructures. The outcomes of these discussions have formed the basis for a number of prototype development tasks that aim to demonstrate effective sharing of data across multiple data systems, and allow users to access data from more than one system through a single access point. The ODIP II project is currently developing four prototype solutions that are establishing interoperability between selected regional marine data management infrastructures in Europe, the USA, Canada and Australia, and with the global POGO, IODE Ocean Data Portal (ODP) and GEOSS systems.
The potential impact of implementing these solutions for

  6. LED-based Photometric Stereo: Modeling, Calibration and Numerical Solutions

    DEFF Research Database (Denmark)

    Quéau, Yvain; Durix, Bastien; Wu, Tao

    2018-01-01

    We conduct a thorough study of photometric stereo under nearby point light source illumination, from modeling to numerical solution, through calibration. In the classical formulation of photometric stereo, the luminous fluxes are assumed to be directional, which is very difficult to achieve in pr...

  7. A cellular-based solution for radio communications in MOUT

    NARCIS (Netherlands)

    Overduin, R.

    2005-01-01

    A short-term and potentially cost-effective solution is proposed for tactical radio communications in Military Operations in Urban Terrain (MOUT) for the Royal Netherlands Army (RNLA). Measurements and computer simulations presented show that on average, outdoor ranges in MOUT as attainable with

  8. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  9. An ontological system for interoperable spatial generalisation in biodiversity monitoring

    Science.gov (United States)

    Nieland, Simon; Moran, Niklas; Kleinschmit, Birgit; Förster, Michael

    2015-11-01

    Semantic heterogeneity remains a barrier to data comparability and standardisation of results in different fields of spatial research. Because of its thematic complexity, differing acquisition methods and national nomenclatures, interoperability of biodiversity monitoring information is especially difficult. Since data collection methods and interpretation manuals broadly vary there is a need for automatised, objective methodologies for the generation of comparable data-sets. Ontology-based applications offer vast opportunities in data management and standardisation. This study examines two data-sets of protected heathlands in Germany and Belgium which are based on remote sensing image classification and semantically formalised in an OWL2 ontology. The proposed methodology uses semantic relations of the two data-sets, which are (semi-)automatically derived from remote sensing imagery, to generate objective and comparable information about the status of protected areas by utilising kernel-based spatial reclassification. This automatised method suggests a generalisation approach, which is able to generate delineation of Special Areas of Conservation (SAC) of the European biodiversity Natura 2000 network. Furthermore, it is able to transfer generalisation rules between areas surveyed with varying acquisition methods in different countries by taking into account automated inference of the underlying semantics. The generalisation results were compared with the manual delineation of terrestrial monitoring. For the different habitats in the two sites an accuracy of above 70% was detected. However, it has to be highlighted that the delineation of the ground-truth data inherits a high degree of uncertainty, which is discussed in this study.
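The kernel-based spatial reclassification mentioned above can be illustrated with a minimal sketch: each cell of a habitat raster is re-labelled with the majority class inside a square kernel, smoothing pixel-level classes into delineable units. The kernel size, class codes, and majority rule are illustrative assumptions, not the study's exact generalisation rules.

```python
import numpy as np

def majority_reclassify(raster, k=3):
    """Re-label each cell with the majority class in a k-by-k kernel."""
    pad = k // 2
    padded = np.pad(raster, pad, mode="edge")  # repeat edge values at borders
    out = np.empty_like(raster)
    for i in range(raster.shape[0]):
        for j in range(raster.shape[1]):
            window = padded[i:i + k, j:j + k]
            vals, counts = np.unique(window, return_counts=True)
            out[i, j] = vals[np.argmax(counts)]  # majority class in the window
    return out

# Toy raster: class 2 noise pixel at the centre of a class-1 region is smoothed away.
raster = np.array([[1, 1, 2],
                   [1, 2, 2],
                   [1, 1, 2]])
print(majority_reclassify(raster))
```

A real workflow would apply such a filter to the remote-sensing classification before deriving SAC delineations, with the kernel driven by the ontology's semantic relations rather than a fixed majority rule.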

  10. A Solution Generator Algorithm for Decision Making based Automated Negotiation in the Construction Domain

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2017-12-01

    Full Text Available In this paper, we present our work-in-progress on a proposed framework for automated negotiation in the construction domain. The proposed framework enables software agents to conduct negotiations and autonomously make value-based decisions. The framework consists of three main components: a solution generator algorithm, a negotiation algorithm, and a conflict resolution algorithm. This paper extends the discussion on the solution generator algorithm, which enables software agents to generate solutions and rank them from 1st to nth for the negotiation stage of the operation. The solution generator algorithm consists of three steps: review solutions, rank solutions, and form ranked solutions. For validation purposes, we present a scenario that utilizes the proposed algorithm to rank solutions. The validation shows that the algorithm is promising; however, it also highlights the conflict between different parties that needs further negotiation action.
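The three steps of the solution generator (review, rank, form ranked solutions) can be sketched as follows. The feasibility predicate, the value-scoring function, and the sample data are illustrative assumptions, not the authors' implementation.

```python
def review_solutions(candidates, is_feasible):
    """Step 1: keep only candidate solutions that pass the feasibility review."""
    return [s for s in candidates if is_feasible(s)]

def rank_solutions(solutions, value):
    """Step 2: order solutions from 1st to nth by descending value score."""
    return sorted(solutions, key=value, reverse=True)

def form_ranked_solutions(ranked):
    """Step 3: attach 1-based ranks to form the list handed to negotiation."""
    return [(rank, s) for rank, s in enumerate(ranked, start=1)]

# Illustrative candidate solutions scored by (quality - cost) for one party.
candidates = [
    {"id": "A", "cost": 5, "quality": 9},
    {"id": "B", "cost": 8, "quality": 7},
    {"id": "C", "cost": 3, "quality": 6},
]
feasible = review_solutions(candidates, lambda s: s["cost"] <= 7)
ranked = form_ranked_solutions(
    rank_solutions(feasible, lambda s: s["quality"] - s["cost"]))
# B fails the feasibility review; A (score 4) ranks ahead of C (score 3).
```

In the full framework, each negotiating agent would apply its own value function here, which is precisely where the inter-party conflict that the paper highlights arises.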

  11. A Service Oriented Architecture Approach to Achieve Interoperability between Immunization Information Systems in Iran.

    Science.gov (United States)

    Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E

    2014-01-01

    Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) is an essential factor in utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were accordant with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype to achieve interoperability of immunization information.

  12. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  13. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    Science.gov (United States)

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking capabilities. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. However, currently, IoT devices created by different manufacturers follow different proprietary protocols and are locked in many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research aims at proposing an interoperable solution called tasking capability description that allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, which is a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
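The core idea of a tasking capability description — one declared schema per device, validated by a uniform interface — can be sketched as below. The description format, device name, and validation logic are illustrative assumptions, not the OGC SensorThings tasking schema.

```python
# A hypothetical tasking capability description for one device: it declares
# which task parameters the device accepts and their legal values.
capability = {
    "device": "lamp-01",
    "task": "switch",
    "parameters": {"state": {"type": "enum", "values": ["on", "off"]}},
}

def validate_task(capability, request):
    """Check a task request against the device's declared parameters."""
    for name, value in request.items():
        spec = capability["parameters"].get(name)
        if spec is None:
            raise ValueError(f"unknown parameter: {name}")
        if spec["type"] == "enum" and value not in spec["values"]:
            raise ValueError(f"invalid value for {name}: {value}")
    return True

assert validate_task(capability, {"state": "on"})
```

A uniform web service would accept such requests for any device, validate them against the device's published description, and translate them into the device's proprietary protocol behind the interface.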

  14. Solution or suspension - Does it matter for lipid based systems?

    DEFF Research Database (Denmark)

    Larsen, A T; Holm, R; Müllertz, A

    2017-01-01

    In this study, the potential of co-administering an aqueous suspension with a placebo lipid vehicle, i.e. chase dosing, was investigated in rats relative to the aqueous suspension alone or a solution of the drug in the lipid vehicle. The lipid investigated in the present study was Labrafil M2125CS... and three poorly soluble model compounds were evaluated: danazol, cinnarizine and halofantrine. For cinnarizine and danazol, the oral bioavailability in rats after chase dosing or dosing the compound dissolved in Labrafil M2125CS was similar and significantly higher than for the aqueous suspension.... For halofantrine, the chase-dosed group had a tendency towards a lower bioavailability relative to the Labrafil M2125CS solution, but still a significantly higher bioavailability relative to the aqueous suspension. This could be due to factors such as a slower dissolution rate in the intestinal phase of halofantrine...

  15. A Survey on Smartphone-Based Crowdsensing Solutions

    OpenAIRE

    Zamora-Mero, Willian Jesus; Tavares De Araujo Cesariny Calafate, Carlos Miguel; Cano Escribá, Juan Carlos; Manzoni, Pietro

    2016-01-01

    © 2016 Willian Zamora et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. In recent years, the widespread adoption of mobile phones, combined with the ever-increasing number of sensors that smartphones are equipped with, has greatly simplified the generalized adoption of crowdsensing solutions by reducing hardware requirements...

  16. A New IMS Based Inter-working Solution

    Science.gov (United States)

    Zhu, Zhongwen; Brunner, Richard

    With the evolution of the third generation network, more and more multimedia services are developed and deployed. Any new service to be deployed in an IMS network is required to inter-work with existing Internet communities or legacy terminal users in order to be appreciated by the end users, who are the main drivers for the service to succeed. The challenge for inter-working between IMS (IP Multimedia Subsystem) and non-IMS networks is "how to handle the recipient's address". This is because each network has its own routable address schema. For instance, the address for a Google Talk user is xmpp:xyz@google.com, which is un-routable in an IMS network. Hereafter a new Inter-working (IW) solution between IMS and non-IMS networks is proposed for multimedia services that include Instant Messaging, Chat, and File transfer, etc. It is an end-to-end solution built on the IMS infrastructure. The Public Service Identity (PSI) defined in the 3GPP (3rd Generation Partnership Project) standard is used to allow terminal clients to locate this IW service. When sending the SIP (Session Initiation Protocol) request out for multimedia services, the terminal includes the recipient's address in the payload instead of the "Request-URI" header. In the network, the proposed solution provides the mapping rules between the different networks in the MM-IW (Multimedia IW). The detailed technical description and the corresponding use cases are presented. A comparison with other alternatives is made. The benefits of the proposed solution are highlighted.

  17. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Science.gov (United States)

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this is paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  18. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Directory of Open Access Journals (Sweden)

    Cristina Soguero-Ruiz

    2018-03-01

    Full Text Available Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain.

  19. An Interoperable System toward Cardiac Risk Stratification from ECG Monitoring

    Science.gov (United States)

    Mora-Jiménez, Inmaculada; Ramos-López, Javier; Quintanilla Fernández, Teresa; García-García, Antonio; Díez-Mazuela, Daniel; García-Alberola, Arcadi

    2018-01-01

    Many indices have been proposed for cardiovascular risk stratification from electrocardiogram signal processing, still with limited use in clinical practice. We created a system integrating the clinical definition of cardiac risk subdomains from ECGs and the use of diverse signal processing techniques. Three subdomains were defined from the joint analysis of the technical and clinical viewpoints. One subdomain was devoted to demographic and clinical data. The other two subdomains were intended to obtain widely defined risk indices from ECG monitoring: a simple-domain (heart rate turbulence (HRT)), and a complex-domain (heart rate variability (HRV)). Data provided by the three subdomains allowed for the generation of alerts with different intensity and nature, as well as for the grouping and scrutinization of patients according to the established processing and risk-thresholding criteria. The implemented system was tested by connecting data from real-world in-hospital electronic health records and ECG monitoring by considering standards for syntactic (HL7 messages) and semantic interoperability (archetypes based on CEN/ISO EN13606 and SNOMED-CT). The system was able to provide risk indices and to generate alerts in the health records to support decision-making. Overall, the system allows for the agile interaction of research and clinical practice in the Holter-ECG-based cardiac risk domain. PMID:29494497
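
    To give a flavor of the "complex-domain" HRV indices mentioned above, here is a minimal sketch of two standard time-domain measures computed from RR intervals. The RR series is made up, and real risk stratification uses far richer processing than this.

```python
# Illustrative computation of two simple time-domain HRV indices
# (SDNN and RMSSD) from a series of RR intervals in milliseconds.
import math

def sdnn(rr: list[float]) -> float:
    """Standard deviation of normal-to-normal RR intervals."""
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / len(rr))

def rmssd(rr: list[float]) -> float:
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr_ms = [812, 790, 805, 830, 799, 820, 811]  # toy RR series
print(round(sdnn(rr_ms), 1), round(rmssd(rr_ms), 1))
```

    Indices like these would feed the risk-thresholding criteria that drive the system's alerts.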

  20. Advances in a Distributed Approach for Ocean Model Data Interoperability

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2014-03-01

    Full Text Available An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.

  1. Advances in a distributed approach for ocean model data interoperability

    Science.gov (United States)

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
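
    The aggregation technique mentioned above can be illustrated conceptually: several output files each carry a slice of the time axis, and an aggregation layer presents them as a single dataset. Real deployments do this with NcML/THREDDS over NetCDF files; the plain dictionaries below are illustrative stand-ins for files.

```python
# Conceptual sketch of "aggregation along time": two model output
# 'files' (dicts standing in for NetCDF files) each hold a chunk of
# a time series, and aggregate() exposes them as one sorted series.

file_a = {"time": [0, 6, 12], "sea_surface_height": [0.10, 0.12, 0.09]}
file_b = {"time": [18, 24], "sea_surface_height": [0.07, 0.11]}

def aggregate(files, var):
    """Concatenate (time, value) records from many 'files', sorted by time."""
    records = []
    for f in files:
        records.extend(zip(f["time"], f[var]))
    records.sort(key=lambda r: r[0])
    times, values = zip(*records)
    return list(times), list(values)

times, ssh = aggregate([file_b, file_a], "sea_surface_height")
print(times)  # [0, 6, 12, 18, 24]
```

    The value of the real technique is that clients see one CF-compliant virtual dataset regardless of how the provider split the files.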

  2. Making Interoperability Easier with the NASA Metadata Management Tool

    Science.gov (United States)

    Shum, D.; Reese, M.; Pilone, D.; Mitchell, A. E.

    2016-12-01

    ISO 19115 has enabled interoperability amongst tools, yet many users find it hard to build ISO metadata for their collections because the standard can be large and overly flexible for their needs. The Metadata Management Tool (MMT), part of NASA's Earth Observing System Data and Information System (EOSDIS), offers users a modern, easy-to-use, browser-based tool to develop ISO-compliant metadata. Through a simplified UI experience, metadata curators can create and edit collections without any understanding of the complex ISO 19115 format, while still generating compliant metadata. The MMT is also able to assess the completeness of collection-level metadata by evaluating it against a variety of metadata standards. The tool provides users with clear guidance on how to change their metadata in order to improve its quality and compliance. It is based on NASA's Unified Metadata Model for Collections (UMM-C), a simpler metadata model that can be cleanly mapped to ISO 19115. This allows metadata authors and curators to meet ISO compliance requirements faster and more accurately. The MMT and UMM-C have been developed in an agile fashion, with recurring end-user tests and reviews to continually refine the tool, the model and the ISO mappings. This process allows for continual improvement and evolution to meet the community's needs.
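
    The completeness assessment described above can be illustrated with a toy check of a collection record against a list of required fields. The field names below are invented stand-ins, not the actual UMM-C schema.

```python
# Toy version of a metadata completeness check: score a collection
# record against required fields and report what is missing.
# REQUIRED_FIELDS is illustrative, not the real UMM-C field list.

REQUIRED_FIELDS = ["ShortName", "Version", "Abstract",
                   "TemporalExtent", "SpatialExtent"]

def assess_completeness(record: dict) -> tuple[float, list[str]]:
    missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
    score = 1 - len(missing) / len(REQUIRED_FIELDS)
    return score, missing

record = {"ShortName": "MOD09", "Version": "6.1", "Abstract": "..."}
score, missing = assess_completeness(record)
print(f"{score:.0%} complete, missing: {missing}")
```

    A real tool layers many such rules per standard and turns the "missing" list into actionable guidance for the curator.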

  3. Interoperability challenges for the Sustainable Management of seagrass meadows (Invited)

    Science.gov (United States)

    Nativi, S.; Pastres, R.; Bigagli, L.; Venier, C.; Zucchetta, M.; Santoro, M.

    2013-12-01

    Seagrass meadows (formed by marine angiosperm plants) occupy less than 0.2% of the global ocean surface, yet annually store about 10-18% of the so-called 'Blue Carbon', i.e. the carbon stored in coastal vegetated areas. Recent literature estimates that the flux to the long-term carbon sink in seagrasses represents 10-20% of seagrasses' global average production. Such figures can be translated into economic benefits, taking into account that a ton of carbon dioxide in Europe trades at around 15 € in the carbon market. This means that the organic carbon retained in seagrass sediments in the Mediterranean is worth 138 - 1128 billion €, which represents 6-23 € per square meter. This is 9-35 times more than one square meter of tropical forest soil (0.66 € per square meter), or 5-17 times when considering both the aboveground and belowground compartments of tropical forests. According to the most conservative estimates, about 10% of the Mediterranean meadows have been lost during the last century. In the framework of the GEOSS (Global Earth Observation System of Systems) initiative, the MEDINA project (funded by the European Commission and coordinated by Ca' Foscari University of Venice) prepared a showcase as part of the GEOSS Architecture Interoperability Pilot, Phase 6 (AIP-6). This showcase aims at providing a tool for the sustainable management of seagrass meadows along the Mediterranean coastline. The application is based on an interoperability framework providing a set of brokerage services to easily ingest and run a Habitat Suitability model (a model predicting the probability that a given site provides a suitable habitat for the development of a seagrass meadow, and the expected average coverage). The presentation discusses this framework, explaining how the input data is discovered, accessed and processed to feed the model (developed in the MEDINA project). Furthermore, the brokerage framework provides the necessary services to run the model and visualize the results.
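
    A habitat suitability model of the kind the showcase ingests can be sketched as a logistic function mapping environmental predictors to a probability of suitability. The predictors and coefficients below are invented for illustration, not the fitted MEDINA model parameters.

```python
# Minimal illustration of a habitat suitability model: a logistic
# function maps environmental predictors to the probability that a
# site can host a seagrass meadow. All coefficients are hypothetical.
import math

def suitability(depth_m: float, light_frac: float, salinity_psu: float) -> float:
    """Probability in [0, 1] that a site is suitable."""
    # Hypothetical linear predictor: shallower, brighter, saltier = better.
    z = -2.0 - 0.15 * depth_m + 4.0 * light_frac + 0.05 * salinity_psu
    return 1 / (1 + math.exp(-z))

p = suitability(depth_m=5.0, light_frac=0.6, salinity_psu=38.0)
print(round(p, 2))
```

    The brokerage framework's job is precisely to discover and transform heterogeneous input layers (bathymetry, light, salinity) into the form such a model expects.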

  4. Lactated Ringer-based storage solutions are equally well suited for the storage of fresh osteochondral allografts as cell culture medium-based storage solutions.

    Science.gov (United States)

    Harb, Afif; von Horn, Alexander; Gocalek, Kornelia; Schäck, Luisa Marilena; Clausen, Jan; Krettek, Christian; Noack, Sandra; Neunaber, Claudia

    2017-07-01

    Due to the rising interest in Europe in treating large cartilage defects with osteochondral allografts, research aims to find a suitable solution for the long-term storage of osteochondral allografts. This is further encouraged by the fact that legal restrictions currently limit the use of ingredients from animal or human sources that are being used in other regions of the world (e.g. in the USA). Therefore, the aims of this study were A) to analyze whether a Lactated Ringer (LR)-based solution is as efficient as Dulbecco's modified Eagle's minimal essential medium (DMEM) in maintaining chondrocyte viability and B) to determine at which storage temperature (4°C vs. 37°C) chondrocyte survival of the osteochondral allograft is optimally sustained. 300 cartilage grafts were collected from the knees of ten one-year-old Black Head German Sheep. The grafts were stored in four different storage solutions (one of them DMEM-based, the other three based on Lactated Ringer solution), at two different temperatures (4 and 37°C), for 14 and 56 days. At both points in time, chondrocyte survival as well as death rate, glycosaminoglycan (GAG) content, and hydroxyproline (HP) concentration were measured and compared between the grafts stored in the different solutions and at the different temperatures. Independent of the storage solutions tested, chondrocyte survival rates were higher when stored at 4°C compared to storage at 37°C, both after short-term (14 days) and long-term storage (56 days). At no point in time did the DMEM-based solution show superior chondrocyte survival compared to the Lactated Ringer-based solutions. GAG and HP content were comparable across all time points, temperatures and solutions. LR-based solutions that contain only substances approved in Germany may be just as efficient for storing grafts as the DMEM-based gold standard used in the USA. Moreover, in the present experiment, storage of osteochondral allografts at 4°C was superior to storage at 37°C. Copyright © 2017

  5. The development of a prototype level-three interoperable catalog system

    Science.gov (United States)

    Hood, Carroll A.; Howie, Randy; Verhanovitz, Rich

    1993-08-01

    The development of a level-three interoperable catalog system is defined by a new paradigm for metadata access. The old paradigm is characterized by a hierarchy of metadata layers, the transfer of control to target systems, and the requirement for the user to be familiar with the syntax and data dictionaries of several catalog system elements. Attributes of the new paradigm are exactly orthogonal: the directory and inventories are peer entities, there is a single user interface, and the system manages the complexity of interacting transparently with remote elements. We have designed and implemented a prototype level-three interoperable catalog system based on the new paradigm. Through a single intelligent interface, users can interoperably access a master directory, inventories for selected satellite datasets, and an in situ meteorological dataset inventory. This paper describes the development of the prototype system and three of the formidable challenges that were addressed in the process. The first involved the interoperable integration of satellite and in situ inventories, which to our knowledge, has never been operationally demonstrated. The second was the development of a search strategy for orbital and suborbital granules which preserves the capability to identify temporally or spatially coincident subsets between them. The third involved establishing a method of incorporating inventory-specific search criteria into user queries. We are working closely with selected science data users to obtain feedback on the system's design and performance. The lessons learned from this prototype will help direct future development efforts. Distributed data systems of the 1990s such as EOSDIS and the Global Change Data and Information System (GCDIS) will be able to build on this prototype.
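
    The coincident-subset search described above reduces, at its core, to testing whether a granule's time range and bounding box contain an in situ observation. The sketch below uses a hypothetical data layout to show the idea; a production system adds orbit geometry, indexing, and inventory-specific criteria.

```python
# Sketch of a temporal/spatial coincidence test between satellite
# granules and an in situ observation. Granule and observation
# structures are hypothetical.

def overlaps(granule, obs, time_pad=0.0):
    """True if obs falls inside the granule's (padded) time range and bbox."""
    t0, t1 = granule["t0"] - time_pad, granule["t1"] + time_pad
    in_time = t0 <= obs["t"] <= t1
    w, s, e, n = granule["bbox"]  # west, south, east, north
    in_space = w <= obs["lon"] <= e and s <= obs["lat"] <= n
    return in_time and in_space

granules = [
    {"id": "G1", "t0": 0, "t1": 10, "bbox": (-80, 20, -60, 40)},
    {"id": "G2", "t0": 12, "t1": 22, "bbox": (-80, 20, -60, 40)},
]
obs = {"t": 15, "lon": -70.0, "lat": 30.0}
hits = [g["id"] for g in granules if overlaps(g, obs)]
print(hits)  # ['G2']
```

    The `time_pad` parameter hints at why this is hard in practice: "coincident" for a moving satellite and a fixed meteorological station is a matter of tolerances, not exact equality.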

  6. Solution of wave-like equation based on Haar wavelet

    Directory of Open Access Journals (Sweden)

    Naresh Berwal

    2012-11-01

    Full Text Available Wavelet transforms and wavelet analysis are powerful mathematical tools for many problems, and wavelets can also be applied in numerical analysis. In this paper, we apply the Haar wavelet method to solve a wave-like equation with known initial and boundary conditions. The fundamental idea of the Haar wavelet method is to convert the differential equation into a system of algebraic equations involving a finite number of variables. The results and graphs show that the proposed approach gives reasonable results when compared to the exact solution.
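
    The reduction to algebraic equations rests on the discrete Haar basis evaluated at collocation points. The sketch below builds the m × m Haar matrix (a standard construction, not code from the paper); the mutual orthogonality of its rows is what turns coefficient recovery into simple algebra.

```python
# Build the discrete Haar basis matrix H of order m = 2^J evaluated
# at midpoint collocation points on [0, 1). Rows of H are mutually
# orthogonal, so expansion coefficients follow by simple algebra.

def haar(x: float, i: int) -> float:
    """i-th (0-indexed) Haar function on [0, 1)."""
    if i == 0:
        return 1.0  # scaling function
    # Decompose i = 2^j + k, k = 0..2^j - 1 (dilation j, translation k).
    j = i.bit_length() - 1
    k = i - (1 << j)
    lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
    if lo <= x < mid:
        return 1.0
    if mid <= x < hi:
        return -1.0
    return 0.0

m = 8  # number of collocation points (a power of two)
xs = [(l + 0.5) / m for l in range(m)]            # collocation points
H = [[haar(x, i) for x in xs] for i in range(m)]  # m x m Haar matrix
# Rows are mutually orthogonal: H H^T is diagonal.
row0_dot_row1 = sum(a * b for a, b in zip(H[0], H[1]))
print(row0_dot_row1)  # 0.0
```

    In the full method, integrals of these functions give operational matrices, and substituting the Haar expansion into the wave-like equation at the collocation points yields the algebraic system the abstract refers to.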

  7. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2011-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole-education-institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – the methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of the deliverables of large-scale EU-funded interoperability projects. System analysis and comparative analysis methods were used in order to formulate and analyse the systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarise the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  8. Interoperability Guidelines for Lithuanian E-Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Eugenijus Kurilovas

    2013-08-01

    Full Text Available Purpose – the paper aims to analyse e-learning content and repositories along with the problems of learning organisation interoperability. The main objective of the paper is to analyse scientific research results and the newest international experience in the area and to provide interoperability guidelines and recommendations for the implementation of appropriate Lithuanian state programmes. The learning content and repositories recommendations are designed for the implementation of the Lithuanian education portal project as well as the Lithuanian Virtual University (LVU) programme's information services' (LABT / eLABa) and e-learning services' (LieDM) sub-programmes. The whole-education-institution recommendations are designed for the maintenance and development of the LVU programme's management services' (LieMSIS) system. Design/methodology/approach – the methods used for the general analysis of the proposed interoperability guidelines (recommendations) were bibliographic research and comparative analysis of Lithuanian and foreign scientific works published in periodicals and of the deliverables of large-scale EU-funded interoperability projects. System analysis and comparative analysis methods were used in order to formulate and analyse the systems' interoperability guidelines and recommendations. The author employed the experimental research method while working in the appropriate EU-funded interoperability projects to form the guidelines (recommendations). In order to summarise the results, the evaluative research method was used. Findings – the international guidelines and recommendations presented in the paper could be suitable for implementation while developing Lithuanian state education information systems such as the Lithuanian education portal, the Lithuanian academic libraries' (eLABa) system, the Lithuanian distance learning system (LieDM), and the Lithuanian universities' management system (LieMSIS). Research limitations/implications – the paper

  9. An isotherm-based thermodynamic model of multicomponent aqueous solutions, applicable over the entire concentration range.

    Science.gov (United States)

    Dutcher, Cari S; Ge, Xinlei; Wexler, Anthony S; Clegg, Simon L

    2013-04-18

    In previous studies (Dutcher et al. J. Phys. Chem. C 2011, 115, 16474-16487; 2012, 116, 1850-1864), we derived equations for the Gibbs energy, solvent and solute activities, and solute concentrations in multicomponent liquid mixtures, based upon expressions for adsorption isotherms that include arbitrary numbers of hydration layers on each solute. In this work, the long-range electrostatic interactions that dominate in dilute solutions are added to the Gibbs energy expression, thus extending the range of concentrations for which the model can be used from pure liquid solute(s) to infinite dilution in the solvent, water. An equation for the conversion of the reference state for solute activity coefficients to infinite dilution in water has been derived. A number of simplifications are identified, notably the equivalence of the sorption site parameters r and the stoichiometric coefficients of the solutes, resulting in a reduction in the number of model parameters. Solute concentrations in mixtures conform to a modified Zdanovskii-Stokes-Robinson mixing rule, and solute activity coefficients to a modified McKay-Perring relation, when the effects of the long-range (Debye-Hückel) term in the equations are taken into account. Practical applications of the equations to osmotic and activity coefficients of pure aqueous electrolyte solutions and mixtures show both satisfactory accuracy from low to high concentrations, together with a thermodynamically reasonable extrapolation (beyond the range of measurements) to extreme concentration and to the pure liquid solute(s).
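
    In the dilute limit, the added long-range electrostatic term reduces to the familiar Debye-Hückel limiting law, which is easy to evaluate numerically. The sketch below uses the conventional value A ≈ 0.509 (log10 basis, water at 25 °C) for a 1:1 electrolyte; it illustrates the limiting law only, not the full isotherm-based model of the paper.

```python
# Debye-Hückel limiting law for the mean ionic activity coefficient:
# log10(gamma±) = -A * |z+ z-| * sqrt(I), valid at low ionic strength.
import math

A = 0.509  # Debye-Hückel constant for water at 25 °C (log10 basis)

def mean_activity_coeff(z_plus: int, z_minus: int, ionic_strength: float) -> float:
    """Limiting-law mean ionic activity coefficient."""
    log10_gamma = -A * abs(z_plus * z_minus) * math.sqrt(ionic_strength)
    return 10 ** log10_gamma

# 0.01 mol/kg NaCl: ionic strength I = 0.01
print(round(mean_activity_coeff(1, -1, 0.01), 3))
```

    The paper's contribution is precisely to join a term of this kind smoothly onto the isotherm-based Gibbs energy so the model spans from infinite dilution to the pure liquid solute.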

  10. Massive-scale data management using standards-based solutions

    CERN Document Server

    Shiers, J

    1999-01-01

    In common with many large institutes, CERN has traditionally developed and maintained its own data management solutions. Recently, a significant change of direction has taken place and we have now adopted commercial tools, together with a small amount of site- specific code, for this task. The solutions chosen were originally studied as part of research and development projects oriented towards the Large Hadron Collider (LHC), which is currently under construction at CERN. They have since been adopted not only by the LHC collaborations, which are due to take production data starting in 2005, but also by numerous current experiments, both at CERN and at other High Energy Physics laboratories. Previous experiments, that used data management tools developed in-house, are also studying a possible move to the new environment. To meet the needs of today's experiments, data rates of up to 35 MB/second and data volumes of many hundred TB per experiment must be supported. Data distribution to multiple sites must be pr...

  11. Supply Chain-based Solution to Prevent Fuel Tax Evasion

    Energy Technology Data Exchange (ETDEWEB)

    Franzese, Oscar [ORNL; Capps, Gary J [ORNL; Daugherty, Michael [United States Department of Transportation (USDOT), Federal Highway Administration (FHWA); Siekmann, Adam [ORNL; Lascurain, Mary Beth [ORNL; Barker, Alan M [ORNL

    2016-01-01

    The primary source of funding for the United States transportation system is derived from motor fuel and other highway use taxes. Loss of revenue attributed to fuel-tax evasion has been assessed at approximately $1 billion per year, or approximately 25% of the total tax collected. Any solution that addresses this problem needs to include not only the tax-collection agencies and auditors, but also the carriers transporting oil products and the carriers' customers. This paper presents a system developed by the Oak Ridge National Laboratory for the Federal Highway Administration which has the potential to reduce or eliminate many fuel-tax evasion schemes. The solution balances the needs of tax auditors and those of the fuel-hauling companies and their customers. The technology was deployed and successfully tested during an eight-month period on a real-world fuel-hauling fleet. Day-to-day operations of the fleet were minimally affected by their interaction with this system. The results of that test are discussed in this paper.

  12. An approach to define semantics for BPM systems interoperability

    Science.gov (United States)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The generated semantic model allows aligning an enterprise's business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them to enrich the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in the data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  13. An open, interoperable, and scalable prehospital information technology network architecture.

    Science.gov (United States)

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.
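
    As a concrete illustration of the second core component, an electronic patient care report could be organized and serialized like the hypothetical sketch below. Field names are invented for illustration; a production ePCR would follow standards such as NEMSIS.

```python
# Toy electronic patient care report (ePCR) record of the kind the
# proposed PHIT architecture would organize and archive. All field
# names are hypothetical, not from any real ePCR standard.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class EPCRRecord:
    incident_id: str
    unit: str
    vitals: list = field(default_factory=list)         # (time, name, value)
    interventions: list = field(default_factory=list)  # (time, description)

rec = EPCRRecord(incident_id="2024-000123", unit="Medic-7")
rec.vitals.append(("08:02", "heart_rate", 118))
rec.interventions.append(("08:05", "12-lead ECG transmitted"))

# Serialize for transmission over the ambulance router to the EMS
# information services, and later archival alongside the hospital EHR.
payload = json.dumps(asdict(rec))
print(payload)
```

    Structuring field data this way is what lets later systems correlate field interventions with patient outcomes, which the abstract identifies as a core challenge.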

  14. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  15. Enhancing Science Teaching through Performing Marbling Art Using Basic Solutions and Base Indicators

    Science.gov (United States)

    Çil, Emine; Çelik, Kevser; Maçin, Tuba; Demirbas, Gülay; Gökçimen, Özlem

    2014-01-01

    Basic solutions are an indispensable part of our daily life. Basic solutions are commonly used in industries such as the textile industry, oil refineries, the fertilizer industry, and pharmaceutical products. Most cleaning agents, such as soap, detergent, and bleach, and some of our foods, such as chocolate and eggs, include bases. Bases are the…

  16. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Science.gov (United States)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
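
    A capability profile and a requirements match can be sketched in miniature. The XML element names below are simplified stand-ins, not the normative ISO 16100 schema.

```python
# Miniature capability profile in the spirit of ISO 16100: a software
# unit advertises its manufacturing functions, and a requirement is
# checked against the profile. Element names are illustrative only.
import xml.etree.ElementTree as ET

profile = ET.Element("CapabilityProfile")
ET.SubElement(profile, "UnitName").text = "SchedulerUnit"
funcs = ET.SubElement(profile, "ManufacturingFunctions")
for f in ("JobSequencing", "ResourceAllocation"):
    ET.SubElement(funcs, "Function").text = f

def supports(profile_xml, required):
    """Check whether a profile covers all required manufacturing functions."""
    offered = {e.text for e in profile_xml.iter("Function")}
    return set(required) <= offered

print(supports(profile, ["JobSequencing"]))  # True
```

    Matching requirements to profiles this way is what lets an integrator decide whether two software units can interoperate within a given architecture.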

  17. An Architecture for Semantically Interoperable Electronic Health Records.

    Science.gov (United States)

    Toffanello, André; Gonçalves, Ricardo; Kitajima, Adriana; Puttini, Ricardo; Aguiar, Atualpa

    2017-01-01

    Despite the increasing adoption of electronic health records, the challenge of semantic interoperability remains unsolved. The fact that different parties can exchange messages does not mean they can understand the underlying clinical meaning; semantic interoperability therefore cannot simply be assumed. This work introduces an architecture designed to achieve semantic interoperability, in which organizations that follow different policies may still share medical information through a common infrastructure comparable to an ecosystem, whose organisms are exemplified within the Brazilian scenario. Nonetheless, the proposed approach describes a service-oriented design with modules adaptable to different contexts. We also discuss the establishment of an enterprise service bus to mediate a health infrastructure defined on top of international standards, such as openEHR and IHE. Moreover, we argue that, in order to achieve truly semantic interoperability in a wide sense, a proper profile must be published and maintained.

  18. Mutagenicity of irradiated solutions of nucleic acid bases and nucleosides in Salmonella typhimurium

    International Nuclear Information System (INIS)

    Wilmer, J.; Schubert, J.

    1981-01-01

    Solutions of nucleic acid bases, nucleosides and a nucleotide, saturated with either N₂, N₂O or O₂, were irradiated and tested for mutagenicity towards Salmonella typhimurium, with and without pre-incubation. Irradiated solutions of the nucleic acid bases were all non-mutagenic. Irradiated solutions of the nucleosides showed mutagenicity in S. typhimurium TA100 (pre-incubation assay). Generally, the mutagenicity followed the order: N₂O > N₂ > O₂. The results show that the formation of mutagenic radiolytic products is initiated by radical attack. In irradiated solutions of the nucleotide thymidine-5'-monophosphate, no mutagenicity could be detected. (orig.)

  19. Combination of graph heuristics in producing initial solution of curriculum based course timetabling problem

    Science.gov (United States)

    Wahid, Juliana; Hussin, Naimah Mohd

    2016-08-01

    The construction of a population of initial solutions is a crucial task in population-based metaheuristic approaches for solving the curriculum-based university course timetabling problem, because it can affect the convergence speed and also the quality of the final solution. This paper presents an exploration of combinations of graph heuristics in the construction approach to the curriculum-based course timetabling problem to produce a population of initial solutions. The graph heuristics were set as single heuristics and as combinations of two. In addition, several ways of assigning courses into rooms and timeslots are implemented. All heuristic settings are then tested on the same curriculum-based course timetabling problem instances and compared with each other in terms of the number of initial solutions produced. The results show that the combination of saturation degree followed by largest degree produces the highest number of initial solutions. The results from this study can be used in the improvement phase of algorithms that use a population of initial solutions.
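    The winning combination reported above, saturation degree with largest degree as tie-breaker, can be sketched as a course-ordering rule for a greedy constructor; the conflict graph and timeslot sets below are invented toy data, not instances from the paper.

```python
# Order courses for greedy timeslot assignment: saturation degree first
# (fewest feasible timeslots remaining), ties broken by largest degree
# (most conflicts with other courses). Illustrative sketch only.

def order_courses(conflicts, feasible_slots):
    """conflicts: course -> set of conflicting courses;
    feasible_slots: course -> set of timeslots still available."""
    return sorted(
        conflicts,
        key=lambda c: (len(feasible_slots[c]), -len(conflicts[c]))
    )

conflicts = {"A": {"B", "C"}, "B": {"A"}, "C": {"A"}}
slots = {"A": {1, 2, 3}, "B": {1}, "C": {1, 2}}
print(order_courses(conflicts, slots))  # ['B', 'C', 'A']
```

    Course B, with only one feasible slot left, is scheduled first; the most constrained courses are handled while choices remain.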

  20. The Situation and Solutions of Institutional and Community-Based ...

    African Journals Online (AJOL)

    ... of Institutional and Community-Based Rehabilitation for Persons With Mental and ... regardless of the country and the model, reveals a litany of constraints and ... involvement of all stakeholders in decision making and execution and finally, ...

  1. Business intelligence and capacity planning: web-based solutions.

    Science.gov (United States)

    James, Roger

    2010-07-01

    Income (activity) and expenditure (costs) form the basis of a modern hospital's 'business intelligence'. However, clinical engagement in business intelligence is patchy. This article describes the principles of business intelligence and outlines some recent developments using web-based applications.

  2. Interoperation of World-Wide Production e-Science Infrastructures

    CERN Document Server

    Riedel, M; Soddemann, T; Field, L; Navarro, JP; Casey, J; Litmaath, M; Baud, J; Koblitz, B; Catlett, C; Skow, D; Wang, S; Saeki, Y; Sato, H; Matsuoka, S; Geddes, N

    Many production Grid and e-Science infrastructures have begun to offer services to end-users during the past several years with an increasing number of scientific applications that require access to a wide variety of resources and services in multiple Grids. Therefore, the Grid Interoperation Now—Community Group of the Open Grid Forum—organizes and manages interoperation efforts among those production Grid infrastructures to reach the goal of a world-wide Grid vision on a technical level in the near future. This contribution highlights fundamental approaches of the group and discusses open standards in the context of production e-Science infrastructures.

  3. Improved semantic interoperability for content reuse through knowledge organization systems

    Directory of Open Access Journals (Sweden)

    José Antonio Moreiro González

    2012-04-01

    Full Text Available Knowledge Organization Systems (KOS) are resources designed to improve knowledge interoperability, management and retrieval. As web resources grow, the shortage of suitable KOS becomes evident, with a consequent impact on resource interoperability. KOS are, by definition, complicated and costly tools, both in their creation and in their management, so the reuse of similar organizational structures is a necessary element in this context. The article analyses experiences of KOS reuse and indicates how the new standards address this aspect.

  4. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Alapetite, Alexandre; Holdgaard, Nanna

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the experience centre NaturBornholm in Denmark, using a combination of Bluetooth, GPS and QR-codes. Bluetooth and GPS are used for location-based information and QR-codes are used to convey user preferences.

  5. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Witzner Hansen, Dan; Alapetite, Alexandre Philippe Bernard; Holdgaard, Nanna

    2008-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the experience centre NaturBornholm in Denmark, using a combination of Bluetooth, GPS and QR-codes. Bluetooth and GPS are used for location-based information and QR-codes are used to convey user preferences.

  6. Requirements for and barriers towards interoperable ehealth technology in primary care

    NARCIS (Netherlands)

    Oude Nijeweme-d'Hollosy, Wendeline; van Velsen, Lex Stefan; Huygens, Martine; Hermens, Hermanus J.

    Despite eHealth technology's rapid growth, eHealth applications are rarely embedded within primary care, mostly because systems lack interoperability. This article identifies requirements for, and barriers towards, interoperable eHealth technology from healthcare professionals' perspective -- the

  7. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  8. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  9. Finite element based composite solution for neutron transport problems

    International Nuclear Information System (INIS)

    Mirza, A.N.; Mirza, N.M.

    1995-01-01

    A finite element treatment for solving neutron transport problems is presented. The method employs region-wise discontinuous finite elements for the spatial representation of the neutron angular flux, while spherical harmonics are used for the directional dependence. Composite solutions have been obtained by using different orders of angular approximation in different parts of a system. The method has been successfully implemented for one-dimensional slab and two-dimensional rectangular geometry problems. An overall reduction in the number of nodal coefficients (more than 60% in some cases as compared to conventional schemes) has been achieved without loss of accuracy, with better utilization of computational resources. The method also provides an efficient way of handling physically difficult situations, such as the treatment of voids in duct problems and sharply changing angular flux. It is observed that a great wealth of information about the spatial and directional dependence of the angular flux is obtained much more quickly as compared to the Monte Carlo method, where most of the information is restricted to the locality of immediate interest. (author)

  10. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study

    Directory of Open Access Journals (Sweden)

    Yuan Zhou

    2014-02-01

    Full Text Available Background: The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No study to date, however, has examined the effect of interoperability quantitatively using discrete-event simulation techniques. Objective: To estimate the impact of EHR systems with various levels of interoperability on the day-to-day tasks and operations of ambulatory physician offices. Methods: Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete-event simulation model was constructed to represent patient flows and the clinical and administrative tasks of physicians and staff members. Results: High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. Conclusion: This study suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.
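    The discrete-event simulation approach used in the study can be illustrated with a minimal event-queue sketch of a single provider serving patients; the arrival times and service duration below are invented, not figures from the paper.

```python
import heapq

# Minimal discrete-event simulation: patients arrive at given times and a
# single provider serves each one for a fixed duration, waiting patients
# queue up. Returns total provider busy time.

def simulate(arrivals, service_time):
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)                 # event queue ordered by time
    clock = busy = free_at = 0.0
    while events:
        clock, kind = heapq.heappop(events)
        if kind == "arrival":
            start = max(clock, free_at)   # wait if the provider is busy
            free_at = start + service_time
            busy += service_time
    return busy

print(simulate([0.0, 5.0, 7.0], service_time=10.0))  # 30.0
```

    Real workflow models add task types (lab orders, referrals, prescriptions) and multiple resources, but the event-queue mechanism is the same.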

  11. Spectroscopic Characterization of HAN-Based Liquid Gun Propellants and Nitrate Salt Solutions

    Science.gov (United States)

    1989-01-15

    spectra were recorded of bubbles of a concentrated aqueous nitrate solution, mineral oil, and an aqueous surfactant solution. Polymethacrylic acid ...FTIR spectra of droplets of a concentrated aqueous nitrate salt based solution (LGP1845), of solid particles of polymethacrylic acid packing ... polymethacrylic acid low density packing foam cut to a 3x4 mm rectangle was levitated with a low acoustic power. The sample was easily positioned in the

  12. A method for valuing architecture-based business transformation and measuring the value of solutions architecture

    OpenAIRE

    Slot, R.G.

    2010-01-01

    Enterprise and Solution Architecture are key in today's business environment. It is surprising that the foundation and business case for these activities are nonexistent; the financial value of these activities for the business is largely undetermined. To determine the business value of enterprise and solution architecture, this thesis shows how to measure and quantify, in business terms, the value of enterprise architecture-based business transformation and the value of solution architecture.

  13. A Prototype Ontology Tool and Interface for Coastal Atlas Interoperability

    Science.gov (United States)

    Wright, D. J.; Bermudez, L.; O'Dea, L.; Haddad, T.; Cummins, V.

    2007-12-01

    While significant capacity has been built in the field of web-based coastal mapping and informatics in the last decade, little has been done to take stock of the implications of these efforts or to identify best practice in terms of taking lessons learned into consideration. This study reports on the second of two transatlantic workshops that bring together key experts from Europe, the United States and Canada to examine state-of-the-art developments in coastal web atlases (CWA), based on web enabled geographic information systems (GIS), along with future needs in mapping and informatics for the coastal practitioner community. While multiple benefits are derived from these tailor-made atlases (e.g. speedy access to multiple sources of coastal data and information; economic use of time by avoiding individual contact with different data holders), the potential exists to derive added value from the integration of disparate CWAs, to optimize decision-making at a variety of levels and across themes. The second workshop focused on the development of a strategy to make coastal web atlases interoperable by way of controlled vocabularies and ontologies. The strategy is based on a web service oriented architecture and an implementation of Open Geospatial Consortium (OGC) web services, such as Web Feature Services (WFS) and Web Map Service (WMS). Atlases publish Catalogue Services for the Web (CSW) using ISO 19115 metadata and controlled vocabularies encoded as Uniform Resource Identifiers (URIs). URIs allow the terminology of each atlas to be uniquely identified and facilitate mapping of terminologies using semantic web technologies. A domain ontology was also created to formally represent coastal erosion terminology as a use case, with a test linkage of those terms between the Marine Irish Digital Atlas and the Oregon Coastal Atlas. A web interface is being developed to discover coastal hazard themes in distributed coastal atlases as part of a broader International Coastal
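    The OGC requests such an architecture relies on are plain key-value HTTP queries; the sketch below builds two of them. The base URLs are placeholders, not real atlas endpoints.

```python
from urllib.parse import urlencode

# Build OGC-style GetCapabilities / GetRecords request URLs. Illustrative
# only: endpoint URLs are invented, and real CSW GetRecords calls take
# further parameters (version, resultType, etc.).

def ogc_request(base_url, service, request="GetCapabilities", **extra):
    params = {"SERVICE": service, "REQUEST": request, **extra}
    return base_url + "?" + urlencode(params)

wms = ogc_request("https://example.org/atlas/wms", "WMS")
csw = ogc_request("https://example.org/atlas/csw", "CSW",
                  request="GetRecords", typeNames="csw:Record")
print(wms)
# https://example.org/atlas/wms?SERVICE=WMS&REQUEST=GetCapabilities
```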

  14. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-09-1-0705. Title: "Medical Device Plug-and-Play Interoperability Standards and Technology Leadership". Reporting period: Sept 2016 – 20 Sept 2017. The project promotes patient safety and efficiency through interoperable medical technologies. We played a leadership role on interoperability safety standards (AAMI, AAMI/UL Joint

  15. Solution of the weighted symmetric similarity transformations based on quaternions

    Science.gov (United States)

    Mercan, H.; Akyilmaz, O.; Aydin, C.

    2017-12-01

    A new method through the Gauss-Helmert model of adjustment is presented for the solution of similarity transformations, either 3D or 2D, in the frame of the errors-in-variables (EIV) model. The EIV model assumes that all the variables in the mathematical model are contaminated by random errors. The total least squares estimation technique may be used to solve the EIV model. Accounting for the heteroscedastic uncertainty in both the target and the source coordinates, which is the more common and general case in practice, leads to a more realistic estimation of the transformation parameters. The presented algorithm can handle heteroscedastic transformation problems, i.e., the positions of both the target and the source points may have full covariance matrices. Therefore, there is no limitation such as isotropic or homogeneous accuracy for the reference point coordinates. The developed algorithm takes advantage of the quaternion definition, which uniquely represents a 3D rotation matrix. The transformation parameters: scale, translations, and the quaternion (hence the rotation matrix), along with their covariances, are iteratively estimated with rapid convergence. Moreover, a prior least squares (LS) estimate of the unknown transformation parameters is not required to start the iterations. We also show that the developed method can be used to estimate the 2D similarity transformation parameters by simply treating the problem as a 3D transformation with zero (0) values assigned to the z-components of both target and source points. The efficiency of the new algorithm is demonstrated with numerical examples and comparisons with the results of previous studies that use the same data set. Simulation experiments for the evaluation and comparison of the proposed and the conventional weighted LS (WLS) methods are also presented.
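    The quaternion property the algorithm exploits, that a unit quaternion uniquely encodes a 3D rotation matrix, can be shown in a few lines; the point, scale, and translation values are arbitrary illustration data, not the paper's estimation procedure.

```python
import numpy as np

# Convert a (normalized) quaternion q = (w, x, y, z) to its rotation matrix,
# then apply a similarity transform: target = t + s * R @ source.

def quat_to_rot(w, x, y, z):
    n = np.sqrt(w*w + x*x + y*y + z*z)
    w, x, y, z = w/n, x/n, y/n, z/n          # enforce unit quaternion
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

R = quat_to_rot(1.0, 0.0, 0.0, 0.0)          # identity rotation
p = np.array([1.0, 2.0, 3.0])                # a source point
target = 0.5 * R @ p + np.array([10.0, 0.0, 0.0])  # scale 0.5, translate in x
print(target)  # approx [10.5, 1.0, 1.5]
```

    The 2D case mentioned in the abstract follows by setting the z-components of all points to zero.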

  16. Location-based solutions in the experience center

    DEFF Research Database (Denmark)

    Witzner Hansen, Dan; Alapetite, Alexandre; Holdgaard, Nanna

    2009-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the NaturBornholm [1] experience centre in Denmark, using a combination of Bluetooth, Near Field Communication (NFC), GPS and QR-codes. Bluetooth, NFC, and GPS are used for location-based information and QR-codes are used to convey user preferences. [1] http://naturbornholm.dk

  17. Location-based solutions in the Experience centre

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Alapetite, Alexandre; Holdgaard, Nanna

    2009-01-01

    In this paper we present a prototype system for location-based guiding. A user survey has been conducted and the observations are used to support design choices. The prototype allows for both indoor and outdoor navigation at and in the vicinity of the NaturBornholm experience centre in Denmark, using a combination of Bluetooth, Near Field Communication (NFC), GPS and QR codes. Bluetooth, NFC and GPS are used for location-based information and QR codes are used to convey user preferences.

  18. People counting with stereo cameras : two template-based solutions

    NARCIS (Netherlands)

    Englebienne, Gwenn; van Oosterhout, Tim; Kröse, B.J.A.

    2012-01-01

    People counting is a challenging task with many applications. We propose a method with a fixed stereo camera that is based on projecting a template onto the depth image. The method was tested on a challenging outdoor dataset with good results and runs in real time.

  19. Studying boat-based bear viewing: Methodological challenges and solutions

    Science.gov (United States)

    Sarah Elmeligi

    2007-01-01

    Wildlife viewing, a growing industry throughout North America, holds much potential for increased revenue and public awareness regarding species conservation. In Alaska and British Columbia, grizzly bear (Ursus arctos) viewing is becoming more popular, attracting tourists from around the world. Viewing is typically done from a land-based observation...

  20. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2013-05-01

    Full Text Available In order to maintain a competitive edge in a very active banking market, a company requires a web-based solution to standardize, optimize and manage the flow of sales and pre-sales activities and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales-process automation in banking. The paper focuses on the requirements for security and confidentiality of stored data and on the techniques and procedures identified to implement these requirements.
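    One standard measure behind the confidentiality requirements the record alludes to is parameterized queries, which keep untrusted input out of SQL text; the sketch below is generic illustration with an invented schema, not the paper's implementation.

```python
import sqlite3

# Parameterized queries: user input is bound as data, never spliced into
# the SQL string, so injection attempts match nothing.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (id INTEGER PRIMARY KEY, client TEXT)")
conn.execute("INSERT INTO leads (client) VALUES (?)", ("Acme Bank",))

user_input = "Acme Bank' OR '1'='1"          # a typical injection attempt
rows = conn.execute(
    "SELECT client FROM leads WHERE client = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the malicious string is treated as a literal value
```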

  1. Using the Google API for a PHP-Based Web Interoperability Model with Google Drive

    OpenAIRE

    Sumiari, Ni Kadek

    2015-01-01

    In a website, achieving the interoperability of a system is very important. The use of databases based on MySQL, SQL Server or Oracle is already very common in website-based systems. However, the use of such databases cannot guarantee that the interoperability of the system will be achieved. Apart from data security, implementation of the system is also quite difficult. One solution for achieving the interoperability of a website-based system is...

  2. Interoperability of Services in an Open Broadband Market : Cases from the Netherlands

    NARCIS (Netherlands)

    Burgmeijer, J.

    2006-01-01

    End-to-end interoperability of broadband services and networks is a condition for an open broadband market. A business model for broadband service interoperability is given. Two cases from the Netherlands, of initiatives from the market to reach interoperability, are presented: E-norm and FIST VoIP.

  3. The exact solutions and approximate analytic solutions of the (2 + 1)-dimensional KP equation based on symmetry method.

    Science.gov (United States)

    Gai, Litao; Bilige, Sudao; Jie, Yingmo

    2016-01-01

    In this paper, we successfully obtained the exact solutions and the approximate analytic solutions of the (2 + 1)-dimensional KP equation based on the Lie symmetry, the extended tanh method and the homotopy perturbation method. In the first part, we obtained the symmetries of the (2 + 1)-dimensional KP equation based on the Wu-differential characteristic set algorithm and reduced it. In the second part, we constructed abundant exact travelling-wave solutions by using the extended tanh method. These solutions are expressed by hyperbolic functions, trigonometric functions and rational functions respectively. It should be noted that when the parameters take special values, some solitary wave solutions are derived from the hyperbolic function solutions. Finally, we apply the homotopy perturbation method to obtain the approximate analytic solutions based on four kinds of initial conditions.
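    For orientation, the KP equation in a standard form, and the shape of the extended tanh ansatz the abstract refers to, are shown below; the paper may use a different normalization, and the travelling-wave variable is written here in a common convention.

```latex
% Standard form of the (2+1)-dimensional KP equation (normalizations vary):
\[ \left( u_t + 6\, u\, u_x + u_{xxx} \right)_x + 3\,\sigma^2\, u_{yy} = 0,
   \qquad \sigma^2 = \pm 1 . \]
% Extended tanh ansatz in a travelling-wave variable \xi = kx + ly - ct:
\[ u(\xi) = a_0 + \sum_{i=1}^{m} \left( a_i \tanh^{i}\xi + b_i \tanh^{-i}\xi \right). \]
```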

  4. Image Based Solution to Occlusion Problem for Multiple Robots Navigation

    Directory of Open Access Journals (Sweden)

    Taj Mohammad Khan

    2012-04-01

    Full Text Available In machine vision, the occlusion problem is always a challenging issue in image-based mapping and navigation tasks. This paper presents a multiple-view vision-based algorithm for the development of an occlusion-free map of an indoor environment. The map is assumed to be utilized by mobile robots within the workspace. It has a wide range of applications, including mobile robot path planning and navigation, access control in restricted areas, and surveillance systems. We used a wall-mounted fixed camera system. After intensity adjustment and background subtraction of the synchronously captured images, image registration was performed. We applied our algorithm on the registered images to resolve the occlusion problem. This technique works well even in the presence of total occlusion for a longer period.
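    The background-subtraction step mentioned above can be illustrated with a toy thresholded-difference mask; real pipelines add intensity adjustment, registration, and noise handling, and the frame data here is synthetic.

```python
import numpy as np

# Pixels differing from the background model beyond a threshold are marked
# foreground. Toy 4x4 "images" stand in for camera frames.

def foreground_mask(frame, background, threshold=25):
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff > threshold

background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200                         # a bright "object" appears
mask = foreground_mask(frame, background)
print(int(mask.sum()))  # 4 foreground pixels
```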

  5. Datacube Interoperability, Encoding Independence, and Analytics

    Science.gov (United States)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistical and OLAP datacubes, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally are included in the concept of coverages which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS) which is an interoperable, yet format independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the abstract authors and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE has adopted WCS as Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both OGC and INSPIRE Reference Implementation. In the global EarthServer initiative rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier well in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are values?), a range set (what are these values?), and range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled
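    The WCPS analytics language mentioned above expresses datacube queries declaratively; the sketch below assembles one such query as a string for illustration, with the coverage name and subset values invented rather than taken from a real EarthServer deployment.

```python
# An illustrative WCPS query: trim a (hypothetical) spatio-temporal coverage
# to a lat/long box and one month, and encode the slice as GeoTIFF.
wcps_query = """
for c in (AvgLandTemp)
return encode(
  c[Lat(40:50), Long(0:10), ansi("2014-07")],
  "image/tiff")
""".strip()

print(wcps_query.splitlines()[0])  # for c in (AvgLandTemp)
```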

  6. Distributed Cooperation Solution Method of Complex System Based on MAS

    Science.gov (United States)

    Weijin, Jiang; Yuhui, Xu

    To adapt the fault-diagnosis model to dynamic environments and to fully meet the needs of solving the tasks of a complex system, this paper introduces multi-agent technology into complicated fault diagnosis, and an integrated intelligent control system is studied. Based on the idea of structuring diagnostic decisions hierarchically in modeling, and on a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge-representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agent and decision agent are analyzed; the organization and evolution of agents in the system are proposed, and the corresponding conflict-resolution algorithm is given. A layered structure of abstract agents with public attributes is built, and the system architecture is realized on a MAS distributed layered blackboard. A real-world application shows that the proposed control structure successfully solves the fault-diagnosis problem of a complex plant, with a special advantage in the distributed domain.

  7. Porous materials based on foaming solutions obtained from industrial waste

    Science.gov (United States)

    Starostina, I. V.; Antipova, A. N.; Ovcharova, I. V.; Starostina, Yu L.

    2018-03-01

    This study analyzes foam concrete production efficiency. Research has shown the possibility of using a newly-designed protein-based foaming agent to produce porous materials using gypsum and cement binders. The protein foaming agent is obtained by alkaline hydrolysis of a raw mixture consisting of industrial waste in an electromagnetic field. The mixture consists of spent biomass of the Aspergillus niger fungus and dust from burning furnaces used in cement production. Varying the content of the foaming agent allows obtaining gypsum binder-based foam concretes with a density of 200-500 kg/m3 and a compressive strength of 0.1-1.0 MPa, which can be used for thermal and sound insulation of building interiors. Cement binders were used to obtain structural and thermal insulation materials with a density of 300-950 kg/m3 and a compressive strength of 0.9-9.0 MPa. The maximum operating temperature of cement-based foam concretes is 500°C, at which shrinkage remains below 2%.

  8. Solution-Focused Therapy: Strength-Based Counseling for Children with Social Phobia

    Science.gov (United States)

    George, Cindy M.

    2008-01-01

    Solution-focused therapy is proposed as an effective strength-based model for children with social phobia. Social phobia is described along with the etiology and prevailing treatment approaches. A case illustration demonstrates the application of solution-focused therapy with a child who experienced social phobia. Implications for counseling and…

  9. Solution immersed silicon (SIS)-based biosensors: a new approach in biosensing.

    Science.gov (United States)

    Diware, M S; Cho, H M; Chegal, W; Cho, Y J; Jo, J H; O, S W; Paek, S H; Yoon, Y H; Kim, D

    2015-02-07

    A novel, solution immersed silicon (SIS)-based sensor has been developed which employs the non-reflecting condition (NRC) for a p-polarized wave. The SIS sensor's response is almost independent of change in the refractive index (RI) of a buffer solution (BS) which makes it capable of measuring low-concentration and/or low-molecular-weight compounds.

  10. An integrable, web-based solution for easy assessment of video-recorded performances

    DEFF Research Database (Denmark)

    Subhi, Yousif; Todsen, Tobias; Konge, Lars

    2014-01-01

    , and access to this information should be restricted to select personnel. A local software solution may also ease the need for customization to local needs and integration into existing user databases or project management software. We developed an integrable web-based solution for easy assessment of video...

  11. Interoperable Access to NCAR Research Data Archive Collections

    Science.gov (United States)

    Schuster, D.; Ji, Z.; Worley, S. J.; Manross, K.

    2014-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA) provides free access to 600+ observational and gridded dataset collections. The RDA is designed to support atmospheric and related sciences research, updated frequently where datasets have ongoing production, and serves data to 10,000 unique users annually. The traditional data access options include web-based direct archive file downloads, user selected data subsets and format conversions produced by server-side computations, and client and cURL-based APIs for routine scripted data retrieval. To enhance user experience and utility, the RDA now also offers THREDDS Data Server (TDS) access for many highly valued dataset collections. TDS offered datasets are presented as aggregations, enabling users to access an entire dataset collection, that can be comprised of 1000's of files, through a single virtual file. The OPeNDAP protocol, supported by the TDS, allows compatible tools to open and access these virtual files remotely, and make the native data file format transparent to the end user. The combined functionality (TDS/OPeNDAP) gives users the ability to browse, select, visualize, and download data from a complete dataset collection without having to transfer archive files to a local host. This presentation will review the TDS basics and describe the specific TDS implementation on the RDA's diverse archive of GRIB-1, GRIB-2, and gridded NetCDF formatted dataset collections. Potential future TDS implementation on in-situ observational dataset collections will be discussed. Illustrative sample cases will be used to highlight the end users benefits from this interoperable data access to the RDA.
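    The OPeNDAP access described above lets a client request only the variables and index ranges it needs by appending a constraint expression to the dataset URL; the sketch below builds such a URL from pure strings, with the aggregation path and variable name invented for illustration.

```python
# Build an OPeNDAP (DAP) subset URL against a hypothetical TDS aggregation.
# A client appends ".dods" plus a constraint expression such as
# ?VAR[start:stop][start:stop] to fetch only the needed slice.

BASE = "https://example.org/thredds/dodsC/aggregations/ds-example"  # invented

def subset_url(base, var, *slices):
    idx = "".join(f"[{a}:{b}]" for a, b in slices)
    return f"{base}.dods?{var}{idx}"

url = subset_url(BASE, "TMP", (0, 10), (100, 200))
print(url)
# https://example.org/thredds/dodsC/aggregations/ds-example.dods?TMP[0:10][100:200]
```

    Tools that speak OPeNDAP issue requests of this shape under the hood, so the archive file layout stays transparent to the end user.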

  12. Whispering Gallery Mode Based Optical Fiber Sensor for Measuring Concentration of Salt Solution

    Directory of Open Access Journals (Sweden)

    Chia-Chin Chiang

    2013-01-01

    Full Text Available An optical fiber solution-concentration sensor based on whispering gallery modes (WGM) is proposed in this paper. The WGM solution-concentration sensors were used to measure salt solutions with concentrations ranging from 1% to 25%, over which the resonance wavelength drifted from left to right (toward longer wavelengths). The experimental results showed an average sensitivity of approximately 0.372 nm/% and an R2 linearity of 0.8835. The proposed WGM sensors are low cost, feasible for mass production, and durable for solution-concentration sensing.
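    With the reported average sensitivity, a linear calibration recovers the concentration change from a measured wavelength shift; the shift value below is an invented example, not a measurement from the paper.

```python
# Back-of-envelope linear calibration using the reported sensitivity:
# concentration change (%) = wavelength shift (nm) / sensitivity (nm/%).

SENSITIVITY_NM_PER_PERCENT = 0.372   # average value from the abstract

def concentration_change(shift_nm, sensitivity=SENSITIVITY_NM_PER_PERCENT):
    return shift_nm / sensitivity

print(round(concentration_change(3.72), 2))  # 10.0 (% salt)
```

    The reported R2 of 0.8835 suggests the linear fit is only approximate, so such estimates carry corresponding uncertainty.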

  13. Double Marginalization in Performance-Based Advertising: Implications and Solutions

    OpenAIRE

    Chrysanthos Dellarocas

    2012-01-01

    An important current trend in advertising is the replacement of traditional pay-per-exposure (pay-per-impression) pricing models with performance-based mechanisms in which advertisers pay only for measurable actions by consumers. Such pay-per-action (PPA) mechanisms are becoming the predominant method of selling advertising on the Internet. Well-known examples include pay-per-click, pay-per-call, and pay-per-sale. This work highlights an important, and hitherto unrecognized, side effect of PP...

  14. Analytical solution of a stochastic content-based network model

    International Nuclear Information System (INIS)

    Mungan, Muhittin; Kabakoglu, Alkan; Balcan, Duygu; Erzan, Ayse

    2005-01-01

    We define and completely solve a content-based directed network whose nodes consist of random words and an adjacency rule involving perfect or approximate matches for an alphabet with an arbitrary number of letters. The analytic expression for the out-degree distribution shows a crossover from a leading power law behaviour to a log-periodic regime bounded by a different power law decay. The leading exponents in the two regions have a weak dependence on the mean word length, and an even weaker dependence on the alphabet size. The in-degree distribution, on the other hand, is much narrower and does not show any scaling behaviour

  15. Fingerprint Sensors: Liveness Detection Issue and Hardware based Solutions

    Directory of Open Access Journals (Sweden)

    Shahzad Memon

    2012-01-01

    Full Text Available Securing an automated and unsupervised fingerprint recognition system is one of the most critical and challenging tasks in government and commercial applications. In these systems, the detection of liveness of a finger placed on a fingerprint sensor is a major issue that needs to be addressed in order to ensure the credibility of the system. The main focus of this paper is to review the existing fingerprint sensing technologies in terms of liveness detection and discusses hardware based ‘liveness detection’ techniques reported in the literature for automatic fingerprint biometrics.

  16. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  17. Solutions for wood-based bio-energy price discovery

    Energy Technology Data Exchange (ETDEWEB)

    Teraes, Timo [FOEX Indexes Ltd., Helsinki (Finland)], e-mail: timo@foex.fi

    2012-11-01

    Energy prices are highly volatile. This volatility can have serious ill effects on the profitability of companies engaged in the energy business. There are, however, a number of price risk management tools which can be used to reduce the problems caused by price volatility. International trade in wood pellets and wood chips is growing rapidly. Good price transparency helps in developing the trade further. In order to meet the renewable energy targets within the EU, further growth of volumes is needed, at least within Europe and from overseas supply sources to the European markets. Reliable price indices are a central element in price risk management and in general price discovery. Exchanges have provided, in the past, the most widely known price discovery systems. Since the 1990s, an increasing number of price risk management tools have been based on the cash settlement concept. Cash settlement requires high-quality benchmark price indices. These have been developed by the exchanges themselves, by the trade press and by independent price benchmark provider companies. The best known of these benchmarks in the forest industry, and now also in wood-based bioenergy products, are the PIX indices, provided by FOEX Indexes Ltd. This presentation discusses the key requirements for a good price index and the different ways of using the indices. Price relationships between wood chip prices and pellet prices are also discussed, as is the outlook for future volume growth and trade flows in wood chips and pellets, mainly from the European perspective.

  18. Waveform Diversity and Design for Interoperating Radar Systems

    Science.gov (United States)

    2013-01-01

    Università di Pisa, Dipartimento di Ingegneria dell'Informazione (Elettronica, Informatica, Telecomunicazioni), Via Girolamo Caruso 16, 56122 Pisa, Italy.

  19. Managing Uncertainty: The Road Towards Better Data Interoperability

    NARCIS (Netherlands)

    Herschel, M.; van Keulen, Maurice

    Data interoperability encompasses the many data management activities needed for effective information management in anyone's or any organization's everyday work such as data cleaning, coupling, fusion, mapping, and information extraction. It is our conviction that a significant amount of money and

  20. Look who's talking. A guide to interoperability groups and resources.

    Science.gov (United States)

    2011-06-01

    There are huge challenges in getting medical devices to communicate with other devices and to information systems. Fortunately, a number of groups have emerged to help hospitals cope. Here's a description of the most prominent ones, including useful web links for each. We also discuss the latest and most pertinent interoperability standards.

  1. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    Science.gov (United States)

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  2. Information and documentation - Thesauri and interoperability with other vocabularies

    DEFF Research Database (Denmark)

    Lykke, Marianne; Dalbin, Sylvie; Smedt, Johan De

    ISO 25964-2:2013 is applicable to thesauri and other types of vocabulary that are commonly used for information retrieval. It describes, compares and contrasts the elements and features of these vocabularies that are implicated when interoperability is needed. It gives recommendations for the est...

  3. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Many enterprises still face the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium-sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  4. Ontologies for interaction : enabling serendipitous interoperability in smart environments

    NARCIS (Netherlands)

    Niezen, G.

    2012-01-01

    The thesis describes the design and development of an ontology and software framework to support user interaction in ubiquitous computing scenarios. The key goal of ubiquitous computing is "serendipitous interoperability", where devices that were not necessarily designed to work together should be

  5. Enterprise interoperability with SOA: a survey of service composition approaches

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; Goncalves da Silva, Eduardo; van Sinderen, Marten J.; Quartel, Dick; Ferreira Pires, Luis

    Service-oriented architecture (SOA) claims to facilitate the construction of flexible and loosely coupled business applications, and therefore is seen as an enabling factor for enterprise interoperability. The concept of service, which is central to SOA, is very convenient to address the matching of

  6. Towards Cross-Organizational Innovative Business Process Interoperability Services

    Science.gov (United States)

    Karacan, Ömer; Del Grosso, Enrico; Carrez, Cyril; Taglino, Francesco

    This paper presents the vision and initial results of the COIN (FP7-IST-216256) European project for the development of open source Collaborative Business Process Interoperability (CBPip) in cross-organisational business collaboration environments following the Software-as-a-Service Utility (SaaS-U) paradigm.

  7. The MADE reference information model for interoperable pervasive telemedicine systems

    NARCIS (Netherlands)

    Fung, L.S.N.; Jones, Valerie M.; Hermens, Hermanus J.

    2017-01-01

    Objectives: The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the

  8. 47 CFR 0.192 - Emergency Response Interoperability Center.

    Science.gov (United States)

    2010-10-01

    47 CFR 0.192, Emergency Response Interoperability Center (Title 47 Telecommunication, Federal Communications Commission, General Commission Organization; revised as of 2010-10-01), involving …, industry representatives, and service providers. [75 FR 28207, May 20, 2010]

  9. Evanescent Wave Absorption Based Fiber Sensor for Measuring Glucose Solution Concentration

    Science.gov (United States)

    Marzuki, Ahmad; Candra Pratiwi, Arni; Suryanti, Venty

    2018-03-01

    An optical fiber sensor based on evanescent wave absorption, designed for measuring glucose solution concentration, is proposed. The sensor was made to detect absorbance at various wavelengths in the glucose solution. The sensing element was fabricated by side-polishing a multimode polymer optical fiber to form a D-shape, and was immersed in different concentrations of glucose solution. As light propagated through the optical fiber, the evanescent wave interacted with the glucose solution and light was absorbed by it. The larger the concentration of the glucose solution, the more the evanescent wave was absorbed at particular wavelengths. In this paper, light absorption was measured as a function of glucose concentration and of wavelength (the color of the LED). We have shown that the proposed sensor demonstrates an increase of light absorption as a function of glucose concentration.
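The sensing principle reduces to a Beer-Lambert-type relation: absorbance A = log10(I0/I) grows as more of the evanescent field is absorbed by the glucose. A minimal sketch under that assumption; the detector intensity values are invented for illustration:

```python
import math

# Sketch: higher glucose concentration -> stronger evanescent-wave
# absorption -> lower transmitted intensity -> higher absorbance.

def absorbance(i_incident, i_transmitted):
    """Beer-Lambert-style absorbance from incident/transmitted intensity."""
    return math.log10(i_incident / i_transmitted)

# Illustrative detector readings (% glucose -> transmitted counts at one LED color).
readings = {0.0: 1000.0, 5.0: 900.0, 10.0: 810.0}
for conc, i_t in readings.items():
    print(conc, round(absorbance(1000.0, i_t), 3))
```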

  10. Plant-based solutions for veterinary immunotherapeutics and prophylactics.

    Science.gov (United States)

    Kolotilin, Igor; Topp, Ed; Cox, Eric; Devriendt, Bert; Conrad, Udo; Joensuu, Jussi; Stöger, Eva; Warzecha, Heribert; McAllister, Tim; Potter, Andrew; McLean, Michael D; Hall, J Christopher; Menassa, Rima

    2014-12-31

    An alarming increase in emergence of antibiotic resistance among pathogens worldwide has become a serious threat to our ability to treat infectious diseases according to the World Health Organization. Extensive use of antibiotics by livestock producers promotes the spread of new resistant strains, some of zoonotic concern, which increases food-borne illness in humans and causes significant economic burden on healthcare systems. Furthermore, consumer preferences for meat/poultry/fish produced without the use of antibiotics shape today's market demand. So, it is viewed as inevitable by the One Health Initiative that humans need to reduce the use of antibiotics and turn to alternative, improved means to control disease: vaccination and prophylactics. Besides the intense research focused on novel therapeutic molecules, both these strategies rely heavily on the availability of cost-effective, efficient and scalable production platforms which will allow large-volume manufacturing for vaccines, antibodies and other biopharmaceuticals. Within this context, plant-based platforms for production of recombinant therapeutic proteins offer significant advantages over conventional expression systems, including lack of animal pathogens, low production costs, fast turnaround and response times and rapid, nearly-unlimited scalability. Also, because dried leaves and seeds can be stored at room temperature for lengthy periods without loss of recombinant proteins, plant expression systems have the potential to offer lucrative benefits from the development of edible vaccines and prophylactics, as these would not require "cold chain" storage and transportation, and could be administered in mass volumes with minimal processing. Several biotechnology companies currently have developed and adopted plant-based platforms for commercial production of recombinant protein therapeutics. 
In this manuscript, we outline the challenges in the process of livestock immunization as well as the current

  11. Managing and delivering of 3D geo data across institutions has a web based solution - intermediate results of the project GeoMol.

    Science.gov (United States)

    Gietzel, Jan; Schaeben, Helmut; Gabriel, Paul

    2014-05-01

    The increasing relevance of geological information for policy and economy at the transnational level has recently been recognized by the European Commission, which has called for harmonized information related to reserves and resources in the EU Member States. GeoMol's transnational approach responds to that, providing consistent and seamless 3D geological information of the Alpine Foreland Basins based on harmonized data and agreed methodologies. However, until recently no adequate tool existed to ensure full interoperability among the involved GSOs and to distribute the multi-dimensional information of a transnational project facing diverse data policies, database systems and software solutions. In recent years (open) standards describing 2D spatial data have been developed and implemented in different software systems, including production environments for 2D spatial data (like regular 2D GI systems). Easy yet secured access to the data is of utmost importance and thus a priority for any spatial data infrastructure. To overcome limitations conditioned by highly sophisticated and platform-dependent geo-modeling software packages, the functionalities of a web portal can be utilized. Thus, combining a web portal with a "check-in-check-out" system allows distributed, organized editing of data and models, but requires standards for the exchange of 3D geological information to ensure interoperability. Another major concern is the management of large models and the ability of 3D tiling into spatially restricted models with refined resolution, especially when creating countrywide models. Using GST ("Geosciences in Space and Time"), developed initially at TU Bergakademie Freiberg and continuously extended by the company GiGa infosystems, incorporating these key issues and based on an object-relational data model, it is possible to check out parts of or whole models for edits and check them in again after modification.
GST is the core of GeoMol's web-based collaborative environment designed to

  12. Evaluation of a Web-based Online Grant Application Review Solution

    Directory of Open Access Journals (Sweden)

    Marius Daniel PETRISOR

    2013-12-01

    Full Text Available This paper focuses on the evaluation of a web-based application used in grant application evaluations, software developed in our university, and underlines the need for simple solutions, based on recent technology, specifically tailored to one’s needs. We asked the reviewers to answer a short questionnaire in order to assess their satisfaction with such a web-based grant application evaluation solution. All 20 reviewers accepted to answer the questionnaire, which contained 8 closed items (YES/NO answers) related to the reviewer’s previous experience in evaluating grant applications, previous use of such software solutions, and familiarity with using computer systems. The presented web-based application, evaluated by the users, showed a high level of acceptance, and respondents stated that they are willing to use such a solution in the future.

  13. Thermodynamics of hydrogen bonding and van der Waals interactions of organic solutes in solutions of imidazolium based ionic liquids: “Structure-property” relationships

    Energy Technology Data Exchange (ETDEWEB)

    Varfolomeev, Mikhail A., E-mail: vma.ksu@gmail.com; Khachatrian, Artashes A.; Akhmadeev, Bulat S.; Solomonov, Boris N.

    2016-06-10

    Highlights: • Solution enthalpies of organic solutes in imidazolium based ionic liquids were measured. • A van der Waals interactions scale of imidazolium based ionic liquids was proposed. • Enthalpies of solvation of organic solutes in ionic liquids were determined. • Hydrogen bond enthalpies of organic solutes with ionic liquids were calculated. • Relationships between the structure of ionic liquids and thermochemical data were obtained. - Abstract: In the present work the thermochemistry of intermolecular interactions of organic compounds in solutions of imidazolium based ionic liquids (ILs) has been studied using the solution calorimetry method. Enthalpies of solution at infinite dilution of non-polar (alkanes, aromatic hydrocarbons) and polar (alcohols, amides, etc.) organic solutes in two ionic liquids, 1-butyl-3-methylimidazolium tetrafluoroborate and 1-butyl-3-methylimidazolium trifluoromethanesulfonate, were measured at 298.15 K. A scale of van der Waals interactions of imidazolium based ILs has been proposed on the basis of the solution enthalpies of n-alkanes in their media. The effect of the cation and anion structure of ILs on the enthalpies of solvation was analyzed. Enthalpies of hydrogen bonding of organic solutes with imidazolium based ILs were determined. It has been shown that these values are close to zero for proton acceptor solutes. At the same time, enthalpies of hydrogen bonding of proton donor solutes with ionic liquids increase depending on the anion: tetrafluoroborate ≈ bis(trifluoromethylsulfonyl)imide < 2-(2-methoxyethoxy)ethyl sulfate < trifluoromethanesulfonate. Enthalpies of van der Waals interactions and hydrogen bonding in the solutions of imidazolium based ionic liquids were compared with the same data for molecular solvents.

  14. Thermodynamics of hydrogen bonding and van der Waals interactions of organic solutes in solutions of imidazolium based ionic liquids: “Structure-property” relationships

    International Nuclear Information System (INIS)

    Varfolomeev, Mikhail A.; Khachatrian, Artashes A.; Akhmadeev, Bulat S.; Solomonov, Boris N.

    2016-01-01

    Highlights: • Solution enthalpies of organic solutes in imidazolium based ionic liquids were measured. • A van der Waals interactions scale of imidazolium based ionic liquids was proposed. • Enthalpies of solvation of organic solutes in ionic liquids were determined. • Hydrogen bond enthalpies of organic solutes with ionic liquids were calculated. • Relationships between the structure of ionic liquids and thermochemical data were obtained. - Abstract: In the present work the thermochemistry of intermolecular interactions of organic compounds in solutions of imidazolium based ionic liquids (ILs) has been studied using the solution calorimetry method. Enthalpies of solution at infinite dilution of non-polar (alkanes, aromatic hydrocarbons) and polar (alcohols, amides, etc.) organic solutes in two ionic liquids, 1-butyl-3-methylimidazolium tetrafluoroborate and 1-butyl-3-methylimidazolium trifluoromethanesulfonate, were measured at 298.15 K. A scale of van der Waals interactions of imidazolium based ILs has been proposed on the basis of the solution enthalpies of n-alkanes in their media. The effect of the cation and anion structure of ILs on the enthalpies of solvation was analyzed. Enthalpies of hydrogen bonding of organic solutes with imidazolium based ILs were determined. It has been shown that these values are close to zero for proton acceptor solutes. At the same time, enthalpies of hydrogen bonding of proton donor solutes with ionic liquids increase depending on the anion: tetrafluoroborate ≈ bis(trifluoromethylsulfonyl)imide < 2-(2-methoxyethoxy)ethyl sulfate < trifluoromethanesulfonate. Enthalpies of van der Waals interactions and hydrogen bonding in the solutions of imidazolium based ionic liquids were compared with the same data for molecular solvents.

  15. Solution processed nanogap organic diodes based on liquid crystalline materials

    Science.gov (United States)

    Wang, Yi-Fei; Iino, Hiroaki; Hanna, Jun-ichi

    2017-09-01

    Co-planar nanogap organic diodes were fabricated with smectic liquid crystalline materials of the benzothienobenzothiophene (BTBT) derivative by a spin-coating technique. A high rectification ratio of the order of 10^6 at ±3 V was achieved when the liquid crystalline material 2,7-didecyl benzothieno[3,2-b][1]benzothiophene (10-BTBT-10) was used in a device configuration of Al/10-BTBT-10/pentafluorobenzenethiol-treated Au on a glass substrate, which was four orders of magnitude higher than that of devices based on the non-liquid crystalline materials 2,7-dibutyl benzothieno[3,2-b][1]benzothiophene (4-BTBT-4) and BTBT. Similar results were also observed when another liquid crystalline material, ω,ω'-dioctylterthiophene (8-TTP-8), and a non-liquid crystalline material, terthiophene (TTP), were used. These improved rectifications can be ascribed to the self-assembly properties and controllable molecular orientation of liquid crystalline materials, which yield uniform, perpendicularly oriented polycrystalline films favorable for superior charge transport in nano-channels.

  16. Climate Solutions based on advanced scientific discoveries of Allatra physics

    Directory of Open Access Journals (Sweden)

    Vershigora Valery

    2016-01-01

    Full Text Available Global climate change is one of the most important international problems of the 21st century. The overall rapid increase in the dynamics of cataclysms observed in recent decades is particularly alarming. How do modern scientists predict the occurrence of certain events? In meteorology, unusually powerful cumulonimbus clouds are one of the main conditions for the emergence of a tornado. These, in turn, are formed during the invasion of cold air over an overheated land surface. The satellite captures the cloud front and, based on these pictures, scientists make assumptions about the possibility of occurrence of the respective natural phenomena. In fact, mankind visually observes and draws conclusions about the consequences of physical phenomena which have already taken place in the invisible world, so the conclusions of scientists are assumptions by nature rather than precise knowledge of the causes of the origin of these phenomena in the physics of the microcosm. The latest research in the field of particle physics and neutrino astrophysics, conducted by a working team of scientists of the ALLATRA International Public Movement (hereinafter the ALLATRA SCIENCE group; allatra-science.org, last accessed 10 April 2016), offers increased opportunities for advanced fundamental and applied research in climatic engineering.

  17. The development of clinical document standards for semantic interoperability in china.

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong; Wan, Yi; Tu, Haibo; Tang, Xuejun; Hu, Jianping

    2011-12-01

    This study is aimed at developing a set of data groups (DGs) to be employed as reusable building blocks for the construction of the eight most common clinical documents used in China's general hospitals in order to achieve their structural and semantic standardization. The Diagnostics knowledge framework, the related approaches taken from the Health Level Seven (HL7), the Integrating the Healthcare Enterprise (IHE), and the Healthcare Information Technology Standards Panel (HITSP) and 1,487 original clinical records were considered together to form the DG architecture and data sets. The internal structure, content, and semantics of each DG were then defined by mapping each DG data set to a corresponding Clinical Document Architecture data element and matching each DG data set to the metadata in the Chinese National Health Data Dictionary. By using the DGs as reusable building blocks, standardized structures and semantics regarding the clinical documents for semantic interoperability were able to be constructed. Altogether, 5 header DGs, 48 section DGs, and 17 entry DGs were developed. Several issues regarding the DGs, including their internal structure, identifiers, data set names, definitions, length and format, data types, and value sets, were further defined. Standardized structures and semantics regarding the eight clinical documents were structured by the DGs. This approach of constructing clinical document standards using DGs is a feasible standard-driven solution useful in preparing documents possessing semantic interoperability among the disparate information systems in China. These standards need to be validated and refined through further study.
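The "reusable building block" idea above can be made concrete with a small data model: documents are assembled from header, section, and entry data groups, each carrying a named data set. The field names, identifiers, and example DGs below are invented for illustration; the actual standard defines 5 header, 48 section, and 17 entry DGs with mapped CDA elements and dictionary metadata:

```python
# Sketch: data groups (DGs) as reusable building blocks for assembling
# standardized clinical documents. Identifiers and element names here
# are hypothetical, not the ones defined by the Chinese standard.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataGroup:
    identifier: str      # unique DG identifier (hypothetical scheme)
    kind: str            # "header", "section", or "entry"
    data_set: List[str]  # named data elements belonging to this DG

@dataclass
class ClinicalDocument:
    title: str
    groups: List[DataGroup] = field(default_factory=list)

# Assemble a toy discharge summary from two reusable DGs.
patient_header = DataGroup("DG-H01", "header", ["name", "sex", "date_of_birth"])
diagnosis_section = DataGroup("DG-S07", "section", ["diagnosis_code", "diagnosis_name"])
doc = ClinicalDocument("Discharge Summary", [patient_header, diagnosis_section])
print([g.identifier for g in doc.groups])
```

Because each DG's structure and semantics are defined once (and mapped to CDA data elements and dictionary metadata), any document built from the same DGs is structurally and semantically comparable across systems.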

  18. A NOVEL RHODAMINE-BASED FLUORESCENCE CHEMOSENSOR CONTAINING POLYETHER FOR MERCURY (II) IONS IN AQUEOUS SOLUTION

    Directory of Open Access Journals (Sweden)

    Wenqi Du

    Full Text Available A novel rhodamine-based Hg2+ chemosensor P2 containing polyether was readily synthesized and investigated; it displayed high selectivity and sensitivity for Hg2+. Because of the good water-solubility of polyether, the rhodamine-based chemosensor containing polyether can be used in aqueous solution. The sensor responded rapidly to Hg2+ in pure water solutions with a 1:1 stoichiometry, and indicated excellent adaptability and responsiveness.

  19. Improving the interoperability of biomedical ontologies with compound alignments.

    Science.gov (United States)

    Oliveira, Daniela; Pesquita, Catia

    2018-01-09

    Ontologies are commonly used to annotate and help process life sciences data. Although their original goal is to facilitate integration and interoperability among heterogeneous data sources, when these sources are annotated with distinct ontologies, bridging this gap can be challenging. In the last decade, ontology matching systems have been evolving and are now capable of producing high-quality mappings for life sciences ontologies, usually limited to the equivalence between two ontologies. However, life sciences research is becoming increasingly transdisciplinary and integrative, fostering the need to develop matching strategies that are able to handle multiple ontologies and more complex relations between their concepts. We have developed ontology matching algorithms that are able to find compound mappings between multiple biomedical ontologies, in the form of ternary mappings, finding for instance that "aortic valve stenosis"(HP:0001650) is equivalent to the intersection between "aortic valve"(FMA:7236) and "constricted" (PATO:0001847). The algorithms take advantage of search space filtering based on partial mappings between ontology pairs, to be able to handle the increased computational demands. The evaluation of the algorithms has shown that they are able to produce meaningful results, with precision in the range of 60-92% for new mappings. The algorithms were also applied to the potential extension of logical definitions of the OBO and the matching of several plant-related ontologies. This work is a first step towards finding more complex relations between multiple ontologies. The evaluation shows that the results produced are significant and that the algorithms could satisfy specific integration needs.
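The ternary mappings described above can be represented with a minimal data structure: one source concept mapped to the intersection of two target concepts. The concept IDs come from the abstract's own example; the confidence score is an invented illustration:

```python
# Sketch: a compound (ternary) ontology mapping, where a concept in one
# ontology is equivalent to the intersection of concepts from two others.

from dataclasses import dataclass

@dataclass(frozen=True)
class TernaryMapping:
    source: str        # concept in the source ontology
    target_a: str      # first concept in the intersection
    target_b: str      # second concept in the intersection
    confidence: float  # matcher score in [0, 1] (illustrative value below)

    def __str__(self):
        return f"{self.source} = {self.target_a} AND {self.target_b}"

# "aortic valve stenosis" = "aortic valve" AND "constricted"
m = TernaryMapping("HP:0001650", "FMA:7236", "PATO:0001847", confidence=0.85)
print(m)
```

Filtering candidate triples through partial pairwise mappings, as the paper does, keeps the otherwise cubic search space over three ontologies tractable.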

  20. Market based solutions for increased flexibility in electricity consumption

    International Nuclear Information System (INIS)

    Grande, Ove S.; Saele, Hanne

    2005-06-01

    The main focus of this paper is on manual and automatic demand response to prices in the day-ahead market. The content is mainly based on the results and experiences from the large-scale Norwegian test and research project End User Flexibility by Efficient Use of ICT (2001-2004), involving 10,894 customers with automatic meter reading (AMR) and remote load control (RLC) options. The response to hourly spot price products and intraday time-of-use (ToU) tariffs was tested. The registered response ranges from 0.18 to 1 kWh/h on average per household customer for the different combinations of these price signals. The largest response was achieved for the customers with both the ToU network tariff and the hourly spot price. Some of the customers were offered remote-controlled automatic disconnection of water heaters in the high-price periods during weekdays. The test shows that the potential load reduction from water heaters can be estimated at 0.6 kWh/h in the peak hours on average. For Norway this indicates that a total of 600 MWh/h of automatic price elasticity could be achieved, provided that half of the 2 million Norwegian households accept RLC of their water heaters tied to the spot price. The benefit of load shifting is limited for each customer, but of great value for the power system as a whole. A combination of an hourly spot price contract with an intraday ToU network tariff should therefore be considered, in order to provide stable economic incentives for load reduction. One potential drawback for customers with spot price energy contracts is the risk of high electricity prices in periods of lasting scarcity. Combination with financial power contracts as insurance for the customer is an option that will be examined in a follow-up project
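The 600 MWh/h aggregate figure follows directly from the abstract's own numbers (half of 2 million households, 0.6 kWh/h each); a quick arithmetic check:

```python
# Sketch: reproducing the abstract's back-of-envelope estimate of
# nationwide automatic price elasticity from water-heater control.

households = 2_000_000          # Norwegian households (from the abstract)
participation = 0.5             # half accept remote load control
reduction_kwh_per_h = 0.6       # average peak-hour reduction per heater

total_mwh_per_h = households * participation * reduction_kwh_per_h / 1000
print(round(total_mwh_per_h))   # aggregate reduction in MWh/h
```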

  1. Non-Invasive Acoustic-Based Monitoring of Heavy Water and Uranium Process Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Pantea, Cristian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sinha, Dipen N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lakis, Rollin Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Beedle, Christopher Craig [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Eric Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-20

    This presentation includes slides on: Project Goals; Heavy Water Production Monitoring: A New Challenge for the IAEA; Noninvasive Measurements in the SFAI Cell; Large Scatter in Literature Values; Highest Precision Sound Speed Data Available: A New Standard in H/D (~400 data points); New Funding from NA241 SGTech; Uranium Solution Monitoring: Inspired by an IAEA Challenge in Kazakhstan; Non-Invasive Acoustic-Based Monitoring of Uranium in Solutions; and finally a summary.

  2. Promoting Savings at Tax Time through a Video-Based Solution-Focused Brief Coaching Intervention

    Directory of Open Access Journals (Sweden)

    Lance Palmer

    2016-09-01

    Full Text Available Solution-focused brief coaching, based on solution-focused brief therapy, is a well-established practice model and is used widely to help individuals progress toward desired outcomes in a variety of settings. This paper presents the findings of a pilot study that examined the impact of a video-based solution-focused brief coaching intervention delivered in conjunction with income tax preparation services at a Volunteer Income Tax Assistance location (n = 212). Individuals receiving tax preparation assistance were randomly assigned to one of four treatment groups: (1) control group; (2) video-based solution-focused brief coaching; (3) discount card incentive; (4) both the video-based solution-focused brief coaching and the discount card incentive. Results of the study indicate that the video-based solution-focused brief coaching intervention increased both the frequency and amount of self-reported savings at tax time. Results also indicate that financial-therapy-based interventions may be scalable through the use of technology.

  3. The Use of Alkaliphilic Bacteria-based Repair Solution for Porous Network Concrete Healing Mechanism

    NARCIS (Netherlands)

    Sangadji, S.; Wiktor, V.A.C.; Jonkers, H.M.; Schlangen, H.E.J.G.

    2017-01-01

    Bacteria-induced calcium carbonate precipitation based on metabolic conversion of nutrients has been acknowledged as having potential in self-healing cement-based materials. Recent studies have shown the development of a bacteria-based repair solution (liquid) for concrete surface repair. This

  4. General-base catalysed hydrolysis and nucleophilic substitution of activated amides in aqueous solutions

    NARCIS (Netherlands)

    Buurma, NJ; Blandamer, MJ; Engberts, JBFN; Buurma, Niklaas J.

    The reactivity of 1-benzoyl-3-phenyl-1,2,4-triazole (1a) was studied in the presence of a range of weak bases in aqueous solution. A change in mechanism is observed from general-base catalysed hydrolysis to nucleophilic substitution and general-base catalysed nucleophilic substitution. A slight

  5. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light-intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions; test solutions can then be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal component analysis are used to filter the eight characteristic values and reduce their dimensionality, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified, and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible: it enlarges the applicable scope of recognizing liquids based on the COT and improves the precision of quantitative concentration analysis.
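    The identification-then-quantification pipeline can be sketched as follows. This is an illustrative toy, not the authors' code: all feature values and calibration points are invented, nearest-centroid classification stands in for the paper's two-stage linear discriminant analysis, and piecewise-linear interpolation stands in for the cubic spline.

```python
import math

# Training library: mean 8-value COT feature vectors per solution type (synthetic).
library = {
    "methanol": [0.61, 0.22, 1.10, 0.45, 0.87, 0.33, 0.52, 0.19],
    "ethanol":  [0.55, 0.31, 1.25, 0.40, 0.92, 0.28, 0.48, 0.22],
    "saline":   [0.72, 0.18, 0.95, 0.60, 0.78, 0.41, 0.66, 0.15],
}

def discriminate(features):
    """Stage 1: assign the test solution to the nearest class centroid."""
    return min(library, key=lambda name: math.dist(features, library[name]))

# Stage 2: calibration curve, representative parameter -> concentration (%).
calibration = {"ethanol": [(0.10, 5.0), (0.20, 10.0), (0.35, 20.0), (0.50, 30.0)]}

def concentration(name, param):
    """Piecewise-linear interpolation on the class calibration curve."""
    pts = calibration[name]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= param <= x1:
            return y0 + (y1 - y0) * (param - x0) / (x1 - x0)
    raise ValueError("parameter outside calibrated range")

sample = [0.56, 0.30, 1.24, 0.41, 0.91, 0.29, 0.47, 0.21]
kind = discriminate(sample)            # closest centroid: "ethanol"
print(kind, concentration(kind, 0.275))
```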

  6. Nuclear Forensic Lab Interoperability and Criminal Investigation

    Science.gov (United States)

    2014-08-01

    …to the decay-corrected disintegrations per second (Bq) from the certified specific activity and the aliquot mass. Count rates for each peak were… balance, five-place; anti-static holder for plastic bottles; anti-static "gun" or equivalent; bottles for reagents (Teflon or HDPE); test tubes for GPEC wash solutions (plastic); vials or bottles for sample collection (Teflon or HDPE, or other non-leachable plastic); pipets, pipet tips…

  7. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    Science.gov (United States)

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connectivity, people can access resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Record (EHR) management systems are proposed. We have reviewed articles published in Medline between 2005 and 2011 about the implementation of Cloud-based e-Health services. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, one for a large hospital and one for a network of primary care health centers, have been studied. An economic estimation of the implementation cost for both scenarios has been made with the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: for a large hospital, a typical Cloud solution in which only the needed services are hired; for several primary care centers, a network interconnecting these centers with a single Cloud environment. Finally, a hybrid solution is considered, in which EHRs with images are hosted in the hospital or primary care centers and the rest are migrated to the Cloud.

  8. Rare earth-based low-index films for IR and multispectral thin film solutions

    Science.gov (United States)

    Stolze, Markus; Neff, Joe; Waibel, Friedrich

    2017-10-01

    Non-thoriated rare-earth-fluoride-based coating solutions, involving DyF3- and YbF3-based films as well as non-wetting fluorohydrocarbon cap layers on such films, have been deposited, analyzed and partly optimized. Intermediate results are discussed for DyF3-based films from ion-assisted e-gun deposition with O2 and N2, both alone and as a base for the non-wetting top layer, as well as for YbF3 starting material with or without admixtures of CaF2, for low-loss LWIR and multispectral solutions.

  9. Two innovative solutions based on fibre concrete blocks designed for building substructure

    Science.gov (United States)

    Pazderka, J.; Hájek, P.

    2017-09-01

    The use of fibres in high-strength concrete allows a reduction of the dimensions of small precast concrete elements, which opens up new solutions for traditional construction details in buildings. The paper presents two innovative technical solutions for the building substructure: a special shaped plinth block made of fibre concrete, and fibre concrete elements for a new ventilated-floor design. The main advantages of the fibre concrete plinth block (compared with standard plinth solutions) are easier and faster assembly, higher durability, a reduced moisture level in the structures under the waterproofing layer thanks to the air cavity between the vertical part of the block and the building substructure, and a comprehensive solution for the final surface of the building plinth as well as the surface of the adjacent terrain. The ventilated floor based on precast fibre concrete blocks is an attractive structural alternative for tackling the problem of increased moisture in the masonry of older buildings lacking a functional waterproofing layer in the substructure.

  10. Four-center bubbled BPS solutions with a Gibbons-Hawking base

    Science.gov (United States)

    Heidmann, Pierre

    2017-10-01

    We construct four-center bubbled BPS solutions with a Gibbons-Hawking base space. We give a systematic procedure to build scaling solutions: starting from three-supertube configurations and using generalized spectral flows and gauge transformations to extend to solutions with four Gibbons-Hawking centers. This allows us to construct very large families of smooth horizonless solutions that have the same charges and angular momentum as supersymmetric black holes with a macroscopically large horizon area. Our construction reveals that all scaling solutions with four Gibbons-Hawking centers have an angular momentum at around 99% of the cosmic censorship bound. We give both an analytical and a numerical explanation for this unexpected feature.

  11. A web-based rapid assessment tool for production publishing solutions

    Science.gov (United States)

    Sun, Tong

    2010-02-01

    Solution assessment is a critical first step in understanding and measuring the business-process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task that involves substantial domain knowledge: collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report of various estimated performance metrics (e.g. throughput, turnaround time, resource utilization) and operational cost. By integrating the digital publishing workflow ontology and an activity-based costing model with a Petri-net-based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side by side within their desired operational contexts, and provides organizations with a low-cost, rapid assessment before committing to any purchase. The tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
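    The simulation idea can be illustrated with a toy token-based workflow model; the places, transitions, processing times and costs below are invented, and this minimal firing loop stands in for the paper's Petri-net simulation engine.

```python
# Minimal token-based Petri-net sketch of a print-workflow simulation.
places = {"queued": 3, "ripped": 0, "printed": 0}   # tokens = jobs per stage

# transition: (name, input place, output place, minutes per job, cost per job)
transitions = [
    ("rip",   "queued", "ripped",  2.0, 0.50),
    ("print", "ripped", "printed", 5.0, 1.25),
]

def run(places, transitions):
    """Fire enabled transitions until no tokens can move; tally time and cost."""
    minutes = cost = 0.0
    progress = True
    while progress:
        progress = False
        for _, src, dst, t, c in transitions:
            if places[src] > 0:
                places[src] -= 1       # consume a token from the input place
                places[dst] += 1       # produce a token in the output place
                minutes += t
                cost += c
                progress = True
    return minutes, cost

minutes, cost = run(places, transitions)
print(f"{places['printed']} jobs, {minutes} min of activity, ${cost:.2f}")
```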

  12. Solid solution strengthening and diffusion in nickel- and cobalt-based superalloys

    Energy Technology Data Exchange (ETDEWEB)

    Rehman, Hamad ur

    2016-07-01

    Nickel- and cobalt-based superalloys with a γ-γ{sup '} microstructure are known for their excellent creep resistance at high temperatures. Their microstructure is engineered using different alloying elements that partition either to the fcc γ matrix or to the ordered γ{sup '} phase. In the present work, the segregation behaviour of alloying elements in nickel-based superalloys, diffusion in cobalt-based superalloys, and the temperature-dependent solid-solution strengthening in nickel-based alloys are investigated. The effect of dendritic segregation on the local mechanical properties of individual phases in the as-cast, heat-treated and creep-deformed states of a nickel-based superalloy is investigated. The local chemical composition is characterized using Electron Probe Micro Analysis and then correlated with the mechanical properties of individual phases using nanoindentation. Furthermore, the temperature-dependent solid-solution hardening contribution of Ta, W and Re to fcc nickel is studied. The room-temperature hardening is determined by a diffusion-couple approach, using nanoindentation and energy-dispersive X-ray analysis to relate hardness to chemical composition. The high-temperature properties are determined using compression strain-rate jump tests. The results show that at lower temperatures the solute size is prevalent, and the elements with the largest size difference with nickel induce the greatest hardening, consistent with classical solid-solution strengthening theory. At higher temperatures, the solutes interact with the dislocations such that the slowest-diffusing solute poses maximal resistance to dislocation glide and climb. Lastly, the diffusion of different technically relevant solutes in fcc cobalt is investigated using diffusion couples. The results show that large atoms diffuse faster in cobalt-based superalloys, similar to their nickel-based counterparts.

  13. Solid solution strengthening and diffusion in nickel- and cobalt-based superalloys

    International Nuclear Information System (INIS)

    Rehman, Hamad ur

    2016-01-01

    Nickel- and cobalt-based superalloys with a γ-γ ' microstructure are known for their excellent creep resistance at high temperatures. Their microstructure is engineered using different alloying elements that partition either to the fcc γ matrix or to the ordered γ ' phase. In the present work, the segregation behaviour of alloying elements in nickel-based superalloys, diffusion in cobalt-based superalloys, and the temperature-dependent solid-solution strengthening in nickel-based alloys are investigated. The effect of dendritic segregation on the local mechanical properties of individual phases in the as-cast, heat-treated and creep-deformed states of a nickel-based superalloy is investigated. The local chemical composition is characterized using Electron Probe Micro Analysis and then correlated with the mechanical properties of individual phases using nanoindentation. Furthermore, the temperature-dependent solid-solution hardening contribution of Ta, W and Re to fcc nickel is studied. The room-temperature hardening is determined by a diffusion-couple approach, using nanoindentation and energy-dispersive X-ray analysis to relate hardness to chemical composition. The high-temperature properties are determined using compression strain-rate jump tests. The results show that at lower temperatures the solute size is prevalent, and the elements with the largest size difference with nickel induce the greatest hardening, consistent with classical solid-solution strengthening theory. At higher temperatures, the solutes interact with the dislocations such that the slowest-diffusing solute poses maximal resistance to dislocation glide and climb. Lastly, the diffusion of different technically relevant solutes in fcc cobalt is investigated using diffusion couples. The results show that large atoms diffuse faster in cobalt-based superalloys, similar to their nickel-based counterparts.

  14. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    Science.gov (United States)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth Science as well as interdisciplinary research and applications activities require access to earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data: it becomes a shared platform that can be leveraged by GEOSS via a modular and extensible architecture; consensus and community-based standards (e.g. ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; and mechanisms for user involvement and collaboration and for supporting interdisciplinary and domain-specific applications.

  15. Is Toscana A Formal Concept Analysis Based Solution In Web Usage Mining?

    Directory of Open Access Journals (Sweden)

    Dan-Andrei SITAR-TĂUT

    2012-01-01

    Full Text Available Analyzing the large amounts of data coming from web logs is a complex but challenging modern problem, with implications in various fields, that leaves the way open for theoretically infinite approaches and implementations. The main goal of our paper is to assess the possibility of applying formal concept analysis as a viable solution for sustaining the web mining process, based on an open-source technological solution called TOSCANA.

  16. The Forward-Bias Puzzle: A Solution Based on Covered Interest Parity

    OpenAIRE

    Pippenger, John

    2009-01-01

    The forward-bias puzzle is probably the most important puzzle in international macroeconomics. After more than 20 years, there is no accepted solution. My solution is based on covered interest parity (CIP). CIP implies: (1) Forward rates are not rational expectations of future spot rates. Those expectations depend on future spot rates and interest rate differentials. (2) The forward bias is the result of a specification error, replacing future forward exchange rates with current forward ...
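    For reference, covered interest parity, on which the proposed solution rests, is the no-arbitrage condition linking the forward rate $F_t$ and the spot rate $S_t$ (domestic currency per unit of foreign currency) to the domestic and foreign interest rates $i_t$ and $i_t^{*}$:

```latex
F_t = S_t \,\frac{1 + i_t}{1 + i_t^{*}}
```

Under CIP the forward rate is pinned down by current rates alone, which is why it need not be a rational expectation of the future spot rate.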

  17. A perturbation method for dark solitons based on a complete set of the squared Jost solutions

    International Nuclear Information System (INIS)

    Ao Shengmei; Yan Jiaren

    2005-01-01

    A perturbation method for dark solitons is developed, based on the construction and rigorous proof of the complete set of squared Jost solutions. The general procedure for obtaining the adiabatic solution of the perturbed nonlinear Schrödinger equation, the time-evolution equations of the dark-soliton parameters, and a formula for calculating the first-order correction are given. The method can also overcome the difficulties resulting from the non-vanishing boundary condition.

  18. An Agent Based Modelling Approach for Multi-Stakeholder Analysis of City Logistics Solutions

    NARCIS (Netherlands)

    Anand, N.

    2015-01-01

    This thesis presents a comprehensive framework for multi-stakeholder analysis of city logistics solutions using agent based modeling. The framework describes different stages for the systematic development of an agent based model for the city logistics domain. The framework includes a

  19. Interoperability And Value Added To Earth Observation Data

    Science.gov (United States)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
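    As an illustration of the interoperability these services provide, a client such as mapshup retrieves imagery through a plain HTTP WMS GetMap request. A sketch of constructing one follows; the endpoint URL and layer name are hypothetical placeholders.

```python
from urllib.parse import urlencode

# Illustrative WMS 1.3.0 GetMap request (endpoint and layer are invented).
endpoint = "https://example.org/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "flood_extent",
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "43.0,1.0,44.0,2.0",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

url = endpoint + "?" + urlencode(params)
print(url)  # any standards-compliant WMS client or browser can fetch this map
```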

  20. Modelling and approaching pragmatic interoperability of distributed geoscience data

    Science.gov (United States)

    Ma, Xiaogang

    2010-05-01

    Interoperability of geodata, which is essential for sharing information and discovering insights within a cyberinfrastructure, is receiving increasing attention. A key requirement of interoperability in the context of geodata sharing is that data provided by local sources can be accessed, decoded, understood and appropriately used by external users. Various researchers have discussed that there are four levels in data interoperability issues: system, syntax, schematics and semantics, which respectively relate to the platform, encoding, structure and meaning of geodata. Ontology-driven approaches have been significantly studied addressing schematic and semantic interoperability issues of geodata in the last decade. There are different types, e.g. top-level ontologies, domain ontologies and application ontologies and display forms, e.g. glossaries, thesauri, conceptual schemas and logical theories. Many geodata providers are maintaining their identified local application ontologies in order to drive standardization in local databases. However, semantic heterogeneities often exist between these local ontologies, even though they are derived from equivalent disciplines. In contrast, common ontologies are being studied in different geoscience disciplines (e.g., NAMD, SWEET, etc.) as a standardization procedure to coordinate diverse local ontologies. Semantic mediation, e.g. mapping between local ontologies, or mapping local ontologies to common ontologies, has been studied as an effective way of achieving semantic interoperability between local ontologies thus reconciling semantic heterogeneities in multi-source geodata. Nevertheless, confusion still exists in the research field of semantic interoperability. One problem is caused by eliminating elements of local pragmatic contexts in semantic mediation. Comparing to the context-independent feature of a common domain ontology, local application ontologies are closely related to elements (e.g., people, time, location

  1. An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2010-10-01

    The communication between health information systems of hospitals and primary care organizations is currently an important challenge for improving the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information about patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual-model approach, which distinguishes information from knowledge, the latter being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.

  2. Architecture of a Process Broker for Interoperable Geospatial Modeling on the Web

    Directory of Open Access Journals (Sweden)

    Lorenzo Bigagli

    2015-04-01

    Full Text Available The identification of appropriate mechanisms for process sharing and reuse by means of composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. Modelers in need of running complex workflows may benefit from outsourcing process composition to a dedicated external service, according to the brokering approach. This work introduces our architecture of a process broker, as a distributed information system for creating, validating, editing, storing, publishing and executing geospatial-modeling workflows. The broker provides a service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of interoperable, executable workflows. The described solution has been experimentally applied in several use scenarios in the context of EU-funded projects and the Global Earth Observation System of Systems.

  3. Service interoperability through advanced media gateways

    CERN Document Server

    van der Meer, S

    2000-01-01

    The convergence of telecommunications systems and the Internet gives rise to a variety of concepts for service integration. The focus of recent research studies and the work of several standardization bodies lies mostly on the interworking of services and universal service access from end-user systems, including both fixed and wireless terminals. All approaches are driven by the concept of providing several technologies to users while keeping the peculiarity of each service alive. However, developments should not only concentrate on media adaptation between VoIP and PSTN, but also consider adaptation among completely different types of applications such as E-mail, facsimile, or voice. Unified messaging, an already accepted service on the market, provides solutions for converting different application protocols into each other. The functionality of converting one medium into another is implemented here in so-called media gateways. This paper provides an overview of the current developments in...

  4. Developing consensus-based policy solutions for medicines adherence for Europe: a delphi study

    Science.gov (United States)

    2012-01-01

    Background Non-adherence to prescribed medication is a pervasive problem that can incur serious effects on patients’ health outcomes and well-being, and the availability of resources in healthcare systems. This study aimed to develop practical consensus-based policy solutions to address medicines non-adherence for Europe. Methods A four-round Delphi study was conducted. The Delphi Expert Panel comprised 50 participants from 14 countries and was representative of: patient/carers organisations; healthcare providers and professionals; commissioners and policy makers; academics; and industry representatives. Participants engaged in the study remotely, anonymously and electronically. Participants were invited to respond to open questions about the causes, consequences and solutions to medicines non-adherence. Subsequent rounds refined responses, and sought ratings of the relative importance, and operational and political feasibility of each potential solution to medicines non-adherence. Feedback of individual and group responses was provided to participants after each round. Members of the Delphi Expert Panel and members of the research group participated in a consensus meeting upon completion of the Delphi study to discuss and further refine the proposed policy solutions. Results 43 separate policy solutions to medication non-adherence were agreed by the Panel. 25 policy solutions were prioritised based on composite scores for importance, and operational and political feasibility. Prioritised policy solutions focused on interventions for patients, training for healthcare professionals, and actions to support partnership between patients and healthcare professionals. Few solutions concerned actions by governments, healthcare commissioners, or interventions at the system level. Conclusions Consensus about practical actions necessary to address non-adherence to medicines has been developed for Europe. These actions are also applicable to other regions. Prioritised

  5. Building Future Transatlantic Interoperability Around a Robust NATO Response Force

    Science.gov (United States)

    2012-10-01

    …than already traveled. However, this accrued wealth of interoperable capability may be at its apogee, soon to decline as the result of two looming… and Bydgoszcz, Poland, as well as major national training centers such as the bilateral U.S.-Romanian Joint Task Force–East at Kogalniceanu… operations. Increase U.S. and Allied exchange students at national and NATO military schools. Austerity measures may eventually affect the investment…

  6. Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards

    Science.gov (United States)

    Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh; hide

    2014-01-01

    The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a certain suite of related industry standards, assessing both the capabilities of individual products and their interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.

  7. Interoperability between Fingerprint Biometric Systems: An Empirical Study

    OpenAIRE

    Gashi, I.; Mason, S.; Lugini, L.; Marasco, E.; Cukic, B.

    2014-01-01

    Fingerprints are likely the most widely used biometric in commercial as well as law enforcement applications. With the expected rapid growth of fingerprint authentication in mobile devices, their importance justifies increased demands for dependability. An increasing number of new sensors, applications and a diverse user population also intensify concerns about interoperability in fingerprint authentication. In most applications, fingerprints captured for user enrollment with one device may...

  8. Product-driven Enterprise Interoperability for Manufacturing Systems Integration

    OpenAIRE

    Dassisti , Michele; Panetto , Hervé; Tursi , Angela

    2006-01-01

    International audience; The "Babel tower effect" induced by the heterogeneity of applications available in the operation of enterprises leads to a considerable lack of "exchangeability" and a risk of semantic loss whenever cooperation has to take place within the same enterprise. Generally speaking, this kind of problem falls under the umbrella of interoperability between local reference information models. This position paper discusses some ideas in this field and traces a research roadmap to ma...

  9. Enabling IoT ecosystems through platform interoperability

    OpenAIRE

    Bröring, Arne; Schmid, Stefan; Schindhelm, Corina-Kim; Khelil, Abdelmajid; Kabisch, Sebastian; Kramer, Denis; Le Phuoc, Danh; Mitic, Jelena; Anicic, Darko; Teniente López, Ernest

    2017-01-01

    Today, the Internet of Things (IoT) comprises vertically oriented platforms for things. Developers who want to use them need to negotiate access individually and adapt to the platform-specific API and information models. Having to perform these actions for each platform often outweighs the possible gains from adapting applications to multiple platforms. This fragmentation of the IoT and the missing interoperability result in high entry barriers for developers and prevent the emergence of broa...

  10. The Internet of Things: New Interoperability, Management and Security Challenges

    OpenAIRE

    Elkhodr, Mahmoud; Shahrestani, Seyed; Cheung, Hon

    2016-01-01

    The Internet of Things (IoT) brings connectivity to almost every object found in the physical space. It extends connectivity to everyday objects. From connected fridges, cars and cities, the IoT creates opportunities in numerous domains. However, this increase in connectivity also creates many prominent challenges. This paper provides a survey of some of the major issues challenging the widespread adoption of the IoT. In particular, it focuses on the interoperability, management, securi...

  11. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    Science.gov (United States)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP:
    - Developing a vocabulary matching service that links terms from different vocabularies with similar concepts. The service implements the Google Refine reconciliation service interface, so that users can use the Google Refine application as a friendly user interface while linking different vocabulary terms.
    - Developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies. Each SAMOS vocabulary term (data parameter and quality control flag) will be described as an RDF resource page. These RDF resources allow for enhanced discoverability and retrieval of SAMOS data by enabling data searches based on parameter.
    - Improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies. We have collaborated with ODIP participating organizations in order to build a generalized data model that will be used to populate a SPARQL endpoint to provide expressive querying over our data files.
    - Mapping local and regional vocabularies used by R2R to those used by ODIP partners. This work is described more fully in a companion poster.
    - Making published Linked Data...
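    A minimal sketch of the kind of vocabulary matching the record describes, assuming a simple string-similarity reconciliation that emits skos:closeMatch links; the term lists, URIs, and threshold below are invented for illustration and are not the actual SAMOS or NERC identifiers, nor the real service's algorithm:

```python
from difflib import SequenceMatcher

def best_match(term, candidates, threshold=0.75):
    """Return the candidate label most similar to `term`, or None."""
    scored = [(SequenceMatcher(None, term.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, label = max(scored)
    return label if score >= threshold else None

# Illustrative local terms and a target vocabulary (all URIs are made up).
local_terms = {"air_temperature": "http://example.org/samos/air_temperature"}
target_vocab = {"Air temperature": "http://example.org/vocab/AIRTEMP01"}

triples = []
for term, uri in local_terms.items():
    match = best_match(term.replace("_", " "), list(target_vocab))
    if match:
        # Emit an N-Triples line linking the two vocabulary terms.
        triples.append(
            f"<{uri}> <http://www.w3.org/2004/02/skos/core#closeMatch> "
            f"<{target_vocab[match]}> .")

print("\n".join(triples))
```

    The actual student project wrapped scoring like this behind the Google Refine reconciliation API so that Refine could be used as the matching front end.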

  12. The caCORE Software Development Kit: Streamlining construction of interoperable biomedical information services

    Directory of Open Access Journals (Sweden)

    Warzel Denise

    2006-01-01

    Full Text Available Abstract Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs. The National Cancer Institute (NCI developed the cancer common ontologic representation environment (caCORE to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG program, to create compatible data services. 
caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has...

  13. Interoperable and accessible census and survey data from IPUMS.

    Science.gov (United States)

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  14. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. A feature with good discriminating ability on images from a certain sensor may not adapt well to segmenting images from other sensors, which degrades segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation features, i.e., a feature's ability to adapt to the raw fingerprints captured by different sensors. To address this issue, the paper presents a two-level feature evaluation method, comprising a first-level evaluation based on segmentation error rate and a second-level evaluation based on a decision tree. The proposed method is performed on a number of fingerprint databases obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and that features with good evaluation results achieve better segmentation accuracy on images originating from different sensors.
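    The first evaluation level described above ranks a feature by the lowest block-classification error any single threshold on it can achieve. A toy sketch of that idea (the feature values and labels below are synthetic; the paper's actual procedure is more involved):

```python
def seg_error_rate(values, labels):
    """Smallest fraction of blocks misclassified by any threshold on one feature.

    labels: 1 = foreground (fingerprint area), 0 = background.
    Tries every observed value as a cut point, in both polarities.
    """
    n = len(values)
    best = 1.0
    for t in set(values):
        for pol in (True, False):
            pred = [(v >= t) == pol for v in values]
            err = sum(int(p) != l for p, l in zip(pred, labels)) / n
            best = min(best, err)
    return best

# Synthetic data: a feature (e.g. block variance) that separates well.
values = [0.9, 0.8, 0.85, 0.1, 0.2, 0.15]
labels = [1, 1, 1, 0, 0, 0]
print(seg_error_rate(values, labels))  # perfectly separable: 0.0
```

    A lower error rate means the feature discriminates foreground from background better on that sensor's images.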

  15. The role of preservation solution on acid-base regulation during machine perfusion of kidneys.

    Science.gov (United States)

    Baicu, Simona C; Taylor, Michael J; Brockbank, Kelvin G M

    2006-01-01

    To meet the current clinical organ demand, efficient preservation methods and solutions are needed to increase the number of viable kidneys for transplantation. In the present study, the influence of perfusion solution buffering strength on renal pH dynamics and regulation mechanisms during kidney ex vivo preservation was determined. Porcine kidneys were hypothermically machine perfused for 72 h with either Unisol-UHK or Belzer Machine Perfusion (Belzer-MP) solution. Renal perfusate samples were periodically collected and biochemically analyzed. The UHK solution, a Hepes-based solution (35 mM), provided more efficient control of renal pH that, in turn, resulted in minor changes in the perfusate pH relative to baseline in response to tissue CO2 and HCO3- production. In the perfusate of the Belzer-MP kidney group, a wider range of pH values was recorded, and a pronounced pH reduction was seen in response to significant rises in pCO2 and HCO3- concentrations. The Belzer-MP solution, containing phosphate (25 mM) as its main buffer and only 10 mM Hepes, had a greater buffering requirement to attenuate larger pH changes.
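    The pH dynamics discussed above are governed, for the CO2/HCO3- pair, by the Henderson-Hasselbalch relation. A small sketch using standard textbook constants for 37 °C (these are not values from the paper; hypothermic perfusion shifts both the pKa and the CO2 solubility):

```python
import math

def bicarbonate_ph(hco3_mmol_l, pco2_mmhg, pka=6.1, solubility=0.03):
    """Henderson-Hasselbalch pH for the CO2/HCO3- buffer pair.

    0.03 mmol/L/mmHg is the CO2 solubility coefficient at 37 C;
    cold machine perfusion would shift both pKa and solubility.
    """
    return pka + math.log10(hco3_mmol_l / (solubility * pco2_mmhg))

# Normal arterial values: 24 mmol/L HCO3-, 40 mmHg pCO2 -> pH about 7.4.
print(round(bicarbonate_ph(24, 40), 2))
```

    Rising pCO2 at fixed bicarbonate lowers the computed pH, which is the acidification pattern the Belzer-MP perfusate showed.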

  16. Finite element limit analysis based plastic limit pressure solutions for cracked pipes

    International Nuclear Information System (INIS)

    Shim, Do Jun; Huh, Nam Su; Kim, Yun Jae; Kim, Young Jin

    2002-01-01

    Based on detailed FE limit analyses, the present paper provides tractable approximations for plastic limit pressure solutions for axial through-wall cracked pipe; axial (inner) surface cracked pipe; circumferential through-wall cracked pipe; and circumferential (inner) surface cracked pipe. Comparisons with existing analytical and empirical solutions show a large discrepancy in circumferential short through-wall cracks and in surface cracks (both axial and circumferential). Being based on detailed 3-D FE limit analysis, the present solutions are believed to be the most accurate, and thus to be valuable information not only for plastic collapse analysis of pressurised piping but also for estimating non-linear fracture mechanics parameters based on the reference stress approach
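    For orientation, cracked-pipe limit pressure solutions like these are usually expressed relative to a defect-free reference pressure. A sketch of the classical von Mises limit pressure of an uncracked thick-walled cylinder (the crack-specific reduction factors are what the paper actually provides and are not reproduced here):

```python
import math

def limit_pressure_uncracked(sigma_y, r_in, r_out):
    """Von Mises plastic limit pressure of a defect-free thick-walled cylinder.

    Cracked-geometry solutions are typically written as a
    geometry-dependent knock-down on a reference pressure of this kind.
    """
    return (2.0 / math.sqrt(3.0)) * sigma_y * math.log(r_out / r_in)

# Example: sigma_y = 300 MPa, Ri = 100 mm, Ro = 110 mm -> about 33 MPa.
print(round(limit_pressure_uncracked(300.0, 100.0, 110.0), 1))
```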

  17. 23 CFR 950.7 - Interoperability requirements.

    Science.gov (United States)

    2010-04-01

    ... the Value Pricing Pilot Program, this part only applies if tolls are imposed on a facility after the... in that area. (b) Based on the identification conducted under subsection (a), the toll agency shall...

  18. Synthesis method based on solution regions for planar four bar straight line linkages

    International Nuclear Information System (INIS)

    Lai Rong, Yin; Cong, Mao; Jian you, Han; Tong, Yang; Juan, Huang

    2012-01-01

    An analytical method for synthesizing and selecting desired four-bar straight line mechanisms based on solution regions is presented. Given two fixed pivots, the point position and direction of the target straight line, an infinite number of mechanism solutions can be produced by employing this method, both in the general case and all three special cases. Unifying the straight line direction and the displacement from the given point to the instant center into the same form with different angles as parameters, infinite mechanism solutions can be expressed with different solution region charts. The mechanism property graphs have been computed to enable the designers to find out the involved mechanism information more intuitively and avoid aimlessness in selecting optimal mechanisms

  19. Extending the GI Brokering Suite to Support New Interoperability Specifications

    Science.gov (United States)

    Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.

    2014-12-01

    The GI brokering suite provides the discovery, access, and semantic Brokers (i.e. GI-cat, GI-axe, GI-sem) that empower a Brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite Broker facilitates interoperability for a particular functionality (i.e. discovery, access, semantic extension) among a set of brokered resources published by autonomous providers (e.g. data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g. client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardizing organizations like OGC and ISO (e.g. WxS, CSW, SWE, GML, netCDF) and by Community specifications (e.g. THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular Community or organization through their specific technology (e.g. OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different Communities utilizing their preferred tools (e.g. a GIS visualizing WMS layers). Since Information Technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. Therefore, the GI Brokering suite was conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include:
    - DCAT: an RDF vocabulary designed to facilitate interoperability between Web data catalogs.
    - CKAN: a data management system for data distribution, particularly used by...

  20. A potential model for sodium chloride solutions based on the TIP4P/2005 water model

    Science.gov (United States)

    Benavides, A. L.; Portillo, M. A.; Chamorro, V. C.; Espinosa, J. R.; Abascal, J. L. F.; Vega, C.

    2017-09-01

    Despite considerable efforts over more than two decades, our knowledge of the interactions in electrolyte solutions is not yet satisfactory. Not even one of the most simple and important aqueous solutions, NaCl(aq), escapes this assertion. A requisite for the development of a force field for any water solution is the availability of a good model for water. Despite the fact that TIP4P/2005 seems to fulfill the requirement, little work has been devoted to build a force field based on TIP4P/2005. In this work, we try to fill this gap for NaCl(aq). After unsuccessful attempts to produce accurate predictions for a wide range of properties using unity ionic charges, we decided to follow recent suggestions indicating that the charges should be scaled in the ionic solution. In this way, we have been able to develop a satisfactory non-polarizable force field for NaCl(aq). We evaluate a number of thermodynamic properties of the solution (equation of state, maximum in density, enthalpies of solution, activity coefficients, radial distribution functions, solubility, surface tension, diffusion coefficients, and viscosity). Overall the results for the solution are very good. An important achievement of our model is that it also accounts for the dynamical properties of the solution, a test for which the force fields so far proposed failed. The same is true for the solubility and for the maximum in density where the model describes the experimental results almost quantitatively. The price to pay is that the model is not so good at describing NaCl in the solid phase, although the results for several properties (density and melting temperature) are still acceptable. We conclude that the scaling of the charges improves the overall description of NaCl aqueous solutions when the polarization is not included.
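    The interaction form behind such a non-polarizable force field is Lennard-Jones plus Coulomb with scaled ionic charges. A sketch of one ion-pair energy evaluation; the 0.85 scaling factor and the LJ parameters are illustrative placeholders, not the fitted values of the paper:

```python
import math

COULOMB_K = 138.935  # kJ mol^-1 nm e^-2: 1/(4 pi eps0) in common MD units

def pair_energy(r_nm, q1, q2, sigma_nm, epsilon_kj):
    """Lennard-Jones + Coulomb energy of one ion pair at separation r."""
    sr6 = (sigma_nm / r_nm) ** 6
    lj = 4.0 * epsilon_kj * (sr6 ** 2 - sr6)
    coulomb = COULOMB_K * q1 * q2 / r_nm
    return lj + coulomb

# Scaling the charges below unity (0.85 here, purely illustrative)
# weakens the Na+ Cl- attraction relative to full +1/-1 charges.
full = pair_energy(0.28, +1.0, -1.0, 0.23, 0.4)
scaled = pair_energy(0.28, +0.85, -0.85, 0.23, 0.4)
print(scaled > full)  # scaled-charge pair is less strongly bound
```

    This weakened effective attraction is what lets scaled-charge models reproduce solution properties such as diffusion and solubility more faithfully, at the cost of accuracy for the solid phase.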

  1. The solution of target assignment problem in command and control decision-making behaviour simulation

    Science.gov (United States)

    Li, Ni; Huai, Wenqing; Wang, Shaodan

    2017-08-01

    C2 (command and control) has been understood to be a critical military component to meet an increasing demand for rapid information gathering and real-time decision-making in a dynamically changing battlefield environment. In this article, to improve a C2 behaviour model's reusability and interoperability, a behaviour modelling framework was proposed to specify a C2 model's internal modules and a set of interoperability interfaces based on the C-BML (coalition battle management language). WTA (weapon target assignment) is a typical C2 autonomous decision-making behaviour modelling problem. Different from most WTA problem descriptions, here sensors were considered to be available resources of detection and the relationship constraints between weapons and sensors were also taken into account, which brought it much closer to actual application. A modified differential evolution (MDE) algorithm was developed to solve this high-dimension optimisation problem and obtained an optimal assignment plan with high efficiency. In case study, we built a simulation system to validate the proposed C2 modelling framework and interoperability interface specification. Also, a new optimisation solution was used to solve the WTA problem efficiently and successfully.
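    The optimiser at the heart of the approach is differential evolution. A minimal DE/rand/1/bin sketch on a toy continuous objective (the paper's modified DE adds WTA-specific encoding and constraint handling that is not shown here):

```python
import random

def differential_evolution(cost, dim, bounds, pop_size=20, F=0.7, CR=0.9,
                           generations=200, seed=1):
    """Plain DE/rand/1/bin; the paper's MDE layers WTA-specific repairs on top."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: three distinct vectors other than the target.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = list(pop[i])
            j_rand = rng.randrange(dim)
            for j in range(dim):
                # Binomial crossover with at least one mutated component.
                if rng.random() < CR or j == j_rand:
                    trial[j] = min(hi, max(lo, a[j] + F * (b[j] - c[j])))
            # Greedy selection.
            if cost(trial) <= cost(pop[i]):
                pop[i] = trial
    return min(pop, key=cost)

# Toy objective: sphere function, optimum at the origin.
best = differential_evolution(lambda x: sum(v * v for v in x), dim=4,
                              bounds=(-5.0, 5.0))
print(sum(v * v for v in best) < 1e-3)
```

    For an assignment problem such as WTA, the continuous trial vectors would additionally be decoded into discrete weapon-target (and sensor) pairings and repaired against the resource constraints.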

  2. CityGML - Interoperable semantic 3D city models

    Science.gov (United States)

    Gröger, Gerhard; Plümer, Lutz

    2012-07-01

    CityGML is the international standard of the Open Geospatial Consortium (OGC) for the representation and exchange of 3D city models. It defines the three-dimensional geometry, topology, semantics and appearance of the most relevant topographic objects in urban or regional contexts. These definitions are provided in different, well-defined Levels-of-Detail (multiresolution model). The focus of CityGML is on the semantical aspects of 3D city models, its structures, taxonomies and aggregations, allowing users to employ virtual 3D city models for advanced analysis and visualization tasks in a variety of application domains such as urban planning, indoor/outdoor pedestrian navigation, environmental simulations, cultural heritage, or facility management. This is in contrast to purely geometrical/graphical models such as KML, VRML, or X3D, which do not provide sufficient semantics. CityGML is based on the Geography Markup Language (GML), which provides a standardized geometry model. Due to this model and its well-defined semantics and structures, CityGML facilitates interoperable data exchange in the context of geo web services and spatial data infrastructures. Since its standardization in 2008, CityGML has become used on a worldwide scale: tools from notable companies in the geospatial field provide CityGML interfaces. Many applications and projects use this standard. CityGML is also having a strong impact on science: numerous approaches use CityGML, particularly its semantics, for disaster management, emergency responses, or energy-related applications as well as for visualizations, or they contribute to CityGML, improving its consistency and validity, or use CityGML, particularly its different Levels-of-Detail, as a source or target for generalizations. This paper gives an overview of CityGML, its underlying concepts, its Levels-of-Detail, how to extend it, its applications, its likely future development, and the role it plays in scientific research. 
Furthermore, its...
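    A schematic fragment illustrating the flavor of a CityGML building in LOD1 (element and namespace names follow CityGML 2.0, but the fragment is abbreviated and not schema-complete; the geometry shell is elided):

```xml
<core:CityModel xmlns:core="http://www.opengis.net/citygml/2.0"
                xmlns:bldg="http://www.opengis.net/citygml/building/2.0"
                xmlns:gml="http://www.opengis.net/gml">
  <core:cityObjectMember>
    <bldg:Building gml:id="B_0815">
      <bldg:function>residential</bldg:function>
      <bldg:measuredHeight uom="m">9.5</bldg:measuredHeight>
      <!-- LOD1: the building as a simple extruded solid -->
      <bldg:lod1Solid>
        <gml:Solid><!-- gml:exterior shell omitted for brevity --></gml:Solid>
      </bldg:lod1Solid>
    </bldg:Building>
  </core:cityObjectMember>
</core:CityModel>
```

    The semantic elements (function, measured height) alongside GML geometry are what distinguish CityGML from purely graphical formats such as KML or X3D.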

  3. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply.
This makes it easy to join, extend, and combine datasets and hence work collectively, but...

  4. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises the addition of chemical structures as well as the selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply.
This makes it easy to join...

  5. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business...

  6. Semantic interoperability for collaborative spatial design

    NARCIS (Netherlands)

    Hofman, W.

    2009-01-01

    Mobile devices offer integrated functionality to browse, phone, play music, and watch video. Moreover, these devices have sufficient memory and processing power to run (small) applications based on, for instance, Google Android and the iPhone/iPod OS. As such, they support for instance Google Earth to...

  7. Category Theory Approach to Solution Searching Based on Photoexcitation Transfer Dynamics

    Directory of Open Access Journals (Sweden)

    Makoto Naruse

    2017-07-01

    Full Text Available Solution searching that accompanies combinatorial explosion is one of the most important issues in the age of artificial intelligence. Natural intelligence, which exploits natural processes for intelligent functions, is expected to help resolve or alleviate the difficulties of conventional computing paradigms and technologies. In fact, we have shown that a single-celled organism such as an amoeba can solve constraint satisfaction problems and related optimization problems, and have demonstrated experimental systems based on non-organic systems such as optical energy transfer involving near-field interactions. However, the fundamental mechanisms and limitations behind solution searching based on natural processes have not yet been understood. Herein, we present a theoretical background of solution searching based on optical excitation transfer from a category-theoretic standpoint. One important indication inspired by category theory is that the satisfaction of short exact sequences is critical for an adequate computational operation that determines the flow of time for the system, termed “short-exact-sequence-based time.” In addition, the octahedral and braid structures known in triangulated categories provide a clear understanding of the underlying mechanisms, including a quantitative indication of the difficulty of obtaining solutions based on homology dimension. This study contributes to providing a fundamental background of natural intelligence.
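    For reference, the short exact sequence condition invoked above is the standard homological one (a textbook definition, not specific to this paper): a sequence

```latex
0 \longrightarrow A \xrightarrow{\;f\;} B \xrightarrow{\;g\;} C \longrightarrow 0
```

    is short exact when f is injective, g is surjective, and im f = ker g, i.e. the sequence is exact at A, B, and C.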

  8. Ocean Data Interoperability Platform (ODIP): developing a common framework for marine data management on a global scale

    Science.gov (United States)

    Schaap, Dick M. A.; Glaves, Helen

    2016-04-01

    Europe, the USA, and Australia are making significant progress in facilitating the discovery, access and long-term stewardship of ocean and marine data through the development, implementation, population and operation of national, regional or international distributed ocean and marine observing and data management infrastructures such as SeaDataNet, EMODnet, IOOS, R2R, and IMOS. All of these efforts are resulting in standards and services implemented and used by their regional communities. The Ocean Data Interoperability Platform (ODIP) project is supported by the EU FP7 Research Infrastructures programme, the National Science Foundation (USA) and the Australian government, and was initiated on 1 October 2012. Recently the project has been continued as ODIP II for another 3 years with EU HORIZON 2020 funding. ODIP includes all the major organisations engaged in ocean data management in the EU, US, and Australia. ODIP is also supported by the IOC-IODE, closely linking this activity with its Ocean Data Portal (ODP) and Ocean Data Standards Best Practices (ODSBP) projects. The ODIP platform aims to ease interoperability between the regional marine data management infrastructures. It therefore facilitates an organised dialogue between the key infrastructure representatives by publishing best practice, organising a series of international workshops and fostering the development of common standards and interoperability solutions, which are evaluated and tested by means of prototype projects. The presentation will give further background on the ODIP projects and the latest information on the progress of three prototype projects addressing: 1. establishing interoperability between the regional EU, USA and Australia data discovery and access services (SeaDataNet CDI, US NODC, and IMOS MCP) and contributing to the global GEOSS and IODE-ODP portals; 2. establishing interoperability between cruise summary reporting systems in Europe, the USA and...

  9. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  10. Multi-Agent Decision Support Tool to Enable Interoperability among Heterogeneous Energy Systems

    Directory of Open Access Journals (Sweden)

    Brígida Teixeira

    2018-02-01

    Full Text Available Worldwide electricity markets are undergoing a major restructuring process. One of the main reasons for the ongoing changes is to enable the adaptation of current market models to the new paradigm that arises from the large-scale integration of distributed generation sources. In order to deal with the unpredictability caused by the intermittent nature of distributed generation and the large number of variables that contribute to the energy sector balance, it is extremely important to use simulation systems that are capable of dealing with the required complexity. This paper presents the Tools Control Center (TOOCC), a framework that allows interoperability between heterogeneous energy and power simulation systems through the use of ontologies, allowing the simulation of scenarios with a high degree of complexity through the cooperation of the individual capacities of each system. A case study based on real data is presented in order to demonstrate the interoperability capabilities of TOOCC. The simulation considers the energy management of a microgrid of a real university campus, from the perspective of the network manager and also of its consumers/producers, in a projection for a typical winter day in 2050.

  11. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    Directory of Open Access Journals (Sweden)

    Mustafa Yuksel

    2016-01-01

    Full Text Available Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework in which EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods that lack this background information.

  12. System and methods of resource usage using an interoperable management framework

    Science.gov (United States)

    Heileman, Gregory L.; Jamkhedkar, Pramod A.; Lamb, Christopher C.

    2017-10-31

    A generic rights expression language allows interoperability across different computing environments, including resource usage by different applications. A formal framework for usage management provides scaffolding upon which interoperable usage management systems can be built. Certain features of the framework, such as the operational semantics, are standardized, while other areas are deliberately left free of standards, permitting the choice and innovation needed to balance flexibility and usability in usage management systems.
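A minimal sketch of the standardized operational semantics might look like the following: a fixed evaluation procedure over rules whose vocabulary (actions, conditions) is left open. The license structure below is hypothetical, not the patent's actual rights expression language:

```python
# Illustrative usage-management evaluator: the evaluation semantics are
# fixed, while the rule vocabulary (actions, condition keys) stays open
# for innovation. All names here are assumptions for illustration.

def permits(license_rules: list, action: str, context: dict) -> bool:
    """A request is allowed iff some rule grants the action and all of
    that rule's conditions hold in the request context."""
    for rule in license_rules:
        if rule["action"] != action:
            continue
        if all(context.get(k) == v for k, v in rule["conditions"].items()):
            return True
    return False

rules = [
    {"action": "play", "conditions": {"region": "US"}},
    {"action": "copy", "conditions": {"region": "US", "device_trusted": True}},
]

print(permits(rules, "play", {"region": "US"}))  # True
print(permits(rules, "copy", {"region": "US"}))  # False: device not trusted
```

Because every conforming system evaluates rules the same way, a license authored in one environment yields the same decisions in another, which is the interoperability property the framework targets.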

  13. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    Science.gov (United States)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies, which involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary access to data centers managing ecological resources and hazards, drought, heat islands, and the carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled with regional and local modes of variability. For example, the El Niño-Southern Oscillation, a large-scale mode of variability in the coupled terrestrial-aquatic-atmospheric system, is known to correlate with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS

  14. Modification of Rat Lung Decellularization Protocol Based on Dynamic Conductometry of Working Solution.

    Science.gov (United States)

    Kuevda, E V; Gubareva, E A; Gumenyuk, I S; Sotnichenko, A S; Gilevich, I V; Nakokhov, R Z; Rusinova, T V; Yudina, T G; Red'ko, A N; Alekseenko, S N

    2017-03-01

    We modified the protocol for obtaining biological scaffolds of rat lungs based on dynamic recording of the specific resistivity of the working detergent solution (conductometry) during perfusion decellularization. Terminating sodium deoxycholate exposure after the ionic equilibrium plateau was attained did not impair the quality of decellularization and preserved structural matrix components, as confirmed by morphological analysis and quantitative assay of residual DNA.
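The stopping criterion described above, ending detergent exposure once the resistivity curve flattens into an ionic-equilibrium plateau, can be sketched as a simple streaming check. The window size and relative tolerance are illustrative assumptions, not values from the published protocol:

```python
# Hedged sketch of detecting the ionic-equilibrium plateau in a stream of
# resistivity readings, as an endpoint for perfusion decellularization.
# Window size and tolerance are illustrative, not protocol values.

def reached_plateau(readings: list, window: int = 5, rel_tol: float = 0.01) -> bool:
    """True when the last `window` readings vary by less than rel_tol
    relative to their mean, i.e. the curve has flattened."""
    if len(readings) < window:
        return False
    tail = readings[-window:]
    mean = sum(tail) / window
    return max(tail) - min(tail) <= rel_tol * mean

# Resistivity changes as ions leach from the tissue, then levels off.
trace = [120.0, 95.0, 80.0, 72.0, 70.5, 70.2, 70.1, 70.1, 70.0]
print(reached_plateau(trace))  # True: last five readings within 1% of mean
```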

  15. Effectiveness of a Solution-Based Counseling on Students' Self-Perception

    Science.gov (United States)

    Joker, Habib; Ghaderi, Zahra

    2015-01-01

    The purpose of this study was to evaluate the effectiveness of solution-based counseling in increasing students' self-conception. The research method was semi-experimental, with a pretest-posttest control-group design. The study sample consisted of all high school students in Dashtestan city, Bushkan district, from which 30 subjects were…

  16. Model-based fuzzy control solutions for a laboratory Antilock Braking System

    DEFF Research Database (Denmark)

    Precup, Radu-Emil; Spataru, Sergiu; Rǎdac, Mircea-Bogdan

    2010-01-01

    This paper gives two original model-based fuzzy control solutions dedicated to the longitudinal slip control of Antilock Braking System laboratory equipment. The parallel distributed compensation leads to linear matrix inequalities which guarantee the global stability of the fuzzy control systems...
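The global stability guarantee mentioned above is the standard parallel distributed compensation result for Takagi-Sugeno fuzzy models, stated here in its textbook form for orientation rather than quoted from the paper:

```latex
% Textbook LMI stability conditions for a Takagi-Sugeno fuzzy system with
% r rules under parallel distributed compensation: find P = P^T > 0 and
% feedback gains F_j such that, with closed-loop matrices G_{ij} = A_i - B_i F_j,
\begin{align}
  G_{ii}^{T} P + P\, G_{ii} &< 0, \qquad i = 1,\dots,r, \\
  \left(\frac{G_{ij}+G_{ji}}{2}\right)^{\!T} P
    + P \left(\frac{G_{ij}+G_{ji}}{2}\right) &\le 0, \qquad i < j .
\end{align}
```

A common quadratic Lyapunov matrix P satisfying these inequalities certifies stability for every admissible blending of the local linear models, which is what makes the LMI formulation attractive for slip control.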

  17. Comparing the creativity of children's design solutions based on expert assessment

    NARCIS (Netherlands)

    Thang, B.; Sluis - Thiescheffer, R.J.W.; Bekker, M.M.; Eggen, J.H.; Vermeeren, A.P.O.S.; Ridder, de H.

    2008-01-01

    Full text: PDF (3.04 MB). Source: Interaction Design and Children archive.

  18. Synthesis and characterization of homogeneous interstitial solutions of nitrogen and carbon in iron-based lattices

    DEFF Research Database (Denmark)

    Brink, Bastian Klüge

    work in synthesis and characterization of interstitial solutions of nitrogen and carbon in iron-based lattices. In order to avoid the influences of gradients in composition and residual stresses, which are typically found in treated surface layers, homogeneous samples are needed. These were prepared from...

  19. Prediction of Pure Component Adsorption Equilibria Using an Adsorption Isotherm Equation Based on Vacancy Solution Theory

    DEFF Research Database (Denmark)

    Marcussen, Lis; Aasberg-Petersen, K.; Krøll, Annette Elisabeth

    2000-01-01

    An adsorption isotherm equation for nonideal pure component adsorption based on vacancy solution theory and the Non-Random-Two-Liquid (NRTL) equation is found to be useful for predicting pure component adsorption equilibria at a variety of conditions. The isotherm equation is evaluated successfully...... adsorption systems, spreading pressure and isosteric heat of adsorption are also calculated....

  20. Military Interoperable Digital Hospital Testbed (MIDHT)

    Science.gov (United States)

    2013-10-01

    activities are selected highlights completed by Northrop Grumman during the year. Cycle 4 development:
    - Increased the max_allowed_packet size in MySQL ...deployment with the Java install that is required by CONNECT v3.3.1.3.
    - Updated the MIDHT code base to work with the CONNECT v.3.3.1.3 Core Libraries...
    - Provided TATRC the CONNECTUniversalClientGUI binaries for use with CONNECT v3.3.1.3
    - Created and deployed a common Java library for the CONNECT