WorldWideScience

Sample records for integrating heterogeneous information

  1. Integration of Heterogeneous Information Sources into a Knowledge Resource Management System for Lifelong Learning

    NARCIS (Netherlands)

    Demidova, Elena; Ternier, Stefaan; Olmedilla, Daniel; Duval, Erik; Dicerto, Michele; Stefanov, Krassen; Sacristán, Naiara

    2007-01-01

    Demidova, E., Ternier, S., Olmedilla, D., Duval, E., Dicerto, M., Stefanov, K., et al. (2007). Integration of Heterogeneous Information Sources into a Knowledge Resource Management System for Lifelong Learning. TENCompetence Workshop on Service Oriented Approaches and Lifelong Competence Development.

  2. Implementation of integrated heterogeneous electronic electrocardiography data into Maharaj Nakorn Chiang Mai Hospital Information System.

    Science.gov (United States)

    Khumrin, Piyapong; Chumpoo, Pitupoom

    2016-03-01

    Electrocardiography is one of the most important non-invasive diagnostic tools for diagnosing coronary heart disease. The electrocardiography information system in Maharaj Nakorn Chiang Mai Hospital required a massive manual labor effort. In this article, we propose an approach toward the integration of heterogeneous electrocardiography data and the implementation of an integrated electrocardiography information system within the existing Hospital Information System. The system integrates different electrocardiography formats into a consistent electrocardiography rendering using Java software. The interface acts as middleware to seamlessly integrate the different electrocardiography formats. Instead of using a common electrocardiography protocol, we applied a central format based on Java classes for mapping the different electrocardiography formats, with a specific parser for each format to acquire the same information. Our observations showed that the new system improved the effectiveness of data management, workflow, and data quality; increased the availability of information; and ultimately improved quality of care. © The Author(s) 2014.
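
    A minimal illustration of the parser-per-format pattern described above is sketched below in Python (the published system is written in Java); the parser class names and the fields of the central record are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CentralEcgRecord:
    # Hypothetical central format: every source parser must fill these fields.
    patient_id: str
    sampling_rate_hz: int
    leads: dict  # lead name -> list of samples

class EcgParser(Protocol):
    def parse(self, raw: bytes) -> CentralEcgRecord: ...

class ScpEcgParser:
    def parse(self, raw: bytes) -> CentralEcgRecord:
        # Real SCP-ECG decoding omitted; return a placeholder record.
        return CentralEcgRecord(patient_id="unknown", sampling_rate_hz=500, leads={})

class XmlEcgParser:
    def parse(self, raw: bytes) -> CentralEcgRecord:
        # Real XML-based ECG decoding omitted.
        return CentralEcgRecord(patient_id="unknown", sampling_rate_hz=500, leads={})

# The middleware picks a parser by format tag and always hands the same
# central representation to the hospital information system.
PARSERS: dict[str, EcgParser] = {"scp": ScpEcgParser(), "xml": XmlEcgParser()}

def to_central_format(fmt: str, raw: bytes) -> CentralEcgRecord:
    return PARSERS[fmt].parse(raw)

print(to_central_format("scp", b"..."))
```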

  3. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes predictions based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
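
    The embedding-plus-projection idea can be illustrated with a hedged sketch: random matrices stand in for the learned node representations, and a least-squares projection stands in for DTINet's learned vector-space projection. This is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy low-dimensional embeddings, standing in for the representations DTINet
# learns from the heterogeneous drug / protein networks.
n_drugs, n_targets, dim = 30, 40, 10
drug_vecs = rng.normal(size=(n_drugs, dim))
target_vecs = rng.normal(size=(n_targets, dim))

# Known interaction matrix (1 = known drug-target interaction).
interactions = (rng.random((n_drugs, n_targets)) < 0.05).astype(float)

# Vector-space projection: fit a matrix P so that projected drug vectors
# approximate the target vectors of interacting pairs (least-squares sketch).
P, *_ = np.linalg.lstsq(drug_vecs, interactions @ target_vecs, rcond=None)

# Score unseen pairs: high score = projected drug vector close to target vector.
scores = drug_vecs @ P @ target_vecs.T
candidate = np.unravel_index(np.argmax(scores * (1 - interactions)), scores.shape)
print("top new candidate pair (drug, target):", candidate)
```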

  4. HETEROGENEOUS INTEGRATION TECHNOLOGY

    Science.gov (United States)

    2017-08-24

    AFRL-RY-WP-TR-2017-0168, Heterogeneous Integration Technology. Dr. Burhan Bayraktaroglu, Devices for Sensing Branch, Aerospace Components & Subsystems ... Final report, September 1, 2016 – May 1, 2017. Abstract excerpt: "... provide a structure for this review. The history and the current status of integration technologies in each category are examined and product examples are ..."

  5. Service Oriented Integration of Distributed Heterogeneous IT Systems in Production Engineering Using Information Standards and Linked Data

    Directory of Open Access Journals (Sweden)

    Navid Shariat Zadeh

    2017-01-01

    Full Text Available While the design of production systems based on digital models brings benefits, the communication of models comes with challenges, since models typically reside in a heterogeneous IT environment using different syntax and semantics. Coping with heterogeneity requires a smart integration strategy. One main paradigm for integrating data and IT systems is to deploy information standards. In particular, ISO 10303 STEP has been endorsed as a suitable standard for exchanging a wide variety of product manufacturing data. On the other hand, service-oriented tool integration solutions are progressively adopted for the integration of data and IT tools, especially with the emergence of Open Services for Lifecycle Collaboration, whose focus is on the linking of data from heterogeneous software tools. In practice, these approaches should be combined to facilitate the integration process. Hence, the aim of this paper is to investigate the applications of the approaches and the principles behind them, and to derive criteria for where to use which approach. We also explore the synergy between them and consequently suggest an approach based on their combination. Finally, a systematic approach is suggested to identify the required levels of integration and their corresponding approaches, exemplified in a typical IT system architecture in production engineering.

  6. Integration Of Data From Heterogeneous Sources Using Etl Technology.

    Directory of Open Access Journals (Sweden)

    Marek Macura

    2014-01-01

    Full Text Available Data integration is a crucial issue in environments of heterogeneous data sources, and such heterogeneity is becoming widespread. Whenever we want to gain useful information and knowledge from various data sources, we must solve the data integration problem in order to apply appropriate analytical methods to comprehensive and uniform data. Such activity is known as the knowledge discovery from data process. Approaches to the data integration problem are therefore very interesting and bring us closer to the "age of information". The paper presents an architecture which implements the knowledge discovery from data process. The solution combines ETL technology with the wrapper layer known from mediated systems. It also provides semantic integration through a connection mechanism between data elements. The solution allows for the integration of arbitrary data sources and the implementation of analytical methods in one environment. The proposed environment is verified by applying it to data sources in the foundry industry.
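
    A toy sketch of the ETL-plus-wrapper idea under stated assumptions: two hypothetical wrappers expose a CSV and a JSON source as uniform rows, a transform maps them onto one schema, and the load step fills a single analysis store. Field names and values are invented for illustration.

```python
import csv
import json
import sqlite3
from io import StringIO

# Wrapper layer: each wrapper exposes a heterogeneous source as rows of dicts.
def csv_wrapper(text: str):
    return list(csv.DictReader(StringIO(text)))

def json_wrapper(text: str):
    return json.loads(text)

# Extract from two toy sources with different field names.
rows_a = csv_wrapper("alloy,temp_c\nA1,750\n")
rows_b = json_wrapper('[{"material": "A2", "temperature": 800}]')

# Transform: map both schemas onto one target schema (semantic connections).
unified = [{"material": r["alloy"], "temp_c": int(r["temp_c"])} for r in rows_a]
unified += [{"material": r["material"], "temp_c": r["temperature"]} for r in rows_b]

# Load into a single analysis store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE melt (material TEXT, temp_c INTEGER)")
con.executemany("INSERT INTO melt VALUES (:material, :temp_c)", unified)
print(con.execute("SELECT * FROM melt").fetchall())
```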

  7. Integration of heterogeneous features for remote sensing scene classification

    Science.gov (United States)

    Wang, Xin; Xiong, Xingnan; Ning, Chen; Shi, Aiye; Lv, Guofang

    2018-01-01

    Scene classification is one of the most important issues in remote sensing (RS) image processing. We find that features from different channels (shape, spectral, texture, etc.), levels (low-level and middle-level), or perspectives (local and global) provide various properties of RS images, and we therefore propose a heterogeneous feature framework to extract and integrate heterogeneous features of different types for RS scene classification. The proposed method is composed of three modules: (1) heterogeneous feature extraction, where three heterogeneous feature types, called DS-SURF-LLC, mean-Std-LLC, and MS-CLBP, are calculated; (2) heterogeneous feature fusion, where multiple kernel learning (MKL) is utilized to integrate the heterogeneous features; and (3) an MKL support vector machine classifier for RS scene classification. The proposed method is extensively evaluated on three challenging benchmark datasets (a 6-class dataset, a 12-class dataset, and a 21-class dataset), and the experimental results show that it leads to good classification performance and produces informative features for describing RS image scenes. Moreover, the integration of heterogeneous features outperforms some state-of-the-art features on RS scene classification tasks.
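
    A hedged sketch of kernel-level fusion of heterogeneous features: one base kernel per feature type is combined by a weighted sum and fed to a precomputed-kernel SVM. Real MKL learns the kernel weights; here uniform weights and random stand-in features are used for brevity.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, chi2_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Stand-ins for three heterogeneous feature types extracted per scene image.
n = 60
feat_shape = rng.random((n, 32))     # e.g. SURF-based features
feat_spectral = rng.random((n, 16))  # e.g. mean/std spectral features
feat_texture = rng.random((n, 24))   # e.g. texture histograms

labels = rng.integers(0, 3, size=n)

# One base kernel per feature type.
kernels = [rbf_kernel(feat_shape), rbf_kernel(feat_spectral), chi2_kernel(feat_texture)]

# Fixed uniform weights stand in for the weights MKL would learn.
weights = np.ones(len(kernels)) / len(kernels)
K = sum(w * k for w, k in zip(weights, kernels))

clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```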

  8. Integrating mean and variance heterogeneities to identify differentially expressed genes.

    Science.gov (United States)

    Ouyang, Weiwei; An, Qiang; Zhao, Jinying; Qin, Huaizhen

    2016-12-06

    In functional genomics studies, tests on mean heterogeneity have been widely employed to identify differentially expressed genes with distinct mean expression levels under different experimental conditions. Variance heterogeneity (i.e., the difference between condition-specific variances) of gene expression levels is simply neglected or calibrated for as an impediment. The mean heterogeneity in the expression level of a gene reflects one aspect of its distribution alteration, and variance heterogeneity induced by condition change may reflect another aspect. A change in condition may alter both the mean and some higher-order characteristics of the distributions of expression levels of susceptible genes. In this report, we put forth the concept of mean-variance differentially expressed (MVDE) genes, whose expression means and variances are sensitive to the change in experimental condition. We mathematically proved the null independence of existent mean heterogeneity tests and variance heterogeneity tests. Based on this independence, we proposed an integrative mean-variance test (IMVT) to combine gene-wise mean heterogeneity and variance heterogeneity induced by condition change. The IMVT outperformed its competitors under comprehensive simulations of normality and Laplace settings. For moderate samples, the IMVT well controlled type I error rates, as did the existent mean heterogeneity tests (i.e., the Welch t test (WT) and the moderated Welch t test (MWT)) and the procedure of separate tests on mean and variance heterogeneities (SMVT), but the likelihood ratio test (LRT) severely inflated type I error rates. In the presence of variance heterogeneity, the IMVT appeared noticeably more powerful than all the valid mean heterogeneity tests. Application to the gene profiles of peripheral circulating B cells raised solid evidence of informative variance heterogeneity. After adjusting for background data structure, the IMVT replicated previous discoveries and identified novel experiment ...
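
    The IMVT statistic itself is not reproduced here; the sketch below only illustrates the general idea of combining a mean heterogeneity test with a variance heterogeneity test, using Welch's t test, Levene's test (a stand-in for the paper's variance test) and Fisher's method on simulated data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated expression values for one gene under two conditions:
# same mean, different variance.
x = rng.normal(loc=5.0, scale=1.0, size=40)
y = rng.normal(loc=5.0, scale=2.0, size=40)

# Mean heterogeneity: Welch t test (unequal variances).
_, p_mean = stats.ttest_ind(x, y, equal_var=False)

# Variance heterogeneity: Levene's test (a robust stand-in here).
_, p_var = stats.levene(x, y)

# Combine the two (approximately independent) p-values with Fisher's method.
stat, p_combined = stats.combine_pvalues([p_mean, p_var], method="fisher")
print(f"p_mean={p_mean:.3f}  p_var={p_var:.3f}  combined={p_combined:.3g}")
```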

  9. Integrating heterogeneous databases in clustered medic care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  10. Mining Heterogeneous Information Networks by Exploring the Power of Links

    Science.gov (United States)

    Han, Jiawei

    Knowledge is power but for interrelated data, knowledge is often hidden in massive links in heterogeneous information networks. We explore the power of links at mining heterogeneous information networks with several interesting tasks, including link-based object distinction, veracity analysis, multidimensional online analytical processing of heterogeneous information networks, and rank-based clustering. Some recent results of our research that explore the crucial information hidden in links will be introduced, including (1) Distinct for object distinction analysis, (2) TruthFinder for veracity analysis, (3) Infonet-OLAP for online analytical processing of information networks, and (4) RankClus for integrated ranking-based clustering. We also discuss some of our on-going studies in this direction.

  11. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused by heterogeneous systems using different ways to express the same real-world entity. This prevents information integration from achieving semantic coherence. Since ontology helps to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into the information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  12. XML-based approaches for the integration of heterogeneous bio-molecular data.

    Science.gov (United States)

    Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David

    2009-10-15

    Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, biomedical and bioinformatics research, but also raising new problems for their integration and computational processing. In this paper we survey the most interesting and novel approaches for the representation, integration and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, making effective integration of bioinformatics data schemes difficult. The adoption of a few semantically rich standard formats is urgently needed to achieve a seamless integration of the current biological resources.

  13. Graph embedding with rich information through heterogeneous graph

    KAUST Repository

    Sun, Guolei

    2017-11-12

    Graph embedding, which aims to learn low-dimensional representations for nodes in graphs, has attracted increasing attention due to its critical applications, including node classification, link prediction and clustering in social network analysis. Most existing algorithms for graph embedding rely only on the topology information and fail to use the copious information in nodes as well as edges. As a result, their performance for many tasks may not be satisfactory. In this thesis, we propose a novel and general framework for graph embedding with rich text information (GERI) through constructing a heterogeneous network, in which we integrate node and edge content information with graph topology. Specifically, we design a novel biased random walk to explore the constructed heterogeneous network with the notion of a flexible neighborhood. Our sampling strategy can compromise between BFS and DFS local search on the heterogeneous graph. To further improve our algorithm, we propose semi-supervised GERI (SGERI), which learns graph embeddings in a discriminative manner through a heterogeneous network with label information. The efficacy of our method is demonstrated by extensive comparison experiments with 9 baselines over multi-label and multi-class classification on various datasets, including Citeseer, Cora, DBLP and Wiki. It shows that GERI improves the Micro-F1 and Macro-F1 of node classification by up to 10%, and SGERI improves GERI by 5% on Wiki.
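
    A hedged sketch of a type-biased random walk on a small heterogeneous graph; the bias factor favouring content ("word") nodes is an illustrative stand-in for GERI's flexible-neighborhood sampling, and the graph itself is invented.

```python
import random

random.seed(0)

# Heterogeneous graph: document nodes connected to word (content) nodes,
# plus topology edges between documents.
edges = {
    "d1": ["d2", "w:graph", "w:embedding"],
    "d2": ["d1", "d3", "w:embedding"],
    "d3": ["d2", "w:random", "w:walk"],
    "w:graph": ["d1"], "w:embedding": ["d1", "d2"],
    "w:random": ["d3"], "w:walk": ["d3"],
}

def biased_walk(start, length, content_bias=2.0):
    """Walk preferring content ('w:') neighbours by a factor of content_bias."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = edges[walk[-1]]
        weights = [content_bias if n.startswith("w:") else 1.0 for n in nbrs]
        walk.append(random.choices(nbrs, weights=weights, k=1)[0])
    return walk

# Such walks would then be fed to a skip-gram model to produce node embeddings.
print(biased_walk("d1", 8))
```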

  14. Heterogeneous Monolithic Integration of Single-Crystal Organic Materials.

    Science.gov (United States)

    Park, Kyung Sun; Baek, Jangmi; Park, Yoonkyung; Lee, Lynn; Hyon, Jinho; Koo Lee, Yong-Eun; Shrestha, Nabeen K; Kang, Youngjong; Sung, Myung Mo

    2017-02-01

    Manufacturing high-performance organic electronic circuits requires the effective heterogeneous integration of different nanoscale organic materials with uniform morphology and high crystallinity in a desired arrangement. In particular, the development of high-performance organic electronic and optoelectronic devices relies on high-quality single crystals that show optimal intrinsic charge-transport properties and electrical performance. Moreover, the heterogeneous integration of organic materials on a single substrate in a monolithic way is in high demand for the production of fundamental organic electronic components as well as complex integrated circuits. The various methods that have been designed to pattern multiple heterogeneous organic materials on a substrate, and to heterogeneously integrate organic single crystals through their crystal growth, are described here. Critical issues encountered in the development of high-performance organic integrated electronics are also addressed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Transfer Printed Nanomembranes for Heterogeneously Integrated Membrane Photonics

    Directory of Open Access Journals (Sweden)

    Hongjun Yang

    2015-11-01

    Full Text Available Heterogeneous crystalline semiconductor nanomembrane (NM) integration is investigated for single-layer and double-layer silicon (Si) NM photonics, III-V/Si NM lasers, and graphene/Si NM total absorption devices. Both homogeneous and heterogeneous integration are realized by the versatile transfer printing technique. The performance of these integrated membrane devices shows not only optical and electrical characteristics intact with respect to their bulk counterparts, but also unique light-matter interactions, such as Fano resonance, slow light, and critical coupling in photonic crystal cavities. Such a heterogeneous integration approach offers tremendous practical application potential for unconventional, Si CMOS compatible, and high-performance optoelectronic systems.

  16. Information and Heterogeneous Beliefs

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Qin, Zhenjiang

    2014-01-01

    In an incomplete market with heterogeneous prior beliefs, we show that public information can have a substantial impact on the ex ante cost of capital, trading volume, and investor welfare. The Pareto efficient public information system is the system enjoying the maximum ex ante cost of capital and the maximum expected abnormal trading volume. Imperfect public information increases the gains-to-trade based on heterogeneously updated posterior beliefs. In an exchange economy, this leads to higher growth in the investors' certainty equivalents and, thus, a higher equilibrium interest rate, whereas the ex ante risk premium is unaffected by the informativeness of the public information system. Similar results are obtained in a production economy, but the impact on the ex ante cost of capital is dampened compared to the exchange economy due to welfare-improving reductions in real investments to smooth ...

  17. Integrating heterogeneous healthcare call centers.

    Science.gov (United States)

    Peschel, K M; Reed, W C; Salter, K

    1998-01-01

    In a relatively short period, OHS has absorbed multiple call centers supporting different LOBs from various acquisitions, functioning with diverse standards, processes, and technologies. Customer and employee satisfaction, however, is predicated on OHS's ability to thoroughly integrate these heterogeneous call centers. The integration was initiated and has progressed successfully through a balanced program of focused leadership and a defined strategy that includes site consolidation, sound performance management philosophies, and enabling technology. Benefits have already been achieved, with even more substantive ones to come as the integration continues to evolve.

  18. Integrating CLIPS applications into heterogeneous distributed systems

    Science.gov (United States)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.

  19. Human disease MiRNA inference by combining target information based on heterogeneous manifolds.

    Science.gov (United States)

    Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen

    2018-04-01

    The emergence of network medicine has provided great insight into the identification of disease-related molecules, which could help with the development of personalized medicine. However, the state-of-the-art methods could neither simultaneously consider target information and the known miRNA-disease associations nor effectively explore novel gene-disease associations as a by-product during the process of inferring disease-related miRNAs. Computational methods incorporating multiple sources of information offer more opportunities to infer disease-related molecules, including miRNAs and genes in heterogeneous networks at a system level. In this study, we developed a novel algorithm, named inference of Disease-related MiRNAs based on Heterogeneous Manifold (DMHM), to accurately and efficiently identify miRNA-disease associations by integrating multi-omics data. Graph-based regularization was utilized to obtain a smooth function on the data manifold, which constitutes the main principle of DMHM. The novelty of this framework lies in the relatedness between diseases and miRNAs, which are measured via heterogeneous manifolds on heterogeneous networks integrating target information. To demonstrate the effectiveness of DMHM, we conducted comprehensive experiments based on HMDD datasets and compared DMHM with six state-of-the-art methods. Experimental results indicated that DMHM significantly outperformed the other six methods under fivefold cross validation and de novo prediction tests. Case studies have further confirmed the practical usefulness of DMHM. Copyright © 2018 Elsevier Inc. All rights reserved.
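
    DMHM's heterogeneous-manifold formulation is not reproduced here; the sketch below shows the generic graph-regularized propagation idea it builds on, spreading seed disease-association scores smoothly over a (synthetic) miRNA similarity graph via the standard normalized-Laplacian closed form.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic miRNA-miRNA similarity graph (symmetric, non-negative, zero diagonal).
n = 8
W = rng.random((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)

# Seed vector: miRNAs already known to be associated with the disease.
y = np.zeros(n)
y[[0, 3]] = 1.0

# Label propagation closed form: f = (1 - a) * (I - a * S)^-1 y,
# where S is the symmetrically normalized adjacency matrix.
d = W.sum(axis=1)
S = W / np.sqrt(np.outer(d, d))
alpha = 0.8
f = (1 - alpha) * np.linalg.solve(np.eye(n) - alpha * S, y)
print("propagated association scores:", np.round(f, 3))
```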

  20. Emerging heterogeneous integrated photonic platforms on silicon

    Directory of Open Access Journals (Sweden)

    Fathpour Sasan

    2015-05-01

    Full Text Available Silicon photonics has been established as a mature and promising technology for optoelectronic integrated circuits, mostly based on the silicon-on-insulator (SOI) waveguide platform. However, not all optical functionalities can be satisfactorily achieved with silicon in general, and with the SOI platform in particular. Long-known shortcomings of silicon-based integrated photonics are optical absorption (in the telecommunication wavelengths) and the feasibility of electrically injected lasers (at least at room temperature). More recently, the high two-photon and free-carrier absorption incurred at the high optical intensities required for third-order optical nonlinear effects, the inherent lack of second-order optical nonlinearity, the low extinction ratio of modulators based on the free-carrier plasma effect, and the loss of the buried oxide layer of SOI waveguides at mid-infrared wavelengths have been recognized as further shortcomings. Accordingly, several novel waveguide platforms have been developed to address these shortcomings of the SOI platform. Most of these emerging platforms are based on the heterogeneous integration of other material systems on silicon substrates, and in some cases silicon is integrated on other substrates. Germanium and its binary alloys with silicon, III-V compound semiconductors, silicon nitride, tantalum pentoxide and other high-index dielectric or glass materials, as well as lithium niobate, are some of the materials heterogeneously integrated on silicon substrates. The materials are typically integrated by a variety of epitaxial growth, bonding, ion implantation and slicing, etch-back, spin-on-glass or other techniques. This wide range of efforts is reviewed here holistically to stress that there is no pure silicon or even group IV photonics per se. Rather, the future of the field of integrated photonics appears to be one of heterogenization, where a variety of different materials and waveguide platforms will be used for ...

  1. Heterogeneous Beliefs, Public Information, and Option Markets

    DEFF Research Database (Denmark)

    Qin, Zhenjiang

    In an incomplete market setting with heterogeneous prior beliefs, I show that public information and the strike price of an option have a substantial influence on asset pricing in option markets, by investigating an absolute option pricing model with negative exponential utility investors and a normally distributed dividend. I demonstrate that heterogeneous prior variances give rise to the economic value of option markets. Investors speculate in the option market, and public information improves the allocational efficiency of markets only when there is heterogeneity in prior variance. Heterogeneity in means is neither a necessary nor a sufficient condition for generating speculation in option markets. With heterogeneous beliefs, options are non-redundant assets which can facilitate side-betting and enable investors to take advantage of the disagreements and the differences in confidence. This fact leads to a higher growth ...

  2. Autonomous Preference-Aware Information Services Integration for High Response in Integrated Faded Information Field Systems

    Science.gov (United States)

    Lu, Xiaodong; Mori, Kinji

    The market and users' requirements have been rapidly changing and diversifying. Under these heterogeneous and dynamic conditions, not only the system structure itself but also the accessible information services change constantly. To cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed, which is an agent-based distributed information service system architecture. In the case of a mono-service request, the system is designed to improve users' access time and preserve load balancing through the information structure. However, with increasing interdependent requests for multiple services, adaptability and timeliness have to be assured by the system. In this paper, the relationships that exist among the correlated services and the users' preferences for separate and integrated services are clarified. Based on these factors, an autonomous preference-aware information service integration technology that provides one-stop service for users' multi-service requests is proposed. We show that, compared to the conventional system, the proposed technology is able to reduce the total access time.

  3. Ontology-based data integration from heterogeneous urban systems : A knowledge representation framework for smart cities

    NARCIS (Netherlands)

    Psyllidis, A.

    2015-01-01

    This paper presents a novel knowledge representation framework for smart city planning and management that enables the semantic integration of heterogeneous urban data from diverse sources. Currently, the combination of information across city agencies is cumbersome, as the increasingly available

  4. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high-throughput experiments and high-throughput computations are regarded as the three key pillars of the materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become very urgent and have gradually become a hot topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two well-known first-principles computation databases, OQMD and the Materials Project, are used as the integration targets, which shows the availability and effectiveness of our method.
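
    A minimal sketch of a semantic query over an integrated ontology using rdflib and SPARQL; the namespace, class and property names are hypothetical and do not reflect the schema used for OQMD or the Materials Project.

```python
from rdflib import Graph, Literal, Namespace, RDF

MAT = Namespace("http://example.org/materials#")  # hypothetical ontology namespace
g = Graph()

# Two entries as if they had been lifted from different source databases.
for name, gap in [("Si", 1.12), ("GaAs", 1.42)]:
    entry = MAT[name]
    g.add((entry, RDF.type, MAT.Material))
    g.add((entry, MAT.bandGapEV, Literal(gap)))

# Semantic query over the integrated graph: materials with band gap above 1.2 eV.
q = """
PREFIX mat: <http://example.org/materials#>
SELECT ?m ?gap WHERE {
  ?m a mat:Material ; mat:bandGapEV ?gap .
  FILTER (?gap > 1.2)
}
"""
for row in g.query(q):
    print(row.m, float(row.gap))
```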

  5. A service platform architecture design towards a light integration of heterogeneous systems in the wellbeing domain.

    Science.gov (United States)

    Yang, Yaojin; Ahtinen, Aino; Lahteenmaki, Jaakko; Nyman, Petri; Paajanen, Henrik; Peltoniemi, Teijo; Quiroz, Carlos

    2007-01-01

    System integration is one of the major challenges in building wellbeing- or healthcare-related information systems. In this paper, we share our experiences in designing the Nuadu service platform, which provides integrated services in occupational health promotion and health risk management through two heterogeneous systems. Our design aims for a light integration covering the data, service and presentation layers, while maintaining the integrity of the underlying systems.

  6. Simplified nonplanar wafer bonding for heterogeneous device integration

    Science.gov (United States)

    Geske, Jon; Bowers, John E.; Riley, Anton

    2004-07-01

    We demonstrate a simplified nonplanar wafer bonding technique for heterogeneous device integration. The improved technique can be used to laterally integrate dissimilar semiconductor device structures on a lattice-mismatched substrate. Using the technique, two different InP-based vertical-cavity surface-emitting laser active regions have been integrated onto GaAs without compromising the quality of the photoluminescence. Experimental and numerical simulation results are presented.

  7. Integrating hospital information systems in healthcare institutions: a mediation architecture.

    Science.gov (United States)

    El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian

    2012-10-01

    Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most previous studies have dealt with only one or two of these factors, which makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to ensure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with three levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of a medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
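
    A toy sketch of the Mediator/Adapter pattern under stated assumptions: each adapter converts a local system's XML into a common XML form and the mediator fans a query out across adapters; the element names and sample records are invented.

```python
import xml.etree.ElementTree as ET

# Adapter for system A: local <pat nom="..."/> records.
def adapter_a(xml_text: str) -> list[ET.Element]:
    out = []
    for pat in ET.fromstring(xml_text).iter("pat"):
        e = ET.Element("patient")
        e.set("name", pat.get("nom", ""))
        out.append(e)
    return out

# Adapter for system B: local <record><name>...</name></record> entries.
def adapter_b(xml_text: str) -> list[ET.Element]:
    out = []
    for rec in ET.fromstring(xml_text).iter("record"):
        e = ET.Element("patient")
        e.set("name", rec.findtext("name", ""))
        out.append(e)
    return out

# Mediator: one query fans out to every adapter and merges the common format.
def mediator(query_name: str, sources) -> list[str]:
    merged = [p for adapt, xml in sources for p in adapt(xml)]
    return [p.get("name") for p in merged if query_name.lower() in p.get("name").lower()]

sources = [(adapter_a, "<his><pat nom='Alice Roy'/></his>"),
           (adapter_b, "<db><record><name>Alain Roy</name></record></db>")]
print(mediator("roy", sources))
```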

  8. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    The integration of heterogeneous systems is key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing the business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration is proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  9. A Quality-Driven Methodology for Information Systems Integration

    Directory of Open Access Journals (Sweden)

    Iyad Zikra

    2017-10-01

    Full Text Available Information systems integration is an essential instrument for organizations to attain advantage in today’s growing and fast changing business and technology landscapes. Integration solutions generate added value by combining the functionality and services of heterogeneous and diverse systems. Existing integration environments tend to rely heavily on technical, platform-dependent skills. Consequently, the solutions that they enable are not optimally aligned with the envisioned business goals of the organization. Furthermore, the gap between the goals and the solutions complicates the task of evaluating the quality of integration solutions. To address these challenges, we propose a quality-driven, model-driven methodology for designing and developing integration solutions. The methodology spans organizational and systems design details, providing a holistic view of the integration solution and its underlying business goals. A multi-view meta-model provides the basis for the integration design. Quality factors that affect various aspects of the integration solution guide and inform the progress of the methodology. An example business case is presented to demonstrate the application of the methodology.

  10. An integration bridge for heterogeneous e-service environments

    OpenAIRE

    Baeta, Henrique Jorge Lourenço

    2012-01-01

    Dissertation for the degree of Master in Electrical and Computer Engineering. Home automation has evolved from the integration of individual services (provided by devices, equipment, etc.) in the environment to a broader integration of these core services with others (external to the environment) to create added-value services for home users. This presents a key challenge: how to integrate disparate and heterogeneous e-service networks. To this end, there already exist ...

  11. Integration of crosswell seismic data for simulating porosity in a heterogeneous carbonate aquifer

    Science.gov (United States)

    Emery, Xavier; Parra, Jorge

    2013-11-01

    A challenge for the geostatistical simulation of subsurface properties in mining, petroleum and groundwater applications is the integration of well logs and seismic measurements, which can provide information on geological heterogeneities at a wide range of scales. This paper presents a case study conducted at the Port Mayaca aquifer, located in western Martin County, Florida, in which it is of interest to simulate porosity, based on porosity logs at two wells and high-resolution crosswell seismic measurements of P-wave impedance. To this end, porosity and impedance are transformed into cross-correlated Gaussian random fields, using local transformations. The model parameters (transformation functions, mean values and correlation structure of the transformed fields) are inferred and checked against the data. Multiple realizations of porosity can then be constructed conditionally to the impedance information in the interwell region, which allow identifying one low-porosity structure and two to three flow units that connect the two wells, mapping heterogeneities within these units and visually assessing fluid paths in the aquifer. In particular, the results suggest that the paths in the lower flow units, formed by a network of heterogeneous conduits, are not as smooth as in the upper flow unit.
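
    One common building block of such workflows is the normal score (Gaussian anamorphosis) transform used to map porosity or impedance data to Gaussian variables; a hedged sketch on synthetic porosity values follows (this is not the authors' local transformation).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic porosity log values (fractions), skewed like many carbonate logs.
porosity = rng.beta(2, 8, size=200)

# Normal score transform: map empirical quantiles onto standard normal quantiles.
ranks = stats.rankdata(porosity)
uniform = (ranks - 0.5) / len(porosity)
gaussian = stats.norm.ppf(uniform)

print("original skew:", round(float(stats.skew(porosity)), 2),
      "| transformed skew:", round(float(stats.skew(gaussian)), 2))
```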

  12. Thin Film Magnetless Faraday Rotators for Compact Heterogeneous Integrated Optical Isolators (Postprint)

    Science.gov (United States)

    2017-06-15

    AFRL-RX-WP-JA-2017-0348, Thin-Film Magnetless Faraday Rotators for Compact Heterogeneous Integrated Optical Isolators (Postprint). Dolendra Karki ... Interim report, 9 May 2016 – 1 December 2016. Abstract excerpt: "... transfer of ultra-compact thin-film magnetless Faraday rotators to silicon photonic substrates. Thin films of magnetization latching bismuth ..."

  13. Semantic Health Knowledge Graph: Semantic Integration of Heterogeneous Medical Knowledge and Services

    Directory of Open Access Journals (Sweden)

    Longxiang Shi

    2017-01-01

    Full Text Available With the explosion of healthcare information, there has been a tremendous amount of heterogeneous textual medical knowledge (TMK), which plays an essential role in healthcare information systems. Existing works on integrating and utilizing the TMK mainly focus on establishing straightforward connections and pay less attention to making computers interpret and retrieve knowledge correctly and quickly. In this paper, we explore a novel model to organize and integrate the TMK into conceptual graphs. We then employ a framework to automatically retrieve knowledge from the knowledge graphs with high precision. In order to perform reasonable inference on the knowledge graphs, we propose a contextual inference pruning algorithm to achieve efficient chain inference. Our algorithm achieves a good inference result, with precision and recall of 92% and 96%, respectively, and avoids most of the meaningless inferences. In addition, we implement two prototypes and provide services, and the results show our approach is practical and effective.
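
    A hedged sketch of chain inference over knowledge triples with a simple contextual pruning rule (only edges whose context tag matches the query context are followed); the triples, context tags and pruning rule are invented for illustration and are not the paper's algorithm.

```python
# Hypothetical TMK triples: (head, relation, tail, context_tag).
TRIPLES = [
    ("fever", "symptom_of", "influenza", "respiratory"),
    ("influenza", "treated_by", "oseltamivir", "respiratory"),
    ("fever", "symptom_of", "malaria", "tropical"),
    ("malaria", "treated_by", "artemisinin", "tropical"),
]

def chain_infer(start: str, context: str, max_hops: int = 3):
    """Follow relation chains from `start`, pruning edges outside `context`."""
    frontier, seen, chains = [(start, [])], {start}, []
    for _ in range(max_hops):
        next_frontier = []
        for node, path in frontier:
            for h, r, t, ctx in TRIPLES:
                if h == node and ctx == context and t not in seen:  # contextual pruning
                    seen.add(t)
                    chains.append(path + [(h, r, t)])
                    next_frontier.append((t, path + [(h, r, t)]))
        frontier = next_frontier
    return chains

for chain in chain_infer("fever", context="respiratory"):
    print(" -> ".join(f"{h} {r} {t}" for h, r, t in chain))
```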

  14. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for the modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for the integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS Framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for download as binary files from the graphical user interface, or can be accessed directly through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring ...

  15. Information system architecture to support transparent access to distributed, heterogeneous data sources

    International Nuclear Information System (INIS)

    Brown, J.C.

    1994-08-01

    Quality situation assessment and decision making require access to multiple sources of data and information. Insufficient accessibility to data exists in many large corporations and Government agencies. By utilizing current advances in computer technology, today's situation analysts have a wealth of information at their disposal. There are many potential solutions to the information accessibility problem using today's technology. The United States Department of Energy (US-DOE) faced this problem when dealing with one class of problem in the US. The result of their efforts has been the creation of the Tank Waste Information Network System -- TWINS. The TWINS solution combines many technologies to address problems in several areas such as User Interfaces, Transparent Access to Multiple Data Sources, and Integrated Data Access. Data related to the complex is currently distributed throughout several US-DOE installations. Over time, each installation has adopted its own set of standards related to information management. Heterogeneous hardware and software platforms exist both across the complex and within a single installation. Standards for information management vary between US-DOE mission areas within installations. These factors contribute to the complexity of accessing information in a manner that enhances the performance and decision-making process of the analysts. This paper presents one approach taken by the DOE to resolve the problem of distributed, heterogeneous, multi-media information management for the HLW Tank complex. The information system architecture developed for the DOE by the TWINS effort is one that is adaptable to other problem domains and uses ...

  16. Integration of heterogeneous molecular networks to unravel gene-regulation in Mycobacterium tuberculosis.

    Science.gov (United States)

    van Dam, Jesse C J; Schaap, Peter J; Martins dos Santos, Vitor A P; Suárez-Diez, María

    2014-09-26

    Different methods have been developed to infer regulatory networks from heterogeneous omics datasets and to construct co-expression networks. Each algorithm produces different networks, and efforts have been devoted to automatically integrating them into consensus sets. However, each separate set has an intrinsic value that is diluted and partly lost when building a consensus network. Here we present a methodology to generate co-expression networks and, instead of a consensus network, we propose an integration framework where the different networks are kept and analysed with additional tools to efficiently combine the information extracted from each network. We developed a workflow to efficiently analyse information generated by different inference and prediction methods. Our methodology relies on providing the user the means to simultaneously visualise and analyse the coexisting networks generated by different algorithms and heterogeneous datasets, together with a suite of analysis tools. As a showcase, we have analysed the gene co-expression networks of Mycobacterium tuberculosis generated using over 600 expression experiments. Regarding DNA damage repair, we identified SigC as a key control element, 12 new targets for LexA, an updated LexA binding motif, and a potential mismatch repair system. We expanded the DevR regulon with 27 genes while identifying 9 targets wrongly assigned to this regulon. We discovered 10 new genes linked to zinc uptake and a new regulatory mechanism for ZuR. The use of co-expression networks to perform system-level analysis allows the development of custom-made methodologies. As showcases, we implemented a pipeline to integrate ChIP-seq data and another method to uncover multiple regulatory layers. Our workflow is based on representing the multiple types of information as network representations and presenting these networks in a synchronous framework that allows their simultaneous visualization while keeping specific associations from the different ...

  17. Harvesting Information from Heterogeneous Sources

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    The abundance of information regarding any topic makes the Internet a very good resource. Even though searching the Internet is very easy, what remains difficult is to automate the process of information extraction from the available online information, due to the lack of structure and the diversity in the sharing methods. Most of the time, information is stored in different proprietary formats, complying with different standards and protocols, which makes tasks like data mining and information harvesting very difficult. In this paper, an information harvesting tool (heteroHarvest) is presented with the objective of addressing these problems by filtering the useful information and then normalizing it in a single non-hypertext format. We also discuss state-of-the-art tools along with their shortcomings, and present the results of an analysis carried out over different heterogeneous formats along with the performance of our tool with respect to each format. Finally, the different potential applications of the proposed tool are discussed, with special emphasis on open source intelligence.

  18. Heterogeneity, learning and information stickiness in inflation expectations

    DEFF Research Database (Denmark)

    Pfajfar, Damjan; Santoro, Emiliano

    2010-01-01

    In this paper we propose novel techniques for the empirical analysis of adaptive learning and sticky information in inflation expectations. These methodologies are applied to the distribution of households' inflation expectations collected by the University of Michigan Survey Research Center. To account for the evolution of the cross-section of inflation forecasts over time and measure the degree of heterogeneity in private agents' forecasts, we explore time series of percentiles from the empirical distribution. Our results show that heterogeneity is pervasive in the process of inflation ... hand side of the median formed in accordance with adaptive learning and sticky information.

  19. An Integrated Information Retrieval Support System for Campus Network

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper presents a new integrated information retrieval support system (IIRSS) which can help Web search engines retrieve cross-lingual information from heterogeneous resources stored in multiple databases within an intranet. The IIRSS, with a three-layer architecture, can cooperate with other application servers running in the intranet. By using intelligent agents to collect information and to create indexes on the fly, using an access control strategy to confine a user to browsing the documents accessible to him or her through a single portal, and using a new cross-lingual translation tool to help the search engine retrieve documents, the new system provides controllable information access with different authorizations, personalized services, and real-time information retrieval.

  20. Dynamic heterogeneity: a framework to promote ecological integration and hypothesis generation in urban systems

    Science.gov (United States)

    S. T. A. Pickett; M. L. Cadenasso; E. J. Rosi-Marshall; Ken Belt; P. M. Groffman; Morgan Grove; E. G. Irwin; S. S. Kaushal; S. L. LaDeau; C. H. Nilon; C. M. Swan; P. S. Warren

    2016-01-01

    Urban areas are understood to be extraordinarily spatially heterogeneous. Spatial heterogeneity, and its causes, consequences, and changes, are central to ecological science. The social sciences and urban design and planning professions also include spatial heterogeneity as a key concern. However, urban ecology, as a pursuit that integrates across these disciplines,...

  1. Heterogeneously Integrated Microwave Signal Generators with Narrow Linewidth Lasers

    Science.gov (United States)

    2017-03-20

    ... have shown that heterogeneous integration not only allows for a reduced cost due to economies of scale, but also allows for the same or even better ... advantage of introducing SOAs for the microwave generator is the control and boosting of optical power before the detector, providing higher RF powers. A ...

  2. Heterogeneous porous media permeability field characterization from fluid displacement data; Integration de donnees de deplacements de fluides dans la caracterisation de milieux poreux heterogenes

    Energy Technology Data Exchange (ETDEWEB)

    Kretz, V.

    2002-11-01

    The prediction of oil recovery or pollutant dispersion requires an accurate knowledge of the permeability field distribution. Available data are usually measurements in well bores and, for a few years now, 4D-seismic data (seismic mappings repeated in time). Such measurements allow evaluating the evolution of fluid displacement fronts. The purpose of the thesis is to evaluate the possibility of determining permeability fields from fluid displacement measurements in heterogeneous porous media. At the laboratory scale, experimental studies are carried out on a physical model and on numerical simulations. The system uses blocks of granular materials whose individual geometries and permeabilities are controlled. The fluid displacements are detected acoustically. The key parameters of the study are the size and spatial correlation of the permeability heterogeneity distribution, and the influence of viscosity and gravity contrasts between the injected and displaced fluids. The inverse problem - evaluating the permeability field from the evolution of concentration fronts - is then approached. At the reservoir scale, the work mainly focuses on the integration of 4D-seismic data into inversion programs on a 3D synthetic case. Particular importance is given to the calculation of gradients, in order to obtain complementary information about the sensitivity of the data. The information provided by 4D-seismic data consists of maps showing the vertical average of oil saturation or the presence of gas. The purpose is to integrate this qualitative information in the inversion process and to evaluate the impact on the reservoir characterization. Comparative studies - with or without 4D-seismic data - are realized on a synthetic case. (author)

  3. Unsupervised multiple kernel learning for heterogeneous data integration.

    Science.gov (United States)

    Mariette, Jérôme; Villa-Vialaneix, Nathalie

    2018-03-15

    Recent high-throughput sequencing advances have expanded the breadth of available omics datasets, and the integrated analysis of multiple datasets obtained on the same samples has allowed important insights to be gained in a wide range of applications. However, the integration of various sources of information remains a challenge for systems biology, since the produced datasets are often of heterogeneous types, which requires developing generic methods that take their different specificities into account. We propose a multiple kernel framework that allows multiple datasets of various types to be integrated into a single exploratory analysis. Several solutions are provided to learn either a consensus meta-kernel or a meta-kernel that preserves the original topology of the datasets. We applied our framework to analyse two public multi-omics datasets. First, the multiple metagenomic datasets collected during the TARA Oceans expedition were explored to demonstrate that our method is able to retrieve previous findings in a single kernel PCA as well as to provide a new image of the sample structures when a larger number of datasets are included in the analysis. To perform this analysis, a generic procedure is also proposed to improve the interpretability of the kernel PCA with regard to the original data. Second, the multi-omics breast cancer datasets provided by The Cancer Genome Atlas are analysed using kernel Self-Organizing Maps with both single- and multi-omics strategies. The comparison of these two approaches demonstrates the benefit of our integration method in improving the representation of the studied biological system. The proposed methods are available in the R package mixKernel, released on CRAN. It is fully compatible with the mixOmics package, and a tutorial describing the approach can be found on the mixOmics web site http://mixomics.org/mixkernel/. jerome.mariette@inra.fr or nathalie.villa-vialaneix@inra.fr. Supplementary data are available at Bioinformatics online.
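
    The mixKernel package itself is in R; the Python sketch below only illustrates the consensus meta-kernel idea (averaging per-omic kernels and running a kernel PCA on the result) with random matrices standing in for omics tables, and it omits the package's weight learning and topology-preserving options.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(5)

# Two omics tables measured on the same 50 samples (random stand-ins).
omic1 = rng.normal(size=(50, 200))   # e.g. gene expression
omic2 = rng.normal(size=(50, 30))    # e.g. metagenomic abundances

# One kernel per dataset, then a simple consensus: the plain average.
K = (rbf_kernel(omic1) + rbf_kernel(omic2)) / 2

# Single exploratory analysis of the integrated kernel.
embedding = KernelPCA(n_components=2, kernel="precomputed").fit_transform(K)
print("first sample coordinates:", np.round(embedding[0], 3))
```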

  4. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France); Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)] [and others

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model which is informed with petrophysical properties. Scaling-up techniques then allow obtaining a reservoir model which is compatible with fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability is the same as the variability computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with an excellent areal coverage but a poor vertical resolution. New advances in modelling techniques now allow integrating this type of additional external information in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  5. Cascade of chromosomal rearrangements caused by a heterogeneous T-DNA integration supports the double-stranded break repair model for T-DNA integration.

    Science.gov (United States)

    Hu, Yufei; Chen, Zhiyu; Zhuang, Chuxiong; Huang, Jilei

    2017-06-01

    Transferred DNA (T-DNA) from Agrobacterium tumefaciens can be integrated into the plant genome. The double-stranded break repair (DSBR) pathway is a major model for T-DNA integration. From this model, we expect that the two ends of a T-DNA molecule would invade either a single DNA double-stranded break (DSB) or independent DSBs in the plant genome. We call the latter phenomenon heterogeneous T-DNA integration, which has never been observed before. In this work, we demonstrated it in an Arabidopsis T-DNA insertion mutant seb19. To resolve the chromosomal structural changes caused by T-DNA integration at both the nucleotide and chromosome levels, we performed inverse PCR, genome resequencing, fluorescence in situ hybridization and linkage analysis. We found that, in seb19, a single T-DNA connected two different chromosomal loci and caused complex chromosomal rearrangements. The specific break-junction pattern in seb19 is consistent with the result of heterogeneous T-DNA integration but not of recombination between two T-DNA insertions. We demonstrated that, in seb19, heterogeneous T-DNA integration evoked a cascade of incorrect repair of seven DSBs on chromosomes 4 and 5, and then produced translocation, inversion, duplication and deletion. Heterogeneous T-DNA integration supports the DSBR model and suggests that the two ends of a T-DNA molecule can be integrated into the plant genome independently. Our results also show a new origin of chromosomal abnormalities. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  6. Generative Adversarial Networks Based Heterogeneous Data Integration and Its Application for Intelligent Power Distribution and Utilization

    Directory of Open Access Journals (Sweden)

    Yuanpeng Tan

    2018-01-01

    Full Text Available Heterogeneous characteristics of big data systems for intelligent power distribution and utilization have become more and more prominent, which brings new challenges for traditional data analysis technologies and restricts the comprehensive management of distribution network assets. In order to solve the problem that heterogeneous data resources of power distribution systems are difficult to utilize effectively, a novel generative adversarial networks (GANs) based heterogeneous data integration method for intelligent power distribution and utilization is proposed. In the proposed method, GANs theory is introduced to expand the distribution of the complete data samples. Then, a peak clustering algorithm is proposed to realize a finite open coverage of the expanded sample space and to repair the incomplete samples, thereby eliminating the heterogeneous characteristics. Finally, in order to realize the integration of the heterogeneous data for intelligent power distribution and utilization, the well-trained discriminator model of the GANs is employed to check the restored data samples. The simulation experiments verified the validity and stability of the proposed heterogeneous data integration method, which provides a novel perspective for the further data quality management of power distribution systems.

  7. Understanding as Integration of Heterogeneous Representations

    Science.gov (United States)

    Martínez, Sergio F.

    2014-03-01

    The search for understanding is a major aim of science. Traditionally, understanding has been undervalued in the philosophy of science because of its psychological underpinnings; nowadays, however, it is widely recognized that epistemology cannot be divorced from psychology as sharply as traditional epistemology required. This removes the main obstacle to giving scientific understanding due attention in the philosophy of science. My aim in this paper is to describe an account of scientific understanding as an emergent feature of our mastering of different (causal) explanatory frameworks that takes place through the mastering of scientific practices. Different practices lead to different kinds of representations. Such representations are often heterogeneous. The integration of such representations constitutes understanding.

  8. Polymer-based 2D/3D wafer level heterogeneous integration for SSL module

    NARCIS (Netherlands)

    Yuan, C.; Wei, J.; Ye, H.; Koh, S.; Harianto, S.; Nieuwenhof, M.A. van den; Zhang, G.Q.

    2012-01-01

    This paper demonstrates the heterogeneous integration of a solid state lighting (SSL) module, including the light source (LED) and driver/control components. Such integration has been realized by polymer-based reconfigured wafer level package technologies, and such a structure has been prototyped and

  9. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in-depth analysis of the vast amount of transcription data requires integration with data from several heterogeneous data sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis of Saccharomyces cerevisiae whole genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis...

  10. heteroHarvest: Harvesting Information from Heterogeneous Sources

    DEFF Research Database (Denmark)

    Qureshi, Pir Abdul Rasool; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    The abundance of information regarding any topic makes the Internet a very good resource. Even though searching the Internet is very easy, automating the extraction of information from the available online content remains difficult due to the lack of structure and the diversity of sharing methods. Most of the time, information is stored in different proprietary formats, complying with different standards and protocols, which makes tasks like data mining and information harvesting very difficult. In this paper, an information harvesting tool (heteroHarvest) is presented with the objective of addressing these problems by filtering the useful information and then normalizing it into a single non-hypertext format. Finally, we describe the results of an experimental evaluation. The results are promising, with an overall error rate of 6.5% across heterogeneous formats.

  11. Automated Integration of Dedicated Hardwired IP Cores in Heterogeneous MPSoCs Designed with ESPAM

    Directory of Open Access Journals (Sweden)

    Ed Deprettere

    2008-06-01

    Full Text Available This paper presents a methodology and techniques for the automated integration of dedicated hardwired (HW) IP cores into heterogeneous multiprocessor systems. We propose an IP core integration approach based on the generation of an HW module that consists of a wrapper around a predefined IP core. This approach has been implemented in a tool called ESPAM for automated multiprocessor system design, programming, and implementation. In order to preserve the high performance of the integrated IP cores, the structure of the IP core wrapper is devised in a way that adequately represents and efficiently implements the main characteristics of the formal model of computation we use as the underlying programming model in ESPAM, namely Kahn process networks (KPN). We present details about the structure of the HW module, the supported types of IP cores, and the minimum interfaces these IP cores have to provide in order to allow automated integration in heterogeneous multiprocessor systems generated by ESPAM. The ESPAM design flow, the multiprocessor platforms we consider, and the underlying programming (KPN) model are introduced as well. Furthermore, we demonstrate the efficiency of our approach by applying our methodology and the ESPAM tool to automatically generate, implement, and program heterogeneous multiprocessor systems that integrate dedicated IP cores and execute real-life applications.

  12. Graph Regularized Meta-path Based Transductive Regression in Heterogeneous Information Network.

    Science.gov (United States)

    Wan, Mengting; Ouyang, Yunbo; Kaplan, Lance; Han, Jiawei

    2015-01-01

    A number of real-world networks are heterogeneous information networks, which are composed of different types of nodes and links. Numerical prediction in heterogeneous information networks is a challenging but significant problem because the network-based information available for unlabeled objects is usually too limited to make precise estimations. In this paper, we consider a graph regularized meta-path based transductive regression model (Grempt), which combines the principal philosophies of typical graph-based transductive classification methods and of transductive regression models designed for homogeneous networks. The computation of our method is time and space efficient, and the precision of our model is verified by numerical experiments.

  13. Generic, network schema agnostic sparse tensor factorization for single-pass clustering of heterogeneous information networks.

    Science.gov (United States)

    Wu, Jibing; Meng, Qinggang; Deng, Su; Huang, Hongbin; Wu, Yahui; Badii, Atta

    2017-01-01

    Heterogeneous information networks (e.g. bibliographic networks and social media networks) that consist of multiple interconnected objects are ubiquitous. Clustering analysis is an effective method to understand the semantic information and interpretable structure of heterogeneous information networks, and it has attracted the attention of many researchers in recent years. However, most studies assume that heterogeneous information networks follow some simple schema, such as a bi-typed network or a star network schema, and they can only cluster one type of object in the network at a time. In this paper, a novel clustering framework is proposed based on sparse tensor factorization for heterogeneous information networks, which can cluster multiple types of objects simultaneously in a single pass without any network schema information. The types of objects and the relations between them in the heterogeneous information network are modeled as a sparse tensor. The clustering issue is modeled as an optimization problem, which is similar to the well-known Tucker decomposition. Then, an Alternating Least Squares (ALS) algorithm and a feasible initialization method are proposed to solve the optimization problem. Based on the tensor factorization, we simultaneously partition different types of objects into different clusters. The experimental results on both synthetic and real-world datasets demonstrate that our proposed clustering framework, STFClus, can model heterogeneous information networks efficiently and can outperform state-of-the-art clustering algorithms as a generally applicable, single-pass, network schema agnostic clustering method for heterogeneous networks.
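
    A minimal sketch of the underlying idea (one factor matrix per object type, clustered simultaneously) is given below. It uses the standard Tucker decomposition from tensorly and k-means as stand-ins for the paper's ALS-based sparse factorization, and the small random author-paper-venue tensor is an illustrative assumption.

    ```python
    # Minimal sketch of clustering a heterogeneous network via Tucker factorization.
    # STFClus uses a sparse tensor and a dedicated ALS scheme; here the standard
    # Tucker decomposition from tensorly is used as a stand-in, and the small
    # random relation tensor (authors x papers x venues) is an assumption.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import tucker
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Relation tensor: entry (a, p, v) = 1 if author a wrote paper p in venue v.
    relations = (rng.random((20, 30, 5)) < 0.05).astype(float)
    tensor = tl.tensor(relations)

    # Tucker decomposition: one factor matrix per object type.
    core, factors = tucker(tensor, rank=[3, 3, 2])

    # Cluster every object type simultaneously from its factor matrix rows.
    cluster_labels = {}
    for name, factor, k in zip(["author", "paper", "venue"], factors, [3, 3, 2]):
        cluster_labels[name] = KMeans(n_clusters=k, n_init=10,
                                      random_state=0).fit_predict(np.asarray(factor))
    print(cluster_labels["author"])
    ```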

  14. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data.

    Science.gov (United States)

    Siebert, Janet C; Munsil, Wes; Rosenberg-Hasson, Yael; Davis, Mark M; Maecker, Holden T

    2012-03-28

    Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data.

  15. The Stanford Data Miner: a novel approach for integrating and exploring heterogeneous immunological data

    Directory of Open Access Journals (Sweden)

    Siebert Janet C

    2012-03-01

    Full Text Available Abstract Background Systems-level approaches are increasingly common in both murine and human translational studies. These approaches employ multiple high information content assays. As a result, there is a need for tools to integrate heterogeneous types of laboratory and clinical/demographic data, and to allow the exploration of that data by aggregating and/or segregating results based on particular variables (e.g., mean cytokine levels by age and gender). Methods Here we describe the application of standard data warehousing tools to create a novel environment for user-driven upload, integration, and exploration of heterogeneous data. The system presented here currently supports flow cytometry and immunoassays performed in the Stanford Human Immune Monitoring Center, but could be applied more generally. Results Users upload assay results contained in platform-specific spreadsheets of a defined format, and clinical and demographic data in spreadsheets of flexible format. Users then map sample IDs to connect the assay results with the metadata. An OLAP (on-line analytical processing) data exploration interface allows filtering and display of various dimensions (e.g., Luminex analytes in rows, treatment group in columns, filtered on a particular study). Statistics such as mean, median, and N can be displayed. The views can be expanded or contracted to aggregate or segregate data at various levels. Individual-level data is accessible with a single click. The result is a user-driven system that permits data integration and exploration in a variety of settings. We show how the system can be used to find gender-specific differences in serum cytokine levels, and compare them across experiments and assay types. Conclusions We have used the tools and techniques of data warehousing, including open-source business intelligence software, to support investigator-driven data integration and mining of diverse immunological data.
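
    The kind of aggregation and drill-down described in the two records above can be illustrated with a small pandas sketch; the column names, analytes and values below are illustrative assumptions, not the Stanford Data Miner schema.

    ```python
    # Minimal sketch of OLAP-style aggregation/segregation of immunoassay results,
    # in the spirit of the system described above. The toy table and column names
    # are assumptions, not the actual Stanford Data Miner schema.
    import pandas as pd

    data = pd.DataFrame({
        "sample_id": range(8),
        "gender":    ["F", "M", "F", "M", "F", "M", "F", "M"],
        "age_group": ["<40", "<40", "40+", "40+", "<40", "<40", "40+", "40+"],
        "analyte":   ["IL6"] * 4 + ["TNFa"] * 4,
        "value_pg_ml": [3.1, 2.4, 5.6, 4.9, 7.2, 6.8, 9.1, 8.7],
    })

    # "Aggregate": mean cytokine level (and N) by analyte, age group and gender.
    cube = pd.pivot_table(data, values="value_pg_ml",
                          index="analyte", columns=["age_group", "gender"],
                          aggfunc=["mean", "count"])
    print(cube)

    # "Segregate" / drill down: individual-level rows behind one cell of the cube.
    print(data.query("analyte == 'IL6' and age_group == '40+' and gender == 'F'"))
    ```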

  16. What constitutes information integrity?

    Directory of Open Access Journals (Sweden)

    S. Flowerday

    2008-01-01

    Full Text Available This research focused on what constitutes information integrity as this is a problem facing companies today. Moreover, information integrity is a pillar of information security and is required in order to have a sound security management programme. However, it is acknowledged that 100% information integrity is not currently achievable due to various limitations and therefore the auditing concept of reasonable assurance is adopted. This is in line with the concept that 100% information security is not achievable and the notion that adequate security is the goal, using appropriate countermeasures. The main contribution of this article is to illustrate the importance of and provide a macro view of what constitutes information integrity. The findings are in harmony with Samuel Johnson's words (1751): 'Integrity without knowledge is weak and useless, and knowledge without integrity is dangerous and dreadful.'

  17. What constitutes information integrity?

    Directory of Open Access Journals (Sweden)

    S. Flowerday

    2007-12-01

    Full Text Available This research focused on what constitutes information integrity as this is a problem facing companies today. Moreover, information integrity is a pillar of information security and is required in order to have a sound security management programme. However, it is acknowledged that 100% information integrity is not currently achievable due to various limitations and therefore the auditing concept of reasonable assurance is adopted. This is in line with the concept that 100% information security is not achievable and the notion that adequate security is the goal, using appropriate countermeasures. The main contribution of this article is to illustrate the importance of and provide a macro view of what constitutes information integrity. The findings are in harmony with Samuel Johnson's words (1751): 'Integrity without knowledge is weak and useless, and knowledge without integrity is dangerous and dreadful.'

  18. Iterative resonance self-shielding methods using resonance integral table in heterogeneous transport lattice calculations

    International Nuclear Information System (INIS)

    Hong, Ser Gi; Kim, Kang-Seog

    2011-01-01

    This paper describes iteration methods that use resonance integral tables to estimate the effective resonance cross sections in heterogeneous transport lattice calculations. Basically, these methods have been devised to reduce the effort of converting resonance integral tables into the subgroup data used in the physical subgroup method. Since these methods do not use subgroup data but use the resonance integral tables directly, they do not introduce the error associated with converting resonance integrals into subgroup data. The effective resonance cross sections are estimated iteratively for each resonance nuclide through heterogeneous fixed-source calculations over the whole problem domain to obtain the background cross sections. These methods have been implemented in the transport lattice code KARMA, which uses the method of characteristics (MOC) to solve the transport equation. The computational results show that these iteration methods are quite promising for practical transport lattice calculations.

  19. A Semantic Big Data Platform for Integrating Heterogeneous Wearable Data in Healthcare.

    Science.gov (United States)

    Mezghani, Emna; Exposito, Ernesto; Drira, Khalil; Da Silveira, Marcos; Pruski, Cédric

    2015-12-01

    Advances supported by emerging wearable technologies in healthcare promise patients a provision of high quality of care. Wearable computing systems represent one of the main thrust areas for transforming traditional healthcare systems into active systems able to continuously monitor and control the patients' health in order to manage their care at an early stage. However, their proliferation creates challenges related to data management and integration. The diversity and variety of wearable data related to healthcare, their huge volume and their distribution make data processing and analytics more difficult. In this paper, we propose a generic semantic big data architecture based on the "Knowledge as a Service" approach to cope with heterogeneity and scalability challenges. Our main contribution focuses on enriching the NIST Big Data model with semantics in order to smartly understand the collected data, and to generate more accurate and valuable information by correlating scattered medical data stemming from multiple wearable devices and/or from other distributed data sources. We have implemented and evaluated a Wearable KaaS platform to smartly manage heterogeneous data coming from wearable devices in order to assist physicians in supervising the patient's health evolution and to keep the patient up-to-date about his/her status.

  20. Ultra-wideband WDM VCSEL arrays by lateral heterogeneous integration

    Science.gov (United States)

    Geske, Jon

    Advancements in heterogeneous integration are a driving factor in the development of evermore sophisticated and functional electronic and photonic devices. Such advancements will merge the optical and electronic capabilities of different material systems onto a common integrated device platform. This thesis presents a new lateral heterogeneous integration technology called nonplanar wafer bonding. The technique is capable of integrating multiple dissimilar semiconductor device structures on the surface of a substrate in a single wafer bond step, leaving different integrated device structures adjacent to each other on the wafer surface. Material characterization and numerical simulations confirm that the material quality is not compromised during the process. Nonplanar wafer bonding is used to fabricate ultra-wideband wavelength division multiplexed (WDM) vertical-cavity surface-emitting laser (VCSEL) arrays. The optically-pumped VCSEL arrays span 140 nm from 1470 to 1610 nm, a record wavelength span for devices operating in this wavelength range. The array uses eight wavelength channels to span the 140 nm with all channels separated by precisely 20 nm. All channels in the array operate single mode to at least 65°C with output power uniformity of +/- 1 dB. The ultra-wideband WDM VCSEL arrays are a significant first step toward the development of a single-chip source for optical networks based on coarse WDM (CWDM), a low-cost alternative to traditional dense WDM. The CWDM VCSEL arrays make use of fully-oxidized distributed Bragg reflectors (DBRs) to provide the wideband reflectivity required for optical feedback and lasing across 140 nm. In addition, a novel optically-pumped active region design is presented. It is demonstrated, with an analytical model and experimental results, that the new active-region design significantly improves the carrier uniformity in the quantum wells and results in a 50% lasing threshold reduction and a 20°C improvement in the peak

  1. Integration of heterogeneous molecular networks to unravel gene-regulation in Mycobacterium tuberculosis

    NARCIS (Netherlands)

    Dam, van J.C.J.; Schaap, P.J.; Martins dos Santos, V.A.P.; Suarez Diez, M.

    2014-01-01

    Background: Different methods have been developed to infer regulatory networks from heterogeneous omics datasets and to construct co-expression networks. Each algorithm produces different networks and efforts have been devoted to automatically integrating them into consensus sets. However each

  2. FEATURE SELECTION METHODS BASED ON MUTUAL INFORMATION FOR CLASSIFYING HETEROGENEOUS FEATURES

    Directory of Open Access Journals (Sweden)

    Ratri Enggar Pawening

    2016-06-01

    Full Text Available Datasets with heterogeneous features can yield inappropriate feature selection results because it is difficult to evaluate heterogeneous features concurrently. Feature transformation (FT) is one way to handle heterogeneous feature subset selection; however, transforming non-numerical features into numerical ones may produce redundancy with the original numerical features. In this paper, we propose a method to select a feature subset based on mutual information (MI) for classifying heterogeneous features. We use unsupervised feature transformation (UFT) and joint mutual information maximisation (JMIM). UFT is used to transform non-numerical features into numerical features, and JMIM is used to select the feature subset while taking the class label into account. The transformed and the original features are combined, the feature subset is determined using JMIM, and the subset is classified using the support vector machine (SVM) algorithm. The classification accuracy is measured for each number of selected features and compared between the UFT-JMIM and Dummy-JMIM methods. The average classification accuracy over all experiments in this study is about 84.47% for UFT-JMIM and about 84.24% for Dummy-JMIM. This result shows that UFT-JMIM can minimize the information loss between transformed and original features and can select feature subsets that avoid redundant and irrelevant features.
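
    A minimal sketch of the pipeline described above: non-numerical features are transformed to numerical ones, the combined features are ranked by mutual information with the class label, and an SVM is trained on the selected subset. OrdinalEncoder stands in for the UFT step and a plain MI ranking stands in for JMIM (which additionally considers joint information with already selected features); the toy data are assumptions.

    ```python
    # Minimal sketch of heterogeneous feature selection with mutual information.
    # OrdinalEncoder replaces the UFT step and a simple MI ranking replaces JMIM;
    # the synthetic data and the choice k=3 are illustrative assumptions.
    import numpy as np
    from sklearn.preprocessing import OrdinalEncoder
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 200
    numeric = rng.normal(size=(n, 3))
    categorical = rng.choice(["low", "mid", "high"], size=(n, 2))
    y = (numeric[:, 0] + (categorical[:, 0] == "high")) > 0.5

    # Feature transformation: encode non-numerical features as numbers.
    encoded = OrdinalEncoder().fit_transform(categorical)
    X = np.hstack([numeric, encoded])

    # Feature selection: keep the k features with the highest mutual information.
    k = 3
    mi = mutual_info_classif(X, y, random_state=0)
    selected = np.argsort(mi)[::-1][:k]

    # Classification accuracy on the selected subset.
    scores = cross_val_score(SVC(kernel="rbf"), X[:, selected], y.astype(int), cv=5)
    print("selected features:", selected, "accuracy: %.3f" % scores.mean())
    ```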

  3. The effect of heterogeneous dynamics of online users on information filtering

    International Nuclear Information System (INIS)

    Chen, Bo-Lun; Zeng, An; Chen, Ling

    2015-01-01

    The rapid expansion of the Internet requires effective information filtering techniques to extract the most essential and relevant information for online users. Many recommendation algorithms have been proposed to predict the future items that a given user might be interested in. However, there is an important issue that has always been ignored so far in related works, namely the heterogeneous dynamics of online users. The interest of active users changes more often than that of less active users, which calls for different update frequencies of their recommendation lists. In this paper, we develop a framework to study the effect of heterogeneous dynamics of users on the recommendation performance. We find that the personalized application of recommendation algorithms results in remarkable improvement in the recommendation accuracy and diversity. Our findings may help online retailers make better use of the existing recommendation methods. - Highlights: • We study the effect of heterogeneous dynamics of users on recommendation. • Due to the user heterogeneity, their amount of links in the probe set is different. • The personalized algorithm implementation improves the recommendation performance. • Our results suggest different update frequencies for users' recommendation lists.

  4. The effect of heterogeneous dynamics of online users on information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Bo-Lun [Department of Computer Science, Yangzhou University of China, Yangzhou 225127 (China); Department of Computer Science, Nanjing University of Aeronautics and Astronautics of China, Nanjing 210016 (China); Department of Physics, University of Fribourg, Chemin du Musee 3, CH-1700 Fribourg (Switzerland); Zeng, An, E-mail: anzeng@bnu.edu.cn [School of Systems Science, Beijing Normal University, Beijing 100875 (China); Chen, Ling [Department of Computer Science, Yangzhou University of China, Yangzhou 225127 (China); Department of Computer Science, Nanjing University of Aeronautics and Astronautics of China, Nanjing 210016 (China)

    2015-11-06

    The rapid expansion of the Internet requires effective information filtering techniques to extract the most essential and relevant information for online users. Many recommendation algorithms have been proposed to predict the future items that a given user might be interested in. However, there is an important issue that has always been ignored so far in related works, namely the heterogeneous dynamics of online users. The interest of active users changes more often than that of less active users, which calls for different update frequencies of their recommendation lists. In this paper, we develop a framework to study the effect of heterogeneous dynamics of users on the recommendation performance. We find that the personalized application of recommendation algorithms results in remarkable improvement in the recommendation accuracy and diversity. Our findings may help online retailers make better use of the existing recommendation methods. - Highlights: • We study the effect of heterogeneous dynamics of users on recommendation. • Due to the user heterogeneity, their amount of links in the probe set is different. • The personalized algorithm implementation improves the recommendation performance. • Our results suggest different update frequencies for users' recommendation lists.
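
    The core idea of the record above, that active users need their recommendation lists refreshed more often than less active users, can be sketched as a simple heuristic; the thresholds and the toy activity log below are illustrative assumptions, not the authors' framework.

    ```python
    # Minimal sketch: map each user's recent activity to an update interval for
    # their recommendation list. Thresholds and toy interaction data are assumptions.
    from collections import Counter

    # events: (user_id, item_id) interactions over the last week (toy data).
    events = [("u1", i) for i in range(50)] + [("u2", i) for i in range(5)] + \
             [("u3", i) for i in range(18)]
    activity = Counter(user for user, _ in events)

    def update_interval_hours(n_events_per_week):
        """Heuristic mapping from activity level to update frequency."""
        if n_events_per_week >= 30:
            return 6       # very active: refresh several times a day
        if n_events_per_week >= 10:
            return 24      # moderately active: refresh daily
        return 7 * 24      # low activity: a weekly refresh is enough

    for user, n in activity.items():
        print(user, n, "events ->", update_interval_hours(n), "h between updates")
    ```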

  5. A Spatial Data Infrastructure Integrating Multisource Heterogeneous Geospatial Data and Time Series: A Study Case in Agriculture

    Directory of Open Access Journals (Sweden)

    Gloria Bordogna

    2016-05-01

    Full Text Available Currently, best practice to support land planning calls for the development of Spatial Data Infrastructures (SDI) capable of integrating both geospatial datasets and time series information from multiple sources, e.g., multitemporal satellite data and Volunteered Geographic Information (VGI). This paper describes an original OGC standard interoperable SDI architecture and a geospatial data and metadata workflow for creating and managing multisource heterogeneous geospatial datasets and time series, and discusses it in the framework of the Space4Agri project study case developed to support the agricultural sector in the Lombardy region, Northern Italy. The main novel contributions go beyond the application domain for which the SDI has been developed and are the following: the ingestion, within an a-centric SDI potentially distributed over several nodes on the Internet to support scalability, of products derived by processing remote sensing images, authoritative data, georeferenced in-situ measurements and voluntary information (VGI) created by farmers and agronomists using an original Smart App; the workflow automation for publishing sets and time series of heterogeneous multisource geospatial data and related web services; and, finally, the project geoportal, which can ease the analysis of the geospatial datasets and time series by providing complex, intelligent spatio-temporal query and answering facilities.

  6. Heterogeneity among informal microenterprises in Mexico: empirical evidence and some policy implications

    Directory of Open Access Journals (Sweden)

    René Rivera Huerta

    2017-12-01

    Full Text Available Unlike traditional theories of development, new schools of thinking consider nonfarm informal micro-enterprises a dynamic sector. Nevertheless, social researchers from both streams recognize the necessity of policies to formalize and increase the productivity of such enterprises. Using Mexican data from 2008 and cluster analysis techniques, this work shows that informal micro-enterprises constitute a very heterogeneous group and that such heterogeneity deserves a diversified development strategy: while some entrepreneurs would benefit from productivity policies, others would require an assistance approach.

  7. 2 μm wavelength range InP-based type-II quantum well photodiodes heterogeneously integrated on silicon photonic integrated circuits.

    Science.gov (United States)

    Wang, Ruijun; Sprengel, Stephan; Muneeb, Muhammad; Boehm, Gerhard; Baets, Roel; Amann, Markus-Christian; Roelkens, Gunther

    2015-10-05

    The heterogeneous integration of InP-based type-II quantum well photodiodes on silicon photonic integrated circuits for the 2 µm wavelength range is presented. A responsivity of 1.2 A/W at a wavelength of 2.32 µm and 0.6 A/W at 2.4 µm wavelength is demonstrated. The photodiodes have a dark current of 12 nA at -0.5 V at room temperature. The absorbing active region of the integrated photodiodes consists of six periods of a "W"-shaped quantum well, also allowing for laser integration on the same platform.

  8. A Sensor Middleware for integration of heterogeneous medical devices.

    Science.gov (United States)

    Brito, M; Vale, L; Carvalho, P; Henriques, J

    2010-01-01

    In this paper, the architecture of a modular, service-oriented, Sensor Middleware for data acquisition and processing is presented. The described solution was developed with the purpose of solving two increasingly relevant problems in the context of modern pHealth systems: i) to aggregate a number of heterogeneous, off-the-shelf, devices from which clinical measurements can be acquired and ii) to provide access and integration with an 802.15.4 network of wearable sensors. The modular nature of the Middleware provides the means to easily integrate pre-processing algorithms into processing pipelines, as well as new drivers for adding support for new sensor devices or communication technologies. Tests performed with both real and artificially generated data streams show that the presented solution is suitable for use both in a Windows PC or a Windows Mobile PDA with minimal overhead.

  9. 3D Web Visualization of Environmental Information - Integration of Heterogeneous Data Sources when Providing Navigation and Interaction

    Science.gov (United States)

    Herman, L.; Řezník, T.

    2015-08-01

    3D information is essential for a number of applications used daily in various domains such as crisis management, energy management, urban planning, and cultural heritage, as well as pollution and noise mapping, etc. This paper is devoted to the issue of 3D modelling from the levels of buildings to cities. The theoretical sections comprise an analysis of cartographic principles for the 3D visualization of spatial data as well as a review of technologies and data formats used in the visualization of 3D models. Emphasis was placed on the verification of available web technologies; for example, X3DOM library was chosen for the implementation of a proof-of-concept web application. The created web application displays a 3D model of the city district of Nový Lískovec in Brno, the Czech Republic. The developed 3D visualization shows a terrain model, 3D buildings, noise pollution, and other related information. Attention was paid to the areas important for handling heterogeneous input data, the design of interactive functionality, and navigation assistants. The advantages, limitations, and future development of the proposed concept are discussed in the conclusions.

  10. Information and Heterogeneous Beliefs: Cost of Capital, Trading Volume, and Investor Welfare

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Qin, Zhenjiang

    In an incomplete market setting with heterogeneous prior beliefs, we show that public information can have a substantial impact on the ex ante cost of capital, trading volume, and investor welfare. In a model with exponential utility investors and an asset with a normally distributed dividend, the Pareto efficient public information system is the system which enjoys the maximum ex ante cost of capital, and the maximum expected abnormal trading volume. The public information system facilitates improved dynamic trading opportunities based on heterogeneously updated posterior beliefs in order to take ... information system. In an effectively complete market setting, in which investors do not need to trade dynamically in order to take full advantage of their differences in beliefs, the ex ante cost of capital and the investor welfare are both higher than in the incomplete market setting ...

  11. Information and Heterogeneous Beliefs: Cost of Capital, Trading Volume, and Investor Welfare

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Qin, Zhenjiang

    In an incomplete market setting with heterogeneous prior beliefs, we show that public information can have a substantial impact on the ex ante cost of capital, trading volume, and investor welfare. In a model with exponential utility investors and an asset with a normally distributed dividend, the Pareto efficient public information system is the system which enjoys the maximum ex ante cost of capital, and the maximum expected abnormal trading volume. The public information system facilitates improved dynamic trading opportunities based on heterogeneously updated posterior beliefs in order to take ... information system. In an effectively complete market setting, in which investors do not need to trade dynamically in order to take full advantage of their differences in beliefs, the ex ante cost of capital and the investor welfare are both higher than in the incomplete market setting ...

  12. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    Science.gov (United States)

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all of this information to improve the ranking performance has become a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit the rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), and then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.
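
    A generic illustration of the idea, biasing a PageRank-style propagation with node-level prior information, is sketched below. This is not the paper's SSP/SSLF-GR formulation; the toy graph, the prior scores and the damping factor are assumptions.

    ```python
    # Minimal sketch of a personalized PageRank iteration in which side information
    # (a prior relevance score per node) shapes the teleport vector and link
    # structure propagates the ranking scores. Toy graph and weights are assumptions.
    import numpy as np

    # Adjacency matrix of a small directed graph (row i -> column j).
    A = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    out_deg = A.sum(axis=1, keepdims=True)
    P = A / np.where(out_deg == 0, 1, out_deg)      # row-stochastic transition matrix

    # Heterogeneous side information: a prior relevance score per node
    # (e.g. derived from node features or a few labelled examples).
    prior = np.array([0.1, 0.7, 0.1, 0.1])
    teleport = prior / prior.sum()

    alpha = 0.85                                     # damping factor
    r = np.full(4, 0.25)                             # initial ranking scores
    for _ in range(100):
        r_next = alpha * (P.T @ r) + (1 - alpha) * teleport
        if np.abs(r_next - r).sum() < 1e-10:
            break
        r = r_next
    print("ranking scores:", np.round(r, 4))
    ```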

  13. Heterogeneous Biomedical Database Integration Using a Hybrid Strategy: A p53 Cancer Research Database

    Directory of Open Access Journals (Sweden)

    Vadim Y. Bichutskiy

    2006-01-01

    Full Text Available Complex problems in life science research give rise to multidisciplinary collaboration, and hence, to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity, and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable, http://www.igb.uci.edu/research/research.html.)

  14. A framework for integrating heterogeneous clinical data for a disease area into a central data warehouse.

    Science.gov (United States)

    Karmen, Christian; Ganzinger, Matthias; Kohl, Christian D; Firnkorn, Daniel; Knaup-Gregori, Petra

    2014-01-01

    Structured collection of clinical facts is a common approach in clinical research. Especially in the analysis of rare diseases it is often necessary to aggregate study data from several sites in order to achieve a statistically significant cohort size. In this paper we describe a framework for integrating heterogeneous clinical data into a central register. This enables site-spanning queries for the occurrence of specific clinical facts and thus supports clinical research. The framework consists of three sequential steps, starting with a formal data harmonization process, followed by the data transformation methods and finally the integration into a proper data warehouse. We implemented reusable software templates based on our best practices from several projects integrating heterogeneous clinical data. Our methods can increase the efficiency and quality of future data integration projects by reducing both the implementation effort and the project management effort when our approaches are used as a guideline.

  15. Integration of Heterogeneous Bibliographic Information through Data Abstractions.

    Science.gov (United States)

    1986-01-01

    11-12 [COMPENDEX] a) Electronics v 56 n 7 Apr 7 1983 p 155-157. b) IEEE Trans Magn v Mag-14 n 5 Sep 1978, INTERMAG (Int Magn) Conf, Florence, Italy ... developed geographically distributed information systems such as DOE/PECaN, DOD/OROLS, NASA/RECON, CAS On-Line, OARC (France) and DECHEMA (West Germany) ...

  16. Social influence, agent heterogeneity and the emergence of the urban informal sector

    Science.gov (United States)

    García-Díaz, César; Moreno-Monroy, Ana I.

    2012-02-01

    We develop an agent-based computational model in which the urban informal sector acts as a buffer where rural migrants can earn some income while queuing for higher paying modern-sector jobs. In the model, the informal sector emerges as a result of rural-urban migration decisions of heterogeneous agents subject to social influence in the form of neighboring effects of varying strengths. Besides using a multinomial logit choice model that allows for agent idiosyncrasy, explicit agent heterogeneity is introduced in the form of socio-demographic characteristics preferred by modern-sector employers. We find that different combinations of the strength of social influence and the socio-economic composition of the workforce lead to very different urbanization and urban informal sector shares. In particular, moderate levels of social influence and a large proportion of rural inhabitants with preferred socio-demographic characteristics are conducive to a higher urbanization rate and a larger informal sector.
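
    The decision rule can be illustrated with a small multinomial logit sketch in which a neighbour-share term stands in for social influence; the options, utilities, coefficients and income values below are illustrative assumptions, not the paper's calibrated model.

    ```python
    # Minimal sketch of a multinomial logit choice with a social-influence term, in
    # the spirit of the agent-based model described above. All numbers are assumptions.
    import numpy as np

    def choice_probabilities(expected_income, social_influence, beta=1.0, gamma=0.8):
        """Multinomial logit over options given incomes and neighbour shares."""
        utilities = beta * np.log(expected_income) + gamma * social_influence
        exp_u = np.exp(utilities - utilities.max())    # numerically stable softmax
        return exp_u / exp_u.sum()

    # Options: [stay rural, urban informal sector, urban modern-sector queue]
    expected_income  = np.array([1.0, 1.6, 2.5])
    # Fraction of the agent's neighbours already in each state (social influence).
    social_influence = np.array([0.6, 0.3, 0.1])

    p = choice_probabilities(expected_income, social_influence)
    rng = np.random.default_rng(0)
    decision = rng.choice(["rural", "informal", "modern"], p=p)
    print(np.round(p, 3), "->", decision)
    ```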

  17. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  18. Microenvironmental Heterogeneity Parallels Breast Cancer Progression: A Histology-Genomic Integration Analysis.

    Directory of Open Access Journals (Sweden)

    Rachael Natrajan

    2016-02-01

    Full Text Available The intra-tumor diversity of cancer cells is under intense investigation; however, little is known about the heterogeneity of the tumor microenvironment that is key to cancer progression and evolution. We aimed to assess the degree of microenvironmental heterogeneity in breast cancer and correlate this with genomic and clinical parameters. We developed a quantitative measure of microenvironmental heterogeneity along three spatial dimensions (3-D) in solid tumors, termed the tumor ecosystem diversity index (EDI), using fully automated histology image analysis coupled with statistical measures commonly used in ecology. This measure was compared with disease-specific survival, key mutations, genome-wide copy number, and expression profiling data in a retrospective study of 510 breast cancer patients as a test set and 516 breast cancer patients as an independent validation set. In high-grade (grade 3) breast cancers, we uncovered a striking link between high microenvironmental heterogeneity measured by EDI and a poor prognosis that cannot be explained by tumor size, genomics, or any other data types. However, this association was not observed in low-grade (grade 1 and 2) breast cancers. The prognostic value of EDI was superior to known prognostic factors and was enhanced with the addition of TP53 mutation status (multivariate analysis test set, p = 9 × 10^-4, hazard ratio = 1.47, 95% CI 1.17-1.84; validation set, p = 0.0011, hazard ratio = 1.78, 95% CI 1.26-2.52). Integration with genome-wide profiling data identified losses of specific genes on 4p14 and 5q13 that were enriched in grade 3 tumors with high microenvironmental diversity that also substratified patients into poor prognostic groups. Limitations of this study include the number of cell types included in the model, that EDI has prognostic value only in grade 3 tumors, and that our spatial heterogeneity measure was dependent on spatial scale and tumor size. To our knowledge, this is the first
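
    The flavour of such a spatial diversity measure can be sketched by computing the Shannon diversity of cell-type composition per spatial region and summarising over regions; the cell types, grid and summary statistic below are illustrative assumptions, not the published 3-D EDI.

    ```python
    # Minimal sketch of a microenvironmental diversity score: Shannon diversity of
    # cell-type composition per spatial region, averaged over the tissue. The toy
    # cell classifications, grid size and summary statistic are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    cell_types = ["cancer", "lymphocyte", "stromal"]
    # Toy histology image analysis output: one classified cell type per cell,
    # with (x, y) coordinates in an arbitrary unit.
    n_cells = 2000
    xy = rng.uniform(0, 8, size=(n_cells, 2))
    labels = rng.choice(len(cell_types), size=n_cells, p=[0.6, 0.25, 0.15])

    def shannon(counts):
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log(p)).sum()

    # Partition the tissue into a 4 x 4 grid of regions and score each region.
    region_scores = []
    for i in range(4):
        for j in range(4):
            mask = ((xy[:, 0] // 2).astype(int) == i) & ((xy[:, 1] // 2).astype(int) == j)
            if mask.sum() > 0:
                counts = np.bincount(labels[mask], minlength=len(cell_types))
                region_scores.append(shannon(counts))

    # Summarize heterogeneity across regions (here: mean regional diversity).
    print("regional Shannon diversity, mean = %.3f" % np.mean(region_scores))
    ```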

  19. A Semi-Supervised Learning Algorithm for Predicting Four Types MiRNA-Disease Associations by Mutual Information in a Heterogeneous Network.

    Science.gov (United States)

    Zhang, Xiaotian; Yin, Jian; Zhang, Xu

    2018-03-02

    Increasing evidence suggests that dysregulation of microRNAs (miRNAs) may lead to a variety of diseases. Therefore, identifying disease-related miRNAs is a crucial problem. Currently, many computational approaches have been proposed to predict binary miRNA-disease associations. In this study, in order to predict the underlying miRNA-disease association types, a semi-supervised model called the network-based label propagation algorithm for multiple types of miRNA-disease associations (NLPMMDA) is proposed, which infers association types using mutual information derived from a heterogeneous network. The NLPMMDA method integrates disease semantic similarity, miRNA functional similarity, and Gaussian interaction profile kernel similarity information of miRNAs and diseases to construct the heterogeneous network. NLPMMDA is a semi-supervised model which does not require verified negative samples. Leave-one-out cross validation (LOOCV) was implemented for four known types of miRNA-disease associations and demonstrated the reliable performance of our method. Moreover, case studies of lung cancer and breast cancer confirmed the effectiveness of NLPMMDA in predicting novel miRNA-disease associations and their association types.
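
    A minimal sketch of network-based label propagation of this general kind is given below: known association types are spread over a similarity network by iterating F <- alpha*S*F + (1-alpha)*Y. The tiny similarity matrix, the four type columns and alpha are illustrative assumptions, not the NLPMMDA construction.

    ```python
    # Minimal sketch of label propagation over a similarity network. The toy
    # similarities, seed labels and alpha are assumptions, not the paper's setup.
    import numpy as np

    # Symmetric similarity matrix over 5 miRNA-disease pairs (toy values).
    W = np.array([[0.0, 0.8, 0.1, 0.0, 0.2],
                  [0.8, 0.0, 0.3, 0.1, 0.0],
                  [0.1, 0.3, 0.0, 0.7, 0.2],
                  [0.0, 0.1, 0.7, 0.0, 0.6],
                  [0.2, 0.0, 0.2, 0.6, 0.0]])
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))          # symmetric normalization D^-1/2 W D^-1/2

    # Initial labels: rows = pairs, columns = 4 association types (one-hot when
    # known, all zeros when unknown).
    Y = np.zeros((5, 4))
    Y[0, 0] = 1.0                            # pair 0 known: type 1
    Y[3, 2] = 1.0                            # pair 3 known: type 3

    alpha = 0.8
    F = Y.copy()
    for _ in range(200):
        F_next = alpha * (S @ F) + (1 - alpha) * Y
        if np.abs(F_next - F).max() < 1e-9:
            break
        F = F_next

    # Predicted association type for every pair, including the unlabelled ones.
    print(np.argmax(F, axis=1))
    ```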

  20. Crossing heterogeneous information sources for better analysis of health and social care data

    NARCIS (Netherlands)

    Szirbik, NB; Pelletier, C; Chaussalet, TJ; Bos, L; Marsh, A

    2005-01-01

    In this paper we describe a methodology that emerged during the implementation of a health-and-social-care-oriented data repository, which consists of grouping information from heterogeneous and distributed information sources. We developed this methodology by first constructing a concrete data

  1. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.
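
    The single-linkage agglomeration of identifiers can be sketched with a union-find structure: identifiers that share a cross-reference end up in the same gene cluster. The identifier pairs below are made-up examples, not DAVID content.

    ```python
    # Minimal sketch of single-linkage grouping of gene/protein identifiers via
    # union-find, in the spirit of the DAVID Gene Concept. The cross-reference
    # pairs are toy examples, not real DAVID Knowledgebase entries.
    from collections import defaultdict

    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    # Cross-reference pairs harvested from different resources (toy data).
    xrefs = [("ENSG0000001", "P12345"), ("P12345", "NM_000001"),
             ("ENSG0000002", "Q99999"), ("NM_000001", "GeneID:1234")]
    for a, b in xrefs:
        union(a, b)

    # Collect the resulting identifier clusters (one cluster per gene concept).
    clusters = defaultdict(set)
    for identifier in parent:
        clusters[find(identifier)].add(identifier)
    print(list(clusters.values()))
    ```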

  2. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    Science.gov (United States)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

    The Integrated Agriculture Information System is a system developed to process data, information, and knowledge in the agriculture sector. The Integrated Agriculture Information System brings valuable information to farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; (7) innovation in agricultural processing. The Integrated Agriculture Information System contains 9 subsystems. To bring integrated information to users and stakeholders, an integrated database approach is needed. Thus, the researchers describe the data entities and the matrix relating them to the subsystems of the Integrated Agriculture Information System (IAIS). As a result, there are 47 data entities in the single, integrated database.

  3. Ontology-based knowledge representation for resolution of semantic heterogeneity in GIS

    Science.gov (United States)

    Liu, Ying; Xiao, Han; Wang, Limin; Han, Jialing

    2017-07-01

    Lack of semantic interoperability in geographical information systems has been identified as the main obstacle to data sharing and database integration. New methods are needed to overcome the problems of semantic heterogeneity. Ontologies are considered to be one approach to support geographic information sharing. This paper presents an ontology-driven integration approach that helps in detecting and possibly resolving semantic conflicts. Its originality is that each data source participating in the integration process contains an ontology that defines the meaning of its own data. This approach ensures the automation of the integration through a semantic integration algorithm. Finally, land classification in a field GIS is described as an example.
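
    A minimal sketch of ontology-mediated resolution of semantic heterogeneity is given below: each source maps its local land-classification labels to shared ontology concepts and queries are answered against those concepts. The ontology, labels and sources are illustrative assumptions.

    ```python
    # Minimal sketch of ontology-mediated integration of land classifications from
    # two GIS sources. The shared concepts, local terms and records are assumptions.
    SHARED_ONTOLOGY = {"Forest", "Cropland", "UrbanArea", "WaterBody"}

    # Each data source ships its own local-term -> shared-concept mapping.
    source_a_mapping = {"woodland": "Forest", "arable": "Cropland", "built-up": "UrbanArea"}
    source_b_mapping = {"forêt": "Forest", "culture": "Cropland", "eau": "WaterBody"}

    source_a_records = [("parcel-1", "woodland"), ("parcel-2", "arable")]
    source_b_records = [("zone-9", "forêt"), ("zone-10", "eau")]

    def integrate(records, mapping):
        """Translate local classifications into shared ontology concepts."""
        out = []
        for feature_id, local_term in records:
            concept = mapping.get(local_term)
            if concept in SHARED_ONTOLOGY:
                out.append((feature_id, concept))
        return out

    integrated = integrate(source_a_records, source_a_mapping) + \
                 integrate(source_b_records, source_b_mapping)
    # Semantic query over the integrated view: all features classified as Forest.
    print([fid for fid, concept in integrated if concept == "Forest"])
    ```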

  4. Integrated care information technology.

    Science.gov (United States)

    Rowe, Ian; Brimacombe, Phil

    2003-02-21

    Counties Manukau District Health Board (CMDHB) uses information technology (IT) to drive its Integrated Care strategy. IT enables the sharing of relevant health information between care providers. This information sharing is critical to closing the gaps between fragmented areas of the health system. The tragic case of James Whakaruru demonstrates how people have been falling through those gaps. The starting point of the Integrated Care strategic initiative was the transmission of electronic discharges and referral status messages from CMDHB's secondary provider, South Auckland Health (SAH), to GPs in the district. Successful pilots of a Well Child system and a diabetes disease management system embracing primary and secondary providers followed this. The improved information flowing from hospital to GPs now enables GPs to provide better management for their patients. The Well Child system pilot helped improve reported immunization rates in a high health need area from 40% to 90%. The diabetes system pilot helped reduce the proportion of patients with HbA1c >9 from 47% to 16%. IT has been implemented as an integral component of an overall Integrated Care strategic initiative. Within this context, Integrated Care IT has helped to achieve significant improvements in care outcomes, broken down barriers between health system silos, and contributed to the establishment of a system of care continuum that is better for patients.

  5. FACILITATING INTEGRATED SPATIO-TEMPORAL VISUALIZATION AND ANALYSIS OF HETEROGENEOUS ARCHAEOLOGICAL AND PALAEOENVIRONMENTAL RESEARCH DATA

    Directory of Open Access Journals (Sweden)

    C. Willmes

    2012-07-01

    Full Text Available In the context of the Collaborative Research Centre 806 "Our way to Europe" (CRC806), a research database is developed for integrating data from the disciplines of archaeology, the geosciences and the cultural sciences to facilitate integrated access to heterogeneous data sources. A practice-oriented data integration concept and its implementation is presented in this contribution. The data integration approach is based on the application of Semantic Web Technology and is applied to the domains of archaeological and palaeoenvironmental data. The aim is to provide integrated spatio-temporal access to an existing wealth of data to facilitate research on the integrated data basis. For the web portal of the CRC806 research database (CRC806-Database), a number of interfaces and applications have been evaluated, developed and implemented for exposing the data to interactive analysis and visualizations.

  6. Silica-based PLC with heterogeneously-integrated PDs for one-chip DP-QPSK receiver.

    Science.gov (United States)

    Kurata, Yu; Nasu, Yusuke; Tamura, Munehisa; Kasahara, Ryoichi; Aozasa, Shinichi; Mizuno, Takayuki; Yokoyama, Haruki; Tsunashima, Satoshi; Muramoto, Yoshifumi

    2012-12-10

    To realize a DP-QPSK receiver PLC, we heterogeneously integrated eight high-speed PDs on a silica-based PLC platform with a PBS, 90-degree optical hybrids and a VOA. The use of a 2.5%-Δ waveguide reduced the receiver PLC size to 11 mm x 11 mm. We successfully demonstrated 32 Gbaud DP-QPSK signal demodulation with the receiver PLC.

  7. Effect of clonal integration on nitrogen cycling in rhizosphere of rhizomatous clonal plant, Phyllostachys bissetii, under heterogeneous light.

    Science.gov (United States)

    Li, Yang; Chen, Jing-Song; Xue, Ge; Peng, Yuanying; Song, Hui-Xing

    2018-07-01

    Clonal integration plays an important role in clonal plant adapting to heterogeneous habitats. It was postulated that clonal integration could exhibit positive effects on nitrogen cycling in the rhizosphere of clonal plant subjected to heterogeneous light conditions. An in-situ experiment was conducted using clonal fragments of Phyllostachys bissetii with two successive ramets. Shading treatments were applied to offspring or mother ramets, respectively, whereas counterparts were treated to full sunlight. Rhizomes between two successive ramets were either severed or connected. Extracellular enzyme activities and nitrogen turnover were measured, as well as soil properties. Abundance of functional genes (archaeal or bacterial amoA, nifH) in the rhizosphere of shaded, offspring or mother ramets were determined using quantitative polymerase chain reaction. Carbon or nitrogen availabilities were significantly influenced by clonal integration in the rhizosphere of shaded ramets. Clonal integration significantly increased extracellular enzyme activities and abundance of functional genes in the rhizosphere of shaded ramets. When rhizomes were connected, higher nitrogen turnover (nitrogen mineralization or nitrification rates) was exhibited in the rhizosphere of shaded offspring ramets. However, nitrogen turnover was significantly decreased by clonal integration in the rhizosphere of shaded mother ramets. Path analysis indicated that nitrogen turnover in the rhizosphere of shaded, offspring or mother ramets were primarily driven by the response of soil microorganisms to dissolved organic carbon or nitrogen. This unique in-situ experiment provided insights into the mechanism of nutrient recycling mediated by clonal integration. It was suggested that effects of clonal integration on the rhizosphere microbial processes were dependent on direction of photosynthates transport in clonal plant subjected to heterogeneous light conditions. Copyright © 2018 Elsevier B.V. All rights

  8. Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data

    Science.gov (United States)

    Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii

    2013-04-01

    Over the last decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood Risk Directive). In order to enable operational flood monitoring and assessment of flood risk, it is required to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, the integration of Grid and Sensor Web approaches is proposed [1]. Grid represents a distributed environment that integrates heterogeneous computing and storage resources administrated by multiple organizations. Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, triggering events by observed or predicted conditions, remote data access and processing capabilities to generate and deliver data products. Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding the integration of Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from integration with a Grid platform like the Globus Toolkit. The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is presented.

  9. System model the processing of heterogeneous sensory information in robotized complex

    Science.gov (United States)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems consisting of subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the uneven intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is obtained to assess the relationship between subsystem performance and the unevenness of the flows. The obtained solution is investigated over the range of parameter values of practical interest.
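
    The abstract describes a queuing-theoretic model of a sensor data processing subsystem fed by uneven flows. As a minimal illustration of that kind of analysis (not the authors' actual model), the sketch below evaluates an M/M/1 approximation of such a subsystem; the sensor and service rates are hypothetical.

        # Minimal M/M/1 sketch of a sensor-data processing subsystem under
        # uneven arrival intensity (illustrative only; rates are hypothetical).

        def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
            """Return utilization, mean queue length and mean waiting time for an M/M/1 queue."""
            if arrival_rate >= service_rate:
                raise ValueError("Unstable queue: arrival rate must be below service rate.")
            rho = arrival_rate / service_rate          # utilization
            l_q = rho ** 2 / (1.0 - rho)               # mean number waiting (Lq)
            w_q = l_q / arrival_rate                   # mean waiting time (Wq), by Little's law
            return {"utilization": rho, "mean_queue_length": l_q, "mean_wait": w_q}

        # Uneven flow: each heterogeneous sensor group contributes a different intensity.
        sensor_rates = {"lidar": 40.0, "camera": 120.0, "imu": 25.0}   # messages per second
        service_rate = 250.0                                           # messages per second

        total_arrivals = sum(sensor_rates.values())
        print(mm1_metrics(total_arrivals, service_rate))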

  10. Probabilistic XML in Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; Shim, J.; Casati, F.

    2006-01-01

    Information integration is a difficult research problem. In an ambient environment, where devices can connect and disconnect arbitrarily, the problem only increases, because data sources may become available at any time, but can also disappear. In such an environment, information integration needs

  11. Data Integration The Relational Logic Approach

    CERN Document Server

    Genesereth, Michael

    2010-01-01

    Data integration is a critical problem in our increasingly interconnected but inevitably heterogeneous world. There are numerous data sources available in organizational databases and on public information systems like the World Wide Web. Not surprisingly, the sources often use different vocabularies and different data structures, being created, as they are, by different people, at different times, for different purposes. The goal of data integration is to provide programmatic and human users with integrated access to multiple, heterogeneous data sources, giving each user the illusion of a single...

  12. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  13. SOCR data dashboard: an integrated big data archive mashing medicare, labor, census and econometric information.

    Science.gov (United States)

    Husain, Syed S; Kalinin, Alexandr; Truong, Anh; Dinov, Ivo D

    Intuitive formulation of informative and computationally efficient queries on big and complex datasets presents a number of challenges. As data collection becomes increasingly streamlined and ubiquitous, data exploration, discovery and analytics get considerably harder. Exploratory querying of heterogeneous and multi-source information is both difficult and necessary to advance our knowledge about the world around us. We developed a mechanism to integrate dispersed multi-source data and serve the mashed information via human and machine interfaces in a secure, scalable manner. This process facilitates the exploration of subtle associations between variables, population strata, or clusters of data elements, which may be opaque to standard independent inspection of the individual sources. This new platform includes a device-agnostic tool (Dashboard webapp, http://socr.umich.edu/HTML5/Dashboard/) for graphical querying, navigating and exploring the multivariate associations in complex heterogeneous datasets. The paper illustrates this core functionality and service-oriented infrastructure using healthcare data (e.g., US data from the 2010 Census, Demographic and Economic surveys, Bureau of Labor Statistics, and Center for Medicare Services) as well as Parkinson's Disease neuroimaging data. Both the back-end data archive and the front-end dashboard interfaces are continuously expanded to include additional data elements and new ways to customize the human and machine interactions. A client-side data import utility allows for easy and intuitive integration of user-supplied datasets. This completely open-science framework may be used for exploratory analytics, confirmatory analyses, meta-analyses, and education and training purposes in a wide variety of fields.
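
    As a toy illustration of mashing dispersed sources on a shared key (not the SOCR back-end itself), the following sketch joins two hypothetical county-level tables, one with census counts and one with labor statistics, into a single queryable frame; all identifiers and numbers are made up.

        import pandas as pd

        # Hypothetical county-level extracts from two different sources.
        census = pd.DataFrame({
            "county_fips": ["26161", "26163", "26165"],
            "population": [354240, 1793561, 51826],
        })
        labor = pd.DataFrame({
            "county_fips": ["26161", "26163", "26165"],
            "unemployment_rate": [3.1, 5.4, 4.2],
        })

        # Integrate on the shared key and derive a simple cross-source variable.
        merged = census.merge(labor, on="county_fips", how="inner")
        merged["unemployed_est"] = (merged["population"] * merged["unemployment_rate"] / 100).round()
        print(merged)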

  14. J-integral analysis of heterogeneous mismatched girth welds in clamped single-edge notched tension specimens

    International Nuclear Information System (INIS)

    Hertelé, Stijn; De Waele, Wim; Verstraete, Matthias; Denys, Rudi; O'Dowd, Noel

    2014-01-01

    Flaw assessment procedures require a quantification of crack driving force, and such procedures are generally based on the assumption of weld homogeneity. However, welds generally have a heterogeneous microstructure, which will influence the crack driving force. This paper describes a stress-based methodology to assess complex heterogeneous welds using a J-based approach. Clamped single-edge notched tension specimens, representative of girth weld flaws, are analyzed and the influence of weld heterogeneity on crack driving force has been determined. The use of a modified limit load for heterogeneous welds is proposed, suitable for implementation in a ‘homogenized’ J-integral estimation scheme. It follows from an explicit modification of an existing solution for centre cracked tension specimens. The proposed solution provides a good estimate of crack driving force and any errors in the approximation may be accounted for by means of a small safety factor on load bearing capacity. - Highlights: • We present a crack driving force estimation procedure for heterogeneous welds. • The procedure is based on a ‘homogenized’ version of the EPRI equation. • Complex welds are translated into equivalent idealized mismatched welds. • The procedure is validated for clamped SE(T) specimens. • A mismatch limit load for clamped SE(T) specimens is developed
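
    The abstract refers to a 'homogenized' J-integral estimation scheme built on the EPRI equation with a mismatch-modified limit load. The snippet below only sketches a generic EPRI-type decomposition J = J_el + J_pl to show where such a modified limit load would enter; the plastic influence function h1, the mismatch factor and all numbers are placeholder assumptions, not the paper's calibrated solution.

        import math

        def j_epri(P, P_L, K_I, E_prime, alpha, sigma0, eps0, ligament, h1, n):
            """Generic EPRI-type estimate J = J_el + J_pl (illustrative form only).

            P        : applied load
            P_L      : (mismatch-corrected) limit load
            K_I      : elastic stress intensity factor at load P
            E_prime  : E for plane stress, E/(1 - nu**2) for plane strain
            alpha, sigma0, eps0, n : Ramberg-Osgood parameters of the homogenized material
            ligament : remaining ligament length (W - a)
            h1       : dimensionless plastic influence function (geometry/hardening dependent)
            """
            j_el = K_I ** 2 / E_prime
            j_pl = alpha * sigma0 * eps0 * ligament * h1 * (P / P_L) ** (n + 1)
            return j_el + j_pl

        # Hypothetical numbers, purely to show how a mismatch-modified limit load enters.
        P_L_homogeneous = 500e3            # N, limit load of an all-base-metal specimen (assumed)
        mismatch_factor = 1.1              # >1 for an assumed overmatched weld
        P_L_mismatch = mismatch_factor * P_L_homogeneous

        E, nu = 210e9, 0.3
        J = j_epri(P=400e3, P_L=P_L_mismatch, K_I=60e6 * math.sqrt(0.01),
                   E_prime=E / (1 - nu ** 2), alpha=1.0, sigma0=450e6,
                   eps0=450e6 / E, ligament=0.02, h1=2.5, n=8)
        print(f"Estimated crack driving force J = {J:.1f} N/m")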

  15. Federated access to heterogeneous information resources in the Neuroscience Information Framework (NIF).

    Science.gov (United States)

    Gupta, Amarnath; Bug, William; Marenco, Luis; Qian, Xufei; Condit, Christopher; Rangarajan, Arun; Müller, Hans Michael; Miller, Perry L; Sanders, Brian; Grethe, Jeffrey S; Astakhov, Vadim; Shepherd, Gordon; Sternberg, Paul W; Martone, Maryann E

    2008-09-01

    The overarching goal of the NIF (Neuroscience Information Framework) project is to be a one-stop-shop for Neuroscience. This paper provides a technical overview of how the system is designed. The technical goal of the first version of the NIF system was to develop an information system that a neuroscientist can use to locate relevant information from a wide variety of information sources by simple keyword queries. Although the user would provide only keywords to retrieve information, the NIF system is designed to treat them as concepts whose meanings are interpreted by the system. Thus, a search for a term should find a record containing synonyms of the term. The system is targeted to find information from web pages, publications, databases, web sites built upon databases, XML documents and any other modality in which such information may be published. We have designed a system to achieve this functionality. A central element in the system is an ontology called NIFSTD (for NIF Standard) constructed by amalgamating a number of known and newly developed ontologies. NIFSTD is used by our ontology management module, called OntoQuest, to perform ontology-based search over data sources. The NIF architecture currently provides three different mechanisms for searching heterogeneous data sources including relational databases, web sites, XML documents and full text of publications. Version 1.0 of the NIF system is currently in beta test and may be accessed through http://nif.nih.gov.
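
    The NIF abstract describes treating keyword queries as concepts so that a search for a term also retrieves records containing its synonyms. A minimal sketch of that idea, using a tiny hand-written synonym map in place of the NIFSTD ontology and OntoQuest, is shown below; the terms and records are hypothetical.

        # Toy concept-based search: expand a keyword to ontology synonyms before matching.
        SYNONYMS = {  # hypothetical fragment of an ontology's synonym table
            "neuron": {"neuron", "nerve cell", "neurone"},
            "hippocampus": {"hippocampus", "ammon's horn", "cornu ammonis"},
        }

        RECORDS = [
            "Dendritic spine density in the nerve cell population of CA1",
            "Gene expression atlas of cornu ammonis subfields",
            "Glial markers in cerebellar cortex",
        ]

        def concept_search(term: str, records: list[str]) -> list[str]:
            terms = SYNONYMS.get(term.lower(), {term.lower()})
            return [r for r in records if any(t in r.lower() for t in terms)]

        print(concept_search("neuron", RECORDS))        # matches the "nerve cell" record
        print(concept_search("hippocampus", RECORDS))   # matches the "cornu ammonis" record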

  16. Integrated inventory information system

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Kunte, P.D.

    The nature of oceanographic data and the management of inventory level information are described in Integrated Inventory Information System (IIIS). It is shown how a ROSCOPO (report on observations/samples collected during oceanographic programme...

  17. Heterogeneous Integration Technology

    Science.gov (United States)

    2017-05-19

    (Figure captions from the report: an integrated CMOS imaging system for high frame rate applications [171]; CPU-DRAM Memory Landscape [127].) Thin film transistors (TFT) were integrated with GaN HEMTs on the same wafer at AFRL. The thin film transistor fabrication using metal-oxide... second layer. Layer transfer produces the best quality devices compared to other additive technologies such as re-crystallization of thin films [148...

  18. An Optimal Joint User Association and Power Allocation Algorithm for Secrecy Information Transmission in Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Rong Chai

    2017-01-01

    Full Text Available In recent years, heterogeneous radio access technologies have experienced rapid development and gradually achieved effective coordination and integration, resulting in heterogeneous networks (HetNets). In this paper, we consider the downlink secure transmission of HetNets where the information transmission from base stations (BSs) to legitimate users is subject to the interception of eavesdroppers. In particular, we stress the problem of joint user association and power allocation of the BSs. To achieve data transmission in a secure and energy efficient manner, we introduce the concept of secrecy energy efficiency, which is defined as the ratio of the secrecy transmission rate and power consumption of the BSs, and formulate the problem of joint user association and power allocation as an optimization problem which maximizes the joint secrecy energy efficiency of all the BSs under the power constraint of the BSs and the minimum data rate constraint of user equipment (UE). By equivalently transforming the optimization problem into two subproblems, that is, the power allocation subproblem and the user association subproblem of the BSs, and applying an iterative method and the Kuhn-Munkres (K-M) algorithm to solve the two subproblems, respectively, the optimal user association and power allocation strategies can be obtained. Numerical results demonstrate that the proposed algorithm outperforms previously proposed algorithms.
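
    The abstract decomposes the joint problem into a power allocation subproblem and a user association subproblem, the latter solved with the Kuhn-Munkres (K-M) algorithm. The sketch below shows only the association step, assigning UEs to BSs to maximize a utility matrix with scipy's Hungarian solver; the randomly generated utilities stand in for the paper's secrecy-energy-efficiency values.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(0)
        n_ues, n_bss = 4, 4

        # Placeholder utility[i, j]: secrecy energy efficiency if UE i is served by BS j
        # (in the paper this would come from the power-allocation subproblem).
        utility = rng.uniform(0.1, 1.0, size=(n_ues, n_bss))

        # K-M / Hungarian algorithm maximizes total utility (minimize the negated matrix).
        ue_idx, bs_idx = linear_sum_assignment(-utility)

        for ue, bs in zip(ue_idx, bs_idx):
            print(f"UE {ue} -> BS {bs} (utility {utility[ue, bs]:.3f})")
        print("Total utility of the association:", utility[ue_idx, bs_idx].sum())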

  19. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    Science.gov (United States)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression - on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing.
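
    One ingredient of the framework is Diffusion Maps, used to detect redundancy across simulation snapshots. A bare-bones NumPy construction of a diffusion-map embedding from pairwise distances is sketched below, with an arbitrarily chosen kernel scale and toy helix data standing in for CFD snapshots; it is not the authors' implementation.

        import numpy as np

        def diffusion_map(X: np.ndarray, eps: float, n_coords: int = 2) -> np.ndarray:
            """Return the leading non-trivial diffusion coordinates of the rows of X."""
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
            K = np.exp(-d2 / eps)                                  # Gaussian kernel
            P = K / K.sum(axis=1, keepdims=True)                   # row-stochastic Markov matrix
            vals, vecs = np.linalg.eig(P)
            order = np.argsort(-vals.real)                          # eigenvalues of P are real here
            vals, vecs = vals.real[order], vecs.real[:, order]
            # Skip the trivial eigenvector (eigenvalue 1); scale the rest by their eigenvalues.
            return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1]

        # Toy "snapshots": noisy samples along a one-dimensional curve in 3-D.
        t = np.linspace(0, 3 * np.pi, 200)
        X = np.c_[np.cos(t), np.sin(t), 0.1 * t]
        X += 0.01 * np.random.default_rng(1).normal(size=X.shape)
        coords = diffusion_map(X, eps=0.5)
        print(coords.shape)   # (200, 2): low-dimensional parametrization of the snapshots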

  20. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  1. Integration of IEEE 1451 and HL7 exchanging information for patients' sensor data.

    Science.gov (United States)

    Kim, Wooshik; Lim, Suyoung; Ahn, Jinsoo; Nah, Jiyoung; Kim, Namhyun

    2010-12-01

    HL7 (Health Level 7) is a standard developed for exchanging incompatible healthcare information generated from programs or devices among heterogeneous medical information systems. At present, HL7 is growing into a global standard. However, the HL7 standard does not support effective methods for treating data from various medical sensors, especially from mobile sensors. As ubiquitous systems grow, HL7 must communicate with various medical transducers. In the area of sensor fields, IEEE 1451 is a group of standards for controlling transducers and for communicating data from/to various transducers. In this paper, we present the possibility of interoperability between the two standards, i.e., HL7 and IEEE 1451. We then present a method to integrate them and show the preliminary results of this approach.

  2. Integration of top-down and bottom-up information for audio organization and retrieval

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand

    The increasing availability of digital audio and music calls for methods and systems to analyse and organize these digital objects. This thesis investigates three elements related to such systems focusing on the ability to represent and elicit the user's view on the multimedia object and the system...... output. The aim is to provide organization and processing, which aligns with the understanding and needs of the users. Audio and music are often characterized by the large amount of heterogeneous information. The first aspect investigated is the integration of such multi-variate and multi-modal information...... (indirect scaling). Inference is performed by analytical and simulation based methods, including the Laplace approximation and expectation propagation. In order to minimize the cost of the often expensive and lengthy experimentation, sequential experiment design or active learning is supported. The setup...

  3. Computer Aided Prototyping System (CAPS) for Heterogeneous Systems Development and Integration

    OpenAIRE

    Luqi; Berzins, V.; Shing, M.; Nada, N.; Eagle, C.

    2000-01-01

    2000 Command and Control Research and Technology Symposium (CCRTS), June 11-13, 2000, Naval Postgraduate School, Monterey, CA This paper addresses the problem of how to produce reliable software that is also flexible and cost effective for the DoD distributed software domain. DoD software systems fall into two categories: information systems and war fighter systems. Both types of systems can be distributed, heterogeneous and network-based, consisting of a set of components running...

  4. Heterogeneous integration of lithium niobate and silicon nitride waveguides for wafer-scale photonic integrated circuits on silicon.

    Science.gov (United States)

    Chang, Lin; Pfeiffer, Martin H P; Volet, Nicolas; Zervas, Michael; Peters, Jon D; Manganelli, Costanza L; Stanton, Eric J; Li, Yifei; Kippenberg, Tobias J; Bowers, John E

    2017-02-15

    An ideal photonic integrated circuit for nonlinear photonic applications requires high optical nonlinearities and low loss. This work demonstrates a heterogeneous platform by bonding lithium niobate (LN) thin films onto a silicon nitride (Si3N4) waveguide layer on silicon. It not only provides large second- and third-order nonlinear coefficients, but also shows low propagation loss in both the Si3N4 and the LN-Si3N4 waveguides. The tapers enable low-loss-mode transitions between these two waveguides. This platform is essential for various on-chip applications, e.g., modulators, frequency conversions, and quantum communications.

  5. Integrated Compliance Information System (ICIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The purpose of ICIS is to meet evolving Enforcement and Compliance business needs for EPA and State users by integrating information into a single integrated data...

  6. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications), and easy system maintenance (excellent documentation, easy to perform changes, and centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the `middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  7. Heterogenous integration of a thin-film GaAs photodetector and a microfluidic device on a silicon substrate

    International Nuclear Information System (INIS)

    Song, Fuchuan; Xiao, Jing; Udawala, Fidaali; Seo, Sang-Woo

    2011-01-01

    In this paper, heterogeneous integration of a III–V semiconductor thin-film photodetector (PD) with a microfluidic device is demonstrated on a SiO2–Si substrate. Thin-film format of optical devices provides an intimate integration of optical functions with microfluidic devices. As a demonstration of a multi-material and functional system, the biphasic flow structure in the polymeric microfluidic channels was co-integrated with a III–V semiconductor thin-film PD. The fluorescent drops formed in the microfluidic device are successfully detected with an integrated thin-film PD on a silicon substrate. The proposed three-dimensional integration structure is an alternative approach to combine optical functions with microfluidic functions on silicon-based electronic functions.

  8. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time consuming job. Automating this process would assist the user in its task. Integration

  9. Integrated information theory of consciousness: an updated account.

    Science.gov (United States)

    Tononi, G

    2012-12-01

    This article presents an updated account of integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of

  10. Regional Logistics Information Resources Integration Patterns and Countermeasures

    Science.gov (United States)

    Wu, Hui; Shangguan, Xu-ming

    Effective integration of regional logistics information resources can provide collaborative services in information flow, business flow and logistics for regional logistics enterprises, which can also reduce operating costs and improve market responsiveness. First, this paper analyzes the practical significance of integrating regional logistics information. Second, it puts forward three feasible patterns for integrating regional logistics information resources; these three models have their own strengths and scopes of application and implementation, and which model is selected will depend on the specific business and the regional distribution of enterprises. Last, the paper discusses the related countermeasures for integrating regional logistics information resources: because such integration is a systems engineering effort, the countermeasures should, as the integration advances, pay close attention to the current needs and long-term development of regional enterprises.

  11. Rapidly reconfigurable high-fidelity optical arbitrary waveform generation in heterogeneous photonic integrated circuits.

    Science.gov (United States)

    Feng, Shaoqi; Qin, Chuan; Shang, Kuanping; Pathak, Shibnath; Lai, Weicheng; Guan, Binbin; Clements, Matthew; Su, Tiehui; Liu, Guangyao; Lu, Hongbo; Scott, Ryan P; Ben Yoo, S J

    2017-04-17

    This paper demonstrates rapidly reconfigurable, high-fidelity optical arbitrary waveform generation (OAWG) in a heterogeneous photonic integrated circuit (PIC). The heterogeneous PIC combines advantages of high-speed indium phosphide (InP) modulators and low-loss, high-contrast silicon nitride (Si3N4) arrayed waveguide gratings (AWGs) so that high-fidelity optical waveform syntheses with rapid waveform updates are possible. The generated optical waveforms spanned a 160 GHz spectral bandwidth starting from an optical frequency comb consisting of eight comb lines separated by 20 GHz channel spacing. The Error Vector Magnitude (EVM) values of the generated waveforms were approximately 16.4%. The OAWG module can rapidly and arbitrarily reconfigure waveforms upon every pulse arriving at 2 ns repetition time. The result of this work indicates the feasibility of truly dynamic optical arbitrary waveform generation where the reconfiguration rate or the modulator bandwidth must exceed the channel spacing of the AWG and the optical frequency comb.
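
    The fidelity of the generated waveforms is reported as an Error Vector Magnitude of about 16.4%. As a reminder of what that figure of merit measures, the sketch below computes EVM for a toy set of received versus ideal constellation points; the QPSK symbols and noise level are synthetic, not the paper's measurement.

        import numpy as np

        def evm_percent(received: np.ndarray, ideal: np.ndarray) -> float:
            """EVM (%) = RMS error vector magnitude normalized by the RMS ideal magnitude."""
            err = received - ideal
            return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(ideal) ** 2))

        rng = np.random.default_rng(2)
        qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
        ideal = rng.choice(qpsk, size=1000)                       # transmitted symbols
        noise = 0.12 * (rng.normal(size=1000) + 1j * rng.normal(size=1000))
        received = ideal + noise                                  # toy received constellation
        print(f"EVM = {evm_percent(received, ideal):.1f} %")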

  12. Integrating non-colocated well and geophysical data to capture subsurface heterogeneity at an aquifer recharge and recovery site

    Science.gov (United States)

    Gottschalk, Ian P.; Hermans, Thomas; Knight, Rosemary; Caers, Jef; Cameron, David A.; Regnery, Julia; McCray, John E.

    2017-12-01

    Geophysical data have proven to be very useful for lithological characterization. However, quantitatively integrating the information gained from acquiring geophysical data generally requires colocated lithological and geophysical data for constructing a rock-physics relationship. In this contribution, the issue of integrating noncolocated geophysical and lithological data is addressed, and the results are applied to simulate groundwater flow in a heterogeneous aquifer in the Prairie Waters Project North Campus aquifer recharge site, Colorado. Two methods of constructing a rock-physics transform between electrical resistivity tomography (ERT) data and lithology measurements are assessed. In the first approach, a maximum likelihood estimation (MLE) is used to fit a bimodal lognormal distribution to horizontal cross-sections of the ERT resistivity histogram. In the second approach, a spatial bootstrap is applied to approximate the rock-physics relationship. The rock-physics transforms provide soft data for multiple point statistics (MPS) simulations. Subsurface models are used to run groundwater flow and tracer test simulations. Each model's uncalibrated, predicted breakthrough time is evaluated based on its agreement with measured subsurface travel time values from infiltration basins to selected groundwater recovery wells. We find that incorporating geophysical information into uncalibrated flow models reduces the difference with observed values, as compared to flow models without geophysical information incorporated. The integration of geophysical data also narrows the variance of predicted tracer breakthrough times substantially. Accuracy is highest and variance is lowest in breakthrough predictions generated by the MLE-based rock-physics transform. Calibrating the ensemble of geophysically constrained models would help produce a suite of realistic flow models for predictive purposes at the site. We find that the success of breakthrough predictions is highly
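
    One of the two rock-physics transforms is built by fitting a bimodal lognormal distribution to resistivity values. A compact stand-in for that step, fitting a two-component Gaussian mixture to log-resistivity with scikit-learn and reading the posterior component probability as a soft lithology probability, is sketched below; the resistivities are synthetic, not the Prairie Waters data.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)
        # Synthetic resistivities (ohm-m): low- and high-resistivity populations,
        # i.e., a bimodal lognormal mixture.
        res = np.concatenate([rng.lognormal(mean=3.0, sigma=0.3, size=400),
                              rng.lognormal(mean=4.5, sigma=0.25, size=250)])

        # A lognormal mixture in resistivity is a Gaussian mixture in log-resistivity.
        gm = GaussianMixture(n_components=2, random_state=0).fit(np.log(res).reshape(-1, 1))

        # Soft rock-physics transform: P(high-resistivity facies | observed resistivity).
        high_component = int(np.argmax(gm.means_.ravel()))
        query = np.log(np.array([[60.0], [120.0]]))      # example resistivity observations
        print(gm.predict_proba(query)[:, high_component])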

  13. Quasi-linear score for capturing heterogeneous structure in biomarkers.

    Science.gov (United States)

    Omae, Katsuhiro; Komori, Osamu; Eguchi, Shinto

    2017-06-19

    Linear scores are widely used to predict dichotomous outcomes in biomedical studies because of their learnability and understandability. Such approaches, however, cannot be used to elucidate biodiversity when there is heterogeneous structure in the target population. Our study was focused on describing intrinsic heterogeneity in predictions. Because heterogeneity can be captured by a clustering method, integrating different information from different clusters should yield better predictions. Accordingly, we developed a quasi-linear score, which effectively combines the linear scores of clustered markers. We extended the linear score to the quasi-linear score by a generalized average form, the Kolmogorov-Nagumo average. We observed that two shrinkage methods worked well: ridge shrinkage for estimating the quasi-linear score, and lasso shrinkage for selecting markers within each cluster. Simulation studies and applications to real data show that the proposed method has good predictive performance compared with existing methods. Heterogeneous structure is captured by a clustering method. Quasi-linear scores combine such heterogeneity and have a better predictive ability compared with linear scores.
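
    The quasi-linear score combines cluster-wise linear scores through a Kolmogorov-Nagumo (generalized) average, f^{-1}(mean_k f(score_k)). The sketch below shows that combination rule with an exponential generator, so the average interpolates between the arithmetic mean (tau near 0) and the maximum (large tau) of the cluster scores; the generator choice and per-cluster scores are assumptions for illustration, not the paper's fitted model.

        import numpy as np

        def quasi_linear_score(cluster_scores: np.ndarray, tau: float) -> np.ndarray:
            """Kolmogorov-Nagumo average f^{-1}(mean_k f(s_k)) with generator f(u) = exp(tau * u)."""
            m = cluster_scores.max(axis=1, keepdims=True)          # for numerical stability
            lme = m + np.log(np.mean(np.exp(tau * (cluster_scores - m)),
                                     axis=1, keepdims=True)) / tau
            return lme.ravel()

        # Hypothetical linear scores (e.g., X_k @ beta_k) for two subjects and three marker clusters.
        scores = np.array([[0.2, 1.5, -0.3],
                           [0.1, 0.0,  0.2]])
        for tau in (1e-6, 1.0, 5.0):
            print(tau, quasi_linear_score(scores, tau))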

  14. Vertically Integrated Models for Carbon Storage Modeling in Heterogeneous Domains

    Science.gov (United States)

    Bandilla, K.; Celia, M. A.

    2017-12-01

    Numerical modeling is an essential tool for studying the impacts of geologic carbon storage (GCS). Injection of carbon dioxide (CO2) into deep saline aquifers leads to multi-phase flow (injected CO2 and resident brine), which can be described by a set of three-dimensional governing equations, including mass-balance equation, volumetric flux equations (modified Darcy), and constitutive equations. This is the modeling approach on which commonly used reservoir simulators such as TOUGH2 are based. Due to the large density difference between CO2 and brine, GCS models can often be simplified by assuming buoyant segregation and integrating the three-dimensional governing equations in the vertical direction. The integration leads to a set of two-dimensional equations coupled with reconstruction operators for vertical profiles of saturation and pressure. Vertically-integrated approaches have been shown to give results of comparable quality as three-dimensional reservoir simulators when applied to realistic CO2 injection sites such as the upper sand wedge at the Sleipner site. However, vertically-integrated approaches usually rely on homogeneous properties over the thickness of a geologic layer. Here, we investigate the impact of general (vertical and horizontal) heterogeneity in intrinsic permeability, relative permeability functions, and capillary pressure functions. We consider formations involving complex fluvial deposition environments and compare the performance of vertically-integrated models to full three-dimensional models for a set of hypothetical test cases consisting of high permeability channels (streams) embedded in a low permeability background (floodplains). The domains are randomly generated assuming that stream channels can be represented by sinusoidal waves in the plan-view and by parabolas for the streams' cross-sections. Stream parameters such as width, thickness and wavelength are based on values found at the Ketzin site in Germany. Results from the

  15. Use of positioning information for performance enhancement of uncoordinated heterogeneous network deployment

    DEFF Research Database (Denmark)

    Semov, Plamen T.; Mihovska, Albena D.; Prasad, Ramjee

    2013-01-01

    This paper proposes a novel algorithm for dynamic physical resource allocation based on the use of positioning information during carrier aggregation (CA) in a semi- and uncoordinated deployment of heterogeneous networks (HetNet). The algorithm uses the known Q-learning method, enhanced with information about the locations of the mobile users and neighboring cells, to solve the problem of dynamic physical resource assignment in an uncoordinated scenario while accounting for improved allocation and scheduling. The results are compared to the performance when known scheduling algorithms are employed and show increased cell throughput, while maintaining an adequate user throughput when employing Q-learning with positioning information.

  16. Heterogeneous treatment in the variational nodal method

    International Nuclear Information System (INIS)

    Fanning, T.H.

    1995-01-01

    The variational nodal transport method is reduced to its diffusion form and generalized for the treatment of heterogeneous nodes while maintaining nodal balances. Adapting variational methods to heterogeneous nodes requires the ability to integrate over a node with discontinuous cross sections. In this work, integrals are evaluated using composite gaussian quadrature rules, which permit accurate integration while minimizing computing time. Allowing structure within a nodal solution scheme avoids some of the necessity of cross section homogenization, and more accurately defines the intra-nodal flux shape. Ideally, any desired heterogeneity can be constructed within the node; but in reality, the finite set of basis functions limits the practical resolution to which fine detail can be defined within the node. Preliminary comparison tests show that the heterogeneous variational nodal method provides satisfactory results even if some improvements are needed for very difficult configurations.

  17. Integrated Information Management (IIM)

    National Research Council Canada - National Science Library

    McIlvain, Jason

    2007-01-01

    Information Technology is the core capability required to align our resources and increase our effectiveness on the battlefield by integrating and coordinating our preventative measures and responses...

  18. Developmental Anatomy Ontology of Zebrafish: an Integrative semantic framework

    Directory of Open Access Journals (Sweden)

    Belmamoune Mounia

    2007-12-01

    Full Text Available Integration of information is quintessential to make use of the wealth of bioinformatics resources. One aspect of integration is to make databases interoperable through well-annotated information. With new databases one strives to store complementary information, and this results in collections of heterogeneous information systems. Concepts in these databases need to be connected, and ontologies typically provide a common terminology to share information among different resources.

  19. A Parallel Strategy for Convolutional Neural Network Based on Heterogeneous Cluster for Mobile Information System

    Directory of Open Access Journals (Sweden)

    Jilin Zhang

    2017-01-01

    Full Text Available With the development of mobile systems, we gain many benefits and much convenience by leveraging mobile devices; at the same time, the information gathered by smartphones, such as location and environment, is also valuable for businesses to provide more intelligent services to customers. More and more machine learning methods, especially convolutional neural networks, have been used in the field of mobile information systems to study user behavior and classify usage patterns. With the increase of model training parameters and data scale, the traditional single-machine training method cannot meet the time-complexity requirements of practical application scenarios. Current training frameworks often use simple data-parallel or model-parallel methods to speed up the training process, so heterogeneous computing resources are not fully utilized. To solve these problems, our paper proposes a delay-synchronization convolutional neural network parallel strategy, which leverages the heterogeneous system. The strategy is based on both synchronous and asynchronous parallel approaches; the model training process can reduce its dependence on the heterogeneous architecture on the premise of ensuring model convergence, so the convolutional neural network framework is more adaptive to different heterogeneous system environments. The experimental results show that the proposed delay-synchronization strategy can achieve at least a three-fold speedup compared to traditional data parallelism.

  20. Use of integrated geologic and geophysical information for characterizing the structure of fracture systems at the US/BK Site, Grimsel Laboratory, Switzerland

    International Nuclear Information System (INIS)

    Martel, S.J.; Peterson, J.E. Jr.

    1990-05-01

    Fracture systems form the primary fluid flow paths in a number of rock types, including some of those being considered for high level nuclear waste repositories. In some cases, flow along fractures must be modeled explicitly as part of a site characterization effort. Fractures commonly are concentrated in fracture zones, and even where fractures are seemingly ubiquitous, the hydrology of a site can be dominated by a few discrete fracture zones. We have implemented a site characterization methodology that combines information gained from geophysical and geologic investigations. The general philosophy is to identify and locate the major fracture zones, and then to characterize their systematics. Characterizing the systematics means establishing the essential and recurring patterns in which fractures are organized within the zones. We make a concerted effort to use information on the systematics of the fracture systems to link the site-specific geologic, borehole and geophysical information. This report illustrates how geologic and geophysical information on geologic heterogeneities can be integrated to guide the development of hydrologic models. The report focuses on fractures, a particularly common type of geologic heterogeneity. However, many aspects of the methodology we present can be applied to other geologic heterogeneities as well. 57 refs., 40 figs., 1 tab

  1. Integrated Reporting Information System -

    Data.gov (United States)

    Department of Transportation — The Integrated Reporting Information System (IRIS) is a flexible and scalable web-based system that supports post operational analysis and evaluation of the National...

  2. Integration of Information Technologies in Enterprise Application Development

    Directory of Open Access Journals (Sweden)

    Iulia SURUGIU

    2012-05-01

    Full Text Available Healthcare enterprises are disconnected. In the era of integrated information systems and the Internet explosion, the necessity of information systems integration arises from business process evolution, on the one hand, and from information technology tendencies, on the other hand. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied with business process automation, flexibility and complexity. The need for information systems integration arises from these goals, explaining, at the same time, the special interest in EAI. Extensible software integration architectures and business orientation of process modeling and information systems functionalities, as well as open connectivity, accessibility and virtualization, lead to the most suitable integration solutions: SOA and BPM architectural styles in a cloud computing environment.

  3. Pointwise mutual information quantifies intratumor heterogeneity in tissue sections labeled with multiple fluorescent biomarkers

    Directory of Open Access Journals (Sweden)

    Daniel M Spagnolo

    2016-01-01

    Full Text Available Background: Measures of spatial intratumor heterogeneity are potentially important diagnostic biomarkers for cancer progression, proliferation, and response to therapy. Spatial relationships among cells including cancer and stromal cells in the tumor microenvironment (TME) are key contributors to heterogeneity. Methods: We demonstrate how to quantify spatial heterogeneity from immunofluorescence pathology samples, using a set of 3 basic breast cancer biomarkers as a test case. We learn a set of dominant biomarker intensity patterns and map the spatial distribution of the biomarker patterns with a network. We then describe the pairwise association statistics for each pattern within the network using pointwise mutual information (PMI) and visually represent heterogeneity with a two-dimensional map. Results: We found a salient set of 8 biomarker patterns to describe cellular phenotypes from a tissue microarray cohort containing 4 different breast cancer subtypes. After computing PMI for each pair of biomarker patterns in each patient and tumor replicate, we visualize the interactions that contribute to the resulting association statistics. Then, we demonstrate the potential for using PMI as a diagnostic biomarker, by comparing PMI maps and heterogeneity scores from patients across the 4 different cancer subtypes. Estrogen receptor positive invasive lobular carcinoma patient, AL13-6, exhibited the highest heterogeneity score among those tested, while estrogen receptor negative invasive ductal carcinoma patient, AL13-14, exhibited the lowest heterogeneity score. Conclusions: This paper presents an approach for describing intratumor heterogeneity, in a quantitative fashion (via PMI), which departs from the purely qualitative approaches currently used in the clinic. PMI is generalizable to highly multiplexed/hyperplexed immunofluorescence images, as well as spatial data from complementary in situ methods including FISSEQ and CyTOF, sampling many different
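
    Pointwise mutual information between co-occurring biomarker patterns is the core statistic of the method. A minimal computation of PMI from a co-occurrence count table is sketched below, with a toy 3x3 matrix standing in for the learned biomarker patterns.

        import numpy as np

        def pmi(counts: np.ndarray) -> np.ndarray:
            """Pointwise mutual information log2[ p(x, y) / (p(x) p(y)) ] from a co-occurrence table."""
            joint = counts / counts.sum()
            px = joint.sum(axis=1, keepdims=True)
            py = joint.sum(axis=0, keepdims=True)
            with np.errstate(divide="ignore"):
                return np.log2(joint / (px * py))

        # Toy co-occurrence counts of three biomarker patterns between neighboring cells.
        counts = np.array([[30,  5,  2],
                           [ 5, 40, 10],
                           [ 2, 10, 16]])
        # Positive entries: patterns co-occur more often than expected by chance.
        print(np.round(pmi(counts), 2))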

  4. Distributed Input and State Estimation Using Local Information in Heterogeneous Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dzung Tran

    2017-07-01

    Full Text Available A new distributed input and state estimation architecture is introduced and analyzed for heterogeneous sensor networks. Specifically, nodes of a given sensor network are allowed to have heterogeneous information roles in the sense that a subset of nodes can be active (that is, subject to observations of a process of interest) and the rest can be passive (that is, subject to no observation). Both fixed and varying active and passive roles of sensor nodes in the network are investigated. In addition, these nodes are allowed to have non-identical sensor modalities under the common underlying assumption that they have complementary properties distributed over the sensor network to achieve collective observability. The key feature of our framework is that it utilizes local information not only during the execution of the proposed distributed input and state estimation architecture but also in its design, in that global uniform ultimate boundedness of the error dynamics is guaranteed once each node satisfies given local stability conditions independent of the graph topology and neighboring information of these nodes. As a special case (e.g., when all nodes are active and a positive real condition is satisfied), asymptotic stability can be achieved with our algorithm. Several illustrative numerical examples are further provided to demonstrate the efficacy of the proposed architecture.

  5. Integration of Information Technologies in Enterprise Application Development

    OpenAIRE

    Iulia SURUGIU

    2012-01-01

    Healthcare enterprises are disconnected. In the era of integrated information systems and Internet explosion, the necessity of information systems integration reside from business process evolution, on the one hand, and from information technology tendencies, on the other hand. In order to become more efficient and adaptive to change, healthcare organizations are tremendously preoccupied of business process automation, flexibility and complexity. The need of information systems integration ar...

  6. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
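
    The study integrates 2D radiometric/textural, 2.5D topographic, and 3D geometric features into one classifier. The sketch below shows only the bare mechanics of that feature-level fusion, concatenating hypothetical per-segment feature blocks and training a random forest; the feature values, label rule, and dimensions are placeholders, not the Kigali dataset.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n_segments = 300

        # Hypothetical per-segment feature blocks from the orthomosaic, DSM and point cloud.
        f2d  = rng.normal(size=(n_segments, 5))   # radiometric / textural
        f25d = rng.normal(size=(n_segments, 3))   # topographic (e.g., height above terrain, slope)
        f3d  = rng.normal(size=(n_segments, 4))   # geometric (e.g., planarity, verticality)

        # Toy label rule standing in for building vs. terrain annotations.
        labels = (f25d[:, 0] + 0.5 * f3d[:, 0] > 0).astype(int)

        X = np.hstack([f2d, f25d, f3d])           # integration = feature-level fusion
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())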

  7. Heterogeneous Economic Integration Agreement Effects

    OpenAIRE

    Baier, Scott L.; Bergstrand, Jeffrey H.; Clance, Matthew W.

    2015-01-01

    Gravity equations have been used for more than 50 years to estimate ex post the partial effects of trade costs on international trade flows, and the well-known - and traditionally presumed exogenous – "trade-cost elasticity" plays a central role in computing general equilibrium trade-flow and welfare effects of trade-cost changes. This paper addresses theoretically and empirically the influence of variable and fixed export costs in explaining the likely heterogeneity in the trade-cost elast...

  8. InP-DHBT-on-BiCMOS technology with fT/fmax of 400/350 GHz for heterogeneous integrated millimeter-wave sources

    DEFF Research Database (Denmark)

    Kraemer, Tomas; Ostermay, Ina; Jensen, Thomas

    2013-01-01

    This paper presents a novel InP-SiGe BiCMOS technology using wafer-scale heterogeneous integration. The vertical stacking of the InP double heterojunction bipolar transistor (DHBT) circuitry directly on top of the BiCMOS wafer enables ultra-broadband interconnects with bandwidths of up to 100 GHz. The 0.8 × 5 μm2 InP DHBTs show fT/fmax of 400/350 GHz with an output power of more than 26 mW at 96 GHz. These are record values for a heterogeneously integrated transistor on silicon. As a circuit example, a 164-GHz signal source is presented. It features a voltage-controlled oscillator in BiCMOS.

  9. Convergence to consensus in heterogeneous groups and the emergence of informal leadership.

    Science.gov (United States)

    Gavrilets, Sergey; Auerbach, Jeremy; van Vugt, Mark

    2016-07-14

    When group cohesion is essential, groups must have efficient strategies in place for consensus decision-making. Recent theoretical work suggests that shared decision-making is often the most efficient way for dealing with both information uncertainty and individual variation in preferences. However, some animal and most human groups make collective decisions through particular individuals, leaders, that have a disproportionate influence on group decision-making. To address this discrepancy between theory and data, we study a simple, but general, model that explicitly focuses on the dynamics of consensus building in groups composed of individuals who are heterogeneous in preferences, certain personality traits (agreeability and persuasiveness), reputation, and social networks. We show that within-group heterogeneity can significantly delay democratic consensus building as well as give rise to the emergence of informal leaders, i.e., individuals with a disproportionately large impact on group decisions. Our results thus imply strong benefits of leadership particularly when groups experience time pressure and significant conflict of interest between members (due to various between-individual differences). Overall, our models shed light on why leadership and decision-making hierarchies are widespread, especially in human groups.
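
    The model tracks how heterogeneous individuals converge on a common opinion, with more persuasive members pulling the outcome toward their own preference. A stripped-down, DeGroot-style weighted-averaging sketch of that dynamic is given below; it is not the authors' full model (which also involves agreeability, reputation and network structure), and all parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 8
        preference = rng.uniform(0, 1, size=n)            # initial individual preferences
        persuasiveness = rng.uniform(0.5, 3.0, size=n)     # heterogeneous influence weights

        opinion = preference.copy()
        for step in range(200):
            # Each member moves toward the persuasiveness-weighted group mean.
            group_mean = np.average(opinion, weights=persuasiveness)
            opinion += 0.2 * (group_mean - opinion)
            if np.ptp(opinion) < 1e-6:                     # consensus reached
                break

        leader = int(np.argmax(persuasiveness))
        print(f"Consensus after {step + 1} steps at {opinion.mean():.3f}; "
              f"most persuasive member started at {preference[leader]:.3f}")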

  10. A Reconfigurable Readout Integrated Circuit for Heterogeneous Display-Based Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Kyeonghwan Park

    2017-04-01

    Full Text Available This paper presents a reconfigurable multi-sensor interface and its readout integrated circuit (ROIC) for display-based multi-sensor systems, which builds up multi-sensor functions by utilizing touch screen panels. In addition to inherent touch detection, physiological and environmental sensor interfaces are incorporated. The reconfigurable feature is effectively implemented by proposing two basis readout topologies of amplifier-based and oscillator-based circuits. For noise-immune design against various noises from inherent human-touch operations, an alternate-sampling error-correction scheme is proposed and integrated inside the ROIC, achieving a 12-bit resolution of successive approximation register (SAR) analog-to-digital conversion without additional calibrations. A ROIC prototype that includes the whole proposed functions and data converters was fabricated in a 0.18 μm complementary metal oxide semiconductor (CMOS) process, and its feasibility was experimentally verified to support multiple heterogeneous sensing functions of touch, electrocardiogram, body impedance, and environmental sensors.

  11. On the area spectral efficiency improvement of heterogeneous network by exploiting the integration of macro-femto cellular networks

    KAUST Repository

    Shakir, Muhammad; Alouini, Mohamed-Slim

    2012-01-01

    In this paper, we consider a heterogeneous network where we complement the macrocell network with low-power, low-cost user-deployed nodes, such as femtocell base stations, to increase the mean achievable capacity of the system. In this context, we integrate macro

  12. kpath: integration of metabolic pathway linked data.

    Science.gov (United States)

    Navas-Delgado, Ismael; García-Godoy, María Jesús; López-Camacho, Esteban; Rybinski, Maciej; Reyes-Palomares, Armando; Medina, Miguel Ángel; Aldana-Montes, José F

    2015-01-01

    In the last few years, the Life Sciences domain has experienced a rapid growth in the amount of available biological databases. The heterogeneity of these databases makes data integration a challenging issue. Some integration challenges are locating resources, relationships, data formats, synonyms or ambiguity. The Linked Data approach partially solves the heterogeneity problems by introducing a uniform data representation model. Linked Data refers to a set of best practices for publishing and connecting structured data on the Web. This article introduces kpath, a database that integrates information related to metabolic pathways. kpath also provides a navigational interface that enables not only the browsing, but also the deep use of the integrated data to build metabolic networks based on existing disperse knowledge. This user interface has been used to showcase relationships that can be inferred from the information available in several public databases. © The Author(s) 2015. Published by Oxford University Press.

  13. Vertical and lateral heterogeneous integration

    Science.gov (United States)

    Geske, Jon; Okuno, Yae L.; Bowers, John E.; Jayaraman, Vijay

    2001-09-01

    A technique for achieving large-scale monolithic integration of lattice-mismatched materials in the vertical direction and the lateral integration of dissimilar lattice-matched structures has been developed. The technique uses a single nonplanar direct-wafer-bond step to transform vertically integrated epitaxial structures into lateral epitaxial variation across the surface of a wafer. Nonplanar wafer bonding is demonstrated by integrating four different unstrained multi-quantum-well active regions lattice matched to InP on a GaAs wafer surface. Microscopy is used to verify the quality of the bonded interface, and photoluminescence is used to verify that the bonding process does not degrade the optical quality of the laterally integrated wells. The authors propose this technique as a means to achieve greater levels of wafer-scale integration in optical, electrical, and micromechanical devices.

  14. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness. IIT proposes that to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of $\Phi$ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of $\Phi$ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of $\Phi$ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure $\Phi$ in large systems within a practical amount of time.
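
    As an illustration of the combinatorial cost described above, the following minimal Python sketch enumerates every bipartition of a small system and returns the one minimizing a user-supplied measure. The function name and the toy measure are illustrative only; this is the brute-force baseline, not the paper's $\Phi$ or its submodularity-based search.

      from itertools import combinations

      def minimum_information_partition(nodes, measure):
          """Exhaustively search all bipartitions of `nodes` and return the one
          minimizing the supplied measure(part_a, part_b). This brute-force
          search is what makes finding the MIP exponentially costly."""
          nodes = list(nodes)
          best_partition, best_value = None, float("inf")
          rest = nodes[1:]
          # Fixing the first node in part_a avoids counting each bipartition twice.
          for k in range(len(rest) + 1):
              for combo in combinations(rest, k):
                  part_a = [nodes[0], *combo]
                  part_b = [n for n in nodes if n not in part_a]
                  if not part_b:
                      continue
                  value = measure(part_a, part_b)
                  if value < best_value:
                      best_partition, best_value = (part_a, part_b), value
          return best_partition, best_value

      # Toy run: pretend the measure is just the size imbalance of the two parts.
      print(minimum_information_partition(range(4), lambda a, b: abs(len(a) - len(b))))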

  15. Climate forcing and infectious disease transmission in urban landscapes: integrating demographic and socioeconomic heterogeneity.

    Science.gov (United States)

    Santos-Vega, Mauricio; Martinez, Pamela P; Pascual, Mercedes

    2016-10-01

    Urbanization and climate change are the two major environmental challenges of the 21st century. The dramatic expansion of cities around the world creates new conditions for the spread, surveillance, and control of infectious diseases. In particular, urban growth generates pronounced spatial heterogeneity within cities, which can modulate the effect of climate factors at local spatial scales in large urban environments. Importantly, the interaction between environmental forcing and socioeconomic heterogeneity at local scales remains an open area in infectious disease dynamics, especially for urban landscapes of the developing world. A quantitative and conceptual framework on urban health with a focus on infectious diseases would benefit from integrating aspects of climate forcing, population density, and level of wealth. In this paper, we review what is known about these drivers acting independently and jointly on urban infectious diseases; we then outline elements that are missing and would contribute to building such a framework. © 2016 New York Academy of Sciences.

  16. Spatial Preference Heterogeneity for Integrated River Basin Management: The Case of the Shiyang River Basin, China

    Directory of Open Access Journals (Sweden)

    Fanus Asefaw Aregay

    2016-09-01

    Full Text Available Integrated river basin management (IRBM) programs have been launched in most parts of China to ease escalating environmental degradation. Meanwhile, little is known about the benefits from and the support for these programs. This paper presents a case study of the preference heterogeneity for IRBM in the Shiyang River Basin, China, as measured by the willingness to pay (WTP) for a set of major restoration attributes. A discrete choice analysis of relevant restoration attributes was conducted. The results, based on a sample of 1012 households in the whole basin, show that, on average, there is significant support for integrated ecological restoration, as indicated by significant WTP for all ecological attributes. However, preference heterogeneity induced by residential location is prevalent. Generally, compared to upper-basin residents, middle sub-basin residents have a lower mean WTP while lower sub-basin residents express a higher mean WTP. The disparity in utility is partially explained by differences in the ecological and socio-economic status of the residents. In conclusion, estimating the welfare benefit of IRBM projects based on sample responses from a specific sub-section of the basin only may either understate or overstate the welfare estimate.
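
    For readers unfamiliar with how WTP figures come out of a discrete choice analysis, the sketch below applies the standard conditional-logit ratio rule: marginal WTP is the negative ratio of an attribute coefficient to the cost coefficient. The sub-basin grouping echoes the study, but every coefficient value and attribute name is invented for illustration.

      # Hypothetical conditional-logit coefficients by sub-basin; the attribute
      # names and all values are illustrative, not taken from the Shiyang study.
      coefficients = {
          "upper":  {"water_quality": 0.42, "vegetation": 0.31, "cost": -0.012},
          "middle": {"water_quality": 0.28, "vegetation": 0.19, "cost": -0.015},
          "lower":  {"water_quality": 0.55, "vegetation": 0.40, "cost": -0.010},
      }

      def marginal_wtp(beta_attribute, beta_cost):
          """Marginal willingness to pay = -(attribute coefficient / cost coefficient)."""
          return -beta_attribute / beta_cost

      for basin, betas in coefficients.items():
          wtp = {attr: round(marginal_wtp(b, betas["cost"]), 1)
                 for attr, b in betas.items() if attr != "cost"}
          print(basin, wtp)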

  17. Heterogeneous Embedded Real-Time Systems Environment

    Science.gov (United States)

    2003-12-01

    AFRL-IF-RS-TR-2003-290, Final Technical Report, December 2003: Heterogeneous Embedded Real-Time Systems Environment. Authors: Cosmo Castellano and James Graham. Contract number: F30602-97-C-0259.

  18. Intratumor heterogeneity alters most effective drugs in designed combinations.

    Science.gov (United States)

    Zhao, Boyang; Hemann, Michael T; Lauffenburger, Douglas A

    2014-07-22

    The substantial spatial and temporal heterogeneity observed in patient tumors poses considerable challenges for the design of effective drug combinations with predictable outcomes. Currently, the implications of tissue heterogeneity and sampling bias during diagnosis are unclear for selection and subsequent performance of potential combination therapies. Here, we apply a multiobjective computational optimization approach integrated with empirical information on efficacy and toxicity for individual drugs with respect to a spectrum of genetic perturbations, enabling derivation of optimal drug combinations for heterogeneous tumors comprising distributions of subpopulations possessing these perturbations. Analysis across probabilistic samplings from the spectrum of various possible distributions reveals that the most beneficial (considering both efficacy and toxicity) set of drugs changes as the complexity of genetic heterogeneity increases. Importantly, a significant likelihood arises that a drug selected as the most beneficial single agent with respect to the predominant subpopulation in fact does not reside within the most broadly useful drug combinations for heterogeneous tumors. The underlying explanation appears to be that heterogeneity essentially homogenizes the benefit of drug combinations, reducing the special advantage of a particular drug on a specific subpopulation. Thus, this study underscores the importance of considering heterogeneity in choosing drug combinations and offers a principled approach toward designing the most likely beneficial set, even if the subpopulation distribution is not precisely known.
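
    A minimal sketch of the kind of scoring such an optimization rests on: the expected benefit of a drug combination over a distribution of tumor subpopulations, minus a toxicity penalty. The drug names, efficacy matrix, toxicity values, and the simple best-drug-per-subpopulation benefit rule are all assumptions for illustration, not the paper's multiobjective model.

      from itertools import combinations

      # Hypothetical per-drug efficacy against each genetic subpopulation and
      # per-drug toxicity; all numbers are illustrative only.
      efficacy = {
          "drugA": [0.9, 0.2, 0.1],
          "drugB": [0.3, 0.8, 0.2],
          "drugC": [0.4, 0.4, 0.7],
      }
      toxicity = {"drugA": 0.30, "drugB": 0.25, "drugC": 0.20}

      def combination_score(drugs, subpop_weights, toxicity_penalty=1.0):
          """Expected benefit for a heterogeneous tumor: for each subpopulation take
          the best drug in the combination, weight by subpopulation frequency, and
          subtract the summed toxicity of the combination."""
          benefit = sum(w * max(efficacy[d][i] for d in drugs)
                        for i, w in enumerate(subpop_weights))
          return benefit - toxicity_penalty * sum(toxicity[d] for d in drugs)

      weights = [0.6, 0.3, 0.1]  # hypothetical subpopulation frequencies
      pairs = {c: round(combination_score(c, weights), 3)
               for c in combinations(efficacy, 2)}
      print(max(pairs, key=pairs.get), pairs)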

  19. Information content of long-range NMR data for the characterization of conformational heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Andrałojć, Witold [University of Florence, Center for Magnetic Resonance (CERM) (Italy); Berlin, Konstantin; Fushman, David, E-mail: fushman@umd.edu [University of Maryland, Department of Chemistry and Biochemistry, Center for Biomolecular Structure and Organization (United States); Luchinat, Claudio, E-mail: luchinat@cerm.unifi.it; Parigi, Giacomo; Ravera, Enrico [University of Florence, Center for Magnetic Resonance (CERM) (Italy); Sgheri, Luca [CNR, Istituto per le Applicazioni del Calcolo, Sezione di Firenze (Italy)

    2015-07-15

    Long-range NMR data, namely residual dipolar couplings (RDCs) from external alignment and paramagnetic data, are becoming increasingly popular for the characterization of conformational heterogeneity of multidomain biomacromolecules and protein complexes. The question addressed here is how much information is contained in these averaged data. We have analyzed and compared the information content of conformationally averaged RDCs caused by steric alignment and of both RDCs and pseudocontact shifts caused by paramagnetic alignment, and found that, despite the substantial differences, they contain a similar amount of information. Furthermore, using several synthetic tests we find that both sets of data are equally good towards recovering the major state(s) in conformational distributions.

  20. Patterns and effects of GC3 heterogeneity and parsimony informative sites on the phylogenetic tree of genes.

    Science.gov (United States)

    Ma, Shuai; Wu, Qi; Hu, Yibo; Wei, Fuwen

    2018-05-20

    The explosive growth in genomic data has provided novel insights into the conflicting signals hidden in phylogenetic trees. Although some studies have explored the effects of GC content and parsimony informative sites (PIS) on the phylogenetic tree, the effect of the heterogeneity of GC content at the first/second/third codon positions of parsimony informative sites (GC1/2/3-PIS) among different species and the effect of PIS on phylogenetic tree construction remain largely unexplored. Here, we used two different mammal genomic datasets to explore the patterns of GC1/2/3-PIS heterogeneity and the effect of PIS on the phylogenetic tree of genes: (i) all GC1/2/3-PIS show obvious heterogeneity between different mammals, and the levels of heterogeneity are GC3-PIS > GC2-PIS > GC1-PIS; (ii) the number of PIS is positively correlated with the metrics of "good" gene tree topologies, and excluding the third codon position (C3) decreases the quality of gene trees by removing too many PIS. These results provide novel insights into the heterogeneity pattern of GC1/2/3-PIS in mammals and the relationship between GC3/PIS and gene trees. Additionally, it is necessary to carefully consider whether to exclude C3 to improve the quality of gene trees, especially in the super-tree method. Copyright © 2018 Elsevier B.V. All rights reserved.
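
    The two basic quantities discussed here can be computed directly from an alignment of coding sequences, as in the sketch below; the toy alignment and the simple parsimony-informative-site criterion (at least two states each occurring in at least two sequences) are illustrative, not taken from the study's datasets.

      from collections import Counter

      def gc_by_codon_position(sequence):
          """GC fraction at codon positions 1, 2 and 3 of a coding sequence."""
          gc = []
          for offset in range(3):
              bases = sequence[offset::3]
              gc.append(sum(b in "GC" for b in bases) / len(bases))
          return gc

      def parsimony_informative_sites(alignment):
          """Count columns where at least two states each occur in >= 2 sequences."""
          count = 0
          for column in zip(*alignment):
              freqs = Counter(column)
              if sum(1 for n in freqs.values() if n >= 2) >= 2:
                  count += 1
          return count

      # Toy aligned coding sequences (illustrative only).
      alignment = ["ATGGCGTAC", "ATGGCGTAT", "ATGACGTAT", "ATGACGCAC"]
      print([gc_by_codon_position(s) for s in alignment])
      print(parsimony_informative_sites(alignment))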

  1. Theoretical information reuse and integration

    CERN Document Server

    Rubin, Stuart

    2016-01-01

    Information Reuse and Integration addresses the efficient extension and creation of knowledge through the exploitation of Kolmogorov complexity in the extraction and application of domain symmetry. Knowledge, which seems to be novel, can more often than not be recast as the image of a sequence of transformations, which yield symmetric knowledge. When the size of those transformations and/or the length of that sequence of transforms exceeds the size of the image, then that image is said to be novel or random. It may also be that the new knowledge is random in that no such sequence of transforms, which produces it exists, or is at least known. The nine chapters comprising this volume incorporate symmetry, reuse, and integration as overt operational procedures or as operations built into the formal representations of data and operators employed. Either way, the aforementioned theoretical underpinnings of information reuse and integration are supported.

  2. Exploring the dynamic integration of heterogeneous services

    CSIR Research Space (South Africa)

    Makamba, M

    2016-08-01

    Full Text Available components for communication and collaboration amongst enterprises, internally and externally. Since the Internet has stimulated the use of services, different services have been developed for different purposes, prompting those services to be heterogeneous due...

  3. Cell signaling heterogeneity is modulated by both cell-intrinsic and -extrinsic mechanisms: An integrated approach to understanding targeted therapy.

    Science.gov (United States)

    Kim, Eunjung; Kim, Jae-Young; Smith, Matthew A; Haura, Eric B; Anderson, Alexander R A

    2018-03-01

    During the last decade, our understanding of cancer cell signaling networks has significantly improved, leading to the development of various targeted therapies that have elicited profound but, unfortunately, short-lived responses. This is, in part, due to the fact that these targeted therapies ignore context and average out heterogeneity. Here, we present a mathematical framework that addresses the impact of signaling heterogeneity on targeted therapy outcomes. We employ a simplified oncogenic rat sarcoma (RAS)-driven mitogen-activated protein kinase (MAPK) and phosphoinositide 3-kinase-protein kinase B (PI3K-AKT) signaling pathway in lung cancer as an experimental model system and develop a network model of the pathway. We measure how inhibition of the pathway modulates protein phosphorylation as well as cell viability under different microenvironmental conditions. Training the model on this data using Monte Carlo simulation results in a suite of in silico cells whose relative protein activities and cell viability match experimental observation. The calibrated model predicts distributional responses to kinase inhibitors and suggests drug resistance mechanisms that can be exploited in drug combination strategies. The suggested combination strategies are validated using in vitro experimental data. The validated in silico cells are further interrogated through an unsupervised clustering analysis and then integrated into a mathematical model of tumor growth in a homogeneous and resource-limited microenvironment. We assess posttreatment heterogeneity and predict vast differences across treatments with similar efficacy, further emphasizing that heterogeneity should modulate treatment strategies. The signaling model is also integrated into a hybrid cellular automata (HCA) model of tumor growth in a spatially heterogeneous microenvironment. As a proof of concept, we simulate tumor responses to targeted therapies in a spatially segregated tissue structure containing tumor

  4. On the area spectral efficiency improvement of heterogeneous network by exploiting the integration of macro-femto cellular networks

    KAUST Repository

    Shakir, Muhammad

    2012-06-01

    Heterogeneous networks are an attractive means of expanding mobile network capacity. A heterogeneous network is typically composed of multiple radio access technologies (RATs) where the base stations transmit with variable power. In this paper, we consider a heterogeneous network where we complement the macrocell network with low-power, low-cost user-deployed nodes, such as femtocell base stations, to increase the mean achievable capacity of the system. In this context, we integrate macro-femto cellular networks and derive the area spectral efficiency of the proposed two-tier heterogeneous network. We consider the deployment of femtocell base stations around the edge of the macrocell; this configuration is referred to as the femto-on-edge (FOE) configuration. Moreover, the FOE configuration mandates a reduction in the intercell interference experienced by mobile users located around the edge of the macrocell, since the femtocell base stations are low-power nodes with significantly lower transmission power than macrocell base stations. We present a mathematical analysis to calculate the instantaneous carrier-to-interference ratio (CIR) of the desired mobile user in the macro and femto cellular networks and determine the total area spectral efficiency of the heterogeneous network. Details of the simulation processes are included to support the analysis and show the efficacy of the proposed deployment. It has been shown that the proposed setup of the heterogeneous network offers higher area spectral efficiency, which aims to fulfill the expected demand of future mobile users. © 2012 IEEE.
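
    A stripped-down sketch of the quantities named in this abstract, assuming a plain distance-based path-loss model: carrier-to-interference ratio from received powers, and area spectral efficiency as summed Shannon efficiency per unit area. Transmit powers, distances, and the path-loss exponent are illustrative, not the paper's system model.

      import math

      def received_power(p_tx, distance, path_loss_exp=3.5):
          """Simple distance-based path-loss model (illustrative)."""
          return p_tx * distance ** (-path_loss_exp)

      def cir(desired, interferers):
          """Carrier-to-interference ratio for one user."""
          return received_power(*desired) / sum(received_power(*i) for i in interferers)

      def area_spectral_efficiency(cirs, cell_area_km2):
          """Sum of per-link Shannon efficiencies (bit/s/Hz) per unit area (km^2)."""
          return sum(math.log2(1 + g) for g in cirs) / cell_area_km2

      # Macro user served by a 20 W macro BS at 300 m, interfered by two 0.1 W
      # femto BSs near the cell edge (numbers are illustrative only).
      gamma = cir((20.0, 300.0), [(0.1, 450.0), (0.1, 500.0)])
      print(round(gamma, 1), round(area_spectral_efficiency([gamma], math.pi * 0.5 ** 2), 3))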

  5. Environment, safety, and health information technology systems integration.

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, David A.; Bayer, Gregory W.

    2006-02-01

    The ES&H Information Systems department, motivated by the numerous isolated information technology systems under its control, undertook a significant integration effort. This effort was planned and executed over the course of several years and parts of it still continue today. The effect was to help move the ES&H Information Systems department toward integration with the corporate Information Solutions and Services center.

  6. Provably Secure Heterogeneous Access Control Scheme for Wireless Body Area Network.

    Science.gov (United States)

    Omala, Anyembe Andrew; Mbandu, Angolo Shem; Mutiria, Kamenyi Domenic; Jin, Chunhua; Li, Fagen

    2018-04-28

    A wireless body area network (WBAN) provides a medium through which physiological information can be harvested and transmitted to an application provider (AP) in real time. Integrating WBAN into a heterogeneous Internet of Things (IoT) ecosystem would enable an AP to monitor patients from anywhere and at any time. However, the IoT roadmap of interconnected 'Things' is still faced with many challenges. One of the challenges in healthcare is the security and privacy of medical data streamed from heterogeneously networked devices. In this paper, we first propose a heterogeneous signcryption scheme where the sender is in a certificateless cryptographic (CLC) environment while the receiver is in an identity-based cryptographic (IBC) environment. We then use this scheme to design a heterogeneous access control protocol. A formal security proof for indistinguishability against adaptive chosen ciphertext attack and unforgeability against adaptive chosen message attack in the random oracle model is presented. In comparison with some of the existing access control schemes, our scheme has lower computation and communication cost.

  7. Astrocytes regulate heterogeneity of presynaptic strengths in hippocampal networks

    Science.gov (United States)

    Letellier, Mathieu; Park, Yun Kyung; Chater, Thomas E.; Chipman, Peter H.; Gautam, Sunita Ghimire; Oshima-Takago, Tomoko; Goda, Yukiko

    2016-01-01

    Dendrites are neuronal structures specialized for receiving and processing information through their many synaptic inputs. How input strengths are modified across dendrites in ways that are crucial for synaptic integration and plasticity remains unclear. We examined in single hippocampal neurons the mechanism of heterosynaptic interactions and the heterogeneity of synaptic strengths of pyramidal cell inputs. Heterosynaptic presynaptic plasticity that counterbalances input strengths requires N-methyl-d-aspartate receptors (NMDARs) and astrocytes. Importantly, this mechanism is shared with the mechanism for maintaining highly heterogeneous basal presynaptic strengths, which requires astrocyte Ca2+ signaling involving NMDAR activation, astrocyte membrane depolarization, and L-type Ca2+ channels. Intracellular infusion of NMDARs or Ca2+-channel blockers into astrocytes, conditionally ablating the GluN1 NMDAR subunit, or optogenetically hyperpolarizing astrocytes with archaerhodopsin promotes homogenization of convergent presynaptic inputs. Our findings support the presence of an astrocyte-dependent cellular mechanism that enhances the heterogeneity of presynaptic strengths of convergent connections, which may help boost the computational power of dendrites. PMID:27118849

  8. Multiclass classification for skin cancer profiling based on the integration of heterogeneous gene expression series.

    Science.gov (United States)

    Gálvez, Juan Manuel; Castillo, Daniel; Herrera, Luis Javier; San Román, Belén; Valenzuela, Olga; Ortuño, Francisco Manuel; Rojas, Ignacio

    2018-01-01

    Most of the research studies that apply microarray technology to the characterization of different pathological states of a disease may fail to reach statistically significant results. This is largely due to the small repertoire of analysed samples and to the limitation in the number of states or pathologies usually addressed. Moreover, the influence of potential deviations on the gene expression quantification is usually disregarded. In spite of the continuous changes in omic sciences, reflected for instance in the emergence of new Next-Generation Sequencing-related technologies, the existing availability of a vast amount of gene expression microarray datasets should be properly exploited. Therefore, this work proposes a novel methodological approach involving the integration of several heterogeneous skin cancer series, and a later multiclass classifier design. This approach is thus a way to provide clinicians with an intelligent diagnosis support tool based on the use of a robust set of selected biomarkers, which simultaneously distinguishes among different cancer-related skin states. To achieve this, a multi-platform combination of microarray datasets from Affymetrix and Illumina manufacturers was carried out. This integration is expected to strengthen the statistical robustness of the study as well as the finding of highly reliable skin cancer biomarkers. Specifically, the designed operation pipeline has allowed the identification of a small subset of 17 differentially expressed genes (DEGs) from which to distinguish among 7 involved skin states. These genes were obtained from the assessment of a number of potential batch effects on the gene expression data. The biological interpretation of these genes was inspected in the specific literature to understand their underlying information in relation to skin cancer. Finally, in order to assess their possible effectiveness in cancer diagnosis, a cross-validation Support Vector Machines (SVM
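
    A minimal sketch of the final classification step, assuming the integrated expression matrix has already been reduced to the selected genes: a multiclass SVM evaluated by cross-validation with scikit-learn. The random matrix stands in for real data; sample counts and parameters are illustrative.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Illustrative stand-in for an integrated multi-platform expression matrix:
      # rows are samples, columns the 17 selected DEGs, labels the 7 skin states.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(210, 17))
      y = rng.integers(0, 7, size=210)

      # Standardize per gene (useful when series come from different platforms),
      # then fit a multiclass SVM and estimate accuracy by 5-fold cross-validation.
      model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
      scores = cross_val_score(model, X, y, cv=5)
      print(scores.mean(), scores.std())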

  9. Semantic Observation Integration

    Directory of Open Access Journals (Sweden)

    Werner Kuhn

    2012-09-01

    Full Text Available Although the integration of sensor-based information into analysis and decision making has been a research topic for many years, semantic interoperability has not yet been reached. The advent of user-generated content for the geospatial domain, Volunteered Geographic Information (VGI), makes it even more difficult to establish semantic integration. This paper proposes a novel approach to integrating conventional sensor information and VGI, which is exploited in the context of detecting forest fires. In contrast to common logic-based semantic descriptions, we present a formal system using algebraic specifications to unambiguously describe the processing steps from natural phenomena to value-added information. A generic ontology of observations is extended and profiled for forest fire detection in order to illustrate how the sensing process, and transformations between heterogeneous sensing systems, can be represented as mathematical functions and grouped into abstract data types. We discuss the required ontological commitments and a possible generalization.
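
    A small sketch of the abstract-data-type idea, rendered in Python rather than a formal algebraic specification language: a shared Observation type plus source-specific mapping functions that lift heterogeneous sensor readings and VGI reports into it. All field names and example values are invented.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass(frozen=True)
      class Observation:
          """Minimal observation type: what was observed, where, when, by which procedure."""
          observed_property: str
          value: float
          location: tuple
          timestamp: str
          procedure: str

      def lift(report: dict, mapping: Callable[[dict], Observation]) -> Observation:
          """Map a heterogeneous report (sensor reading or VGI post) into the shared type."""
          return mapping(report)

      sensor_mapping = lambda r: Observation("temperature", r["celsius"],
                                             (r["lat"], r["lon"]), r["time"], "weather_station")
      vgi_mapping = lambda r: Observation("smoke_sighting", 1.0, tuple(r["coords"]),
                                          r["posted_at"], "volunteer_report")

      print(lift({"celsius": 41.2, "lat": 37.9, "lon": 23.7, "time": "2012-08-01T14:00"}, sensor_mapping))
      print(lift({"coords": [37.9, 23.7], "posted_at": "2012-08-01T14:05"}, vgi_mapping))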

  10. 3D stacked chips from emerging processes to heterogeneous systems

    CERN Document Server

    Fettweis, Gerhard

    2016-01-01

    This book explains for readers how 3D chip stacks promise to increase the level of on-chip integration, and to design new heterogeneous semiconductor devices that combine chips of different integration technologies (incl. sensors) in a single package of the smallest possible size. The authors focus on heterogeneous 3D integration, addressing some of the most important challenges in this emerging technology, including contactless, optics-based, and carbon-nanotube-based 3D integration, as well as signal-integrity and thermal management issues in copper-based 3D integration. Coverage also includes the 3D heterogeneous integration of power sources, photonic devices, and non-volatile memories based on new materials systems.
    • Provides single-source reference to the latest research in 3D optoelectronic integration: process, devices, and systems;
    • Explains the use of wireless 3D integration to improve 3D IC reliability and yield;
    • Describes techniques for monitoring and mitigating thermal behavior in 3D I...

  11. Information Integration Technology Demonstration (IITD)

    National Research Council Canada - National Science Library

    Loe, Richard

    2001-01-01

    The objectives of the Information Integration Technology Demonstration (IITD) were to investigate, design a software architecture and demonstrate a capability to display intelligence data from multiple disciplines...

  12. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    Science.gov (United States)

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.
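
    As a reminder of how the different random-effects summaries mentioned here differ numerically, the sketch below computes a DerSimonian-Laird random-effects mean and the wider predictive standard deviation from hypothetical study-level effects. It is a textbook pairwise calculation, not the network meta-regression model used in the paper.

      import math

      def random_effects_summary(effects, variances):
          """Return the DerSimonian-Laird random-effects mean, its standard error,
          and the predictive standard deviation sqrt(SE^2 + tau^2) for a new setting."""
          w = [1 / v for v in variances]
          fixed_mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
          q = sum(wi * (e - fixed_mean) ** 2 for wi, e in zip(w, effects))
          df = len(effects) - 1
          c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
          tau2 = max(0.0, (q - df) / c)
          w_star = [1 / (v + tau2) for v in variances]
          mean = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
          se = math.sqrt(1 / sum(w_star))
          predictive_sd = math.sqrt(se ** 2 + tau2)
          return mean, se, predictive_sd

      # Hypothetical log-odds-ratios and variances from five trials.
      print(random_effects_summary([-0.3, -0.1, -0.5, 0.05, -0.25],
                                   [0.04, 0.09, 0.06, 0.12, 0.05]))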

  13. Interconnecting heterogeneous database management systems

    Science.gov (United States)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  14. Content-Agnostic Malware Detection in Heterogeneous Malicious Distribution Graph

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2016-10-26

    Malware detection has been widely studied by analysing either file dropping relationships or characteristics of the file distribution network. This paper, for the first time, studies a global heterogeneous malware delivery graph fusing file dropping relationship and the topology of the file distribution network. The integration offers a unique ability of structuring the end-to-end distribution relationship. However, it brings large heterogeneous graphs to analysis. In our study, an average daily generated graph has more than 4 million edges and 2.7 million nodes that differ in type, such as IPs, URLs, and files. We propose a novel Bayesian label propagation model to unify the multi-source information, including content-agnostic features of different node types and topological information of the heterogeneous network. Our approach does not need to examine the source codes nor inspect the dynamic behaviours of a binary. Instead, it estimates the maliciousness of a given file through a semi-supervised label propagation procedure, which has a linear time complexity w.r.t. the number of nodes and edges. The evaluation on 567 million real-world download events validates that our proposed approach efficiently detects malware with a high accuracy. © 2016 Copyright held by the owner/author(s).
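
    A toy sketch of label propagation over a heterogeneous file/URL/IP graph: scores of labelled seed nodes stay fixed while unlabelled nodes take a damped average of their neighbours, mixed with a content-agnostic prior. The update rule and the tiny graph are illustrative simplifications, not the paper's Bayesian model.

      def propagate_maliciousness(edges, priors, seeds, iterations=20, damping=0.85):
          """Estimate a maliciousness score in [0, 1] for every node (file, URL, IP).
          `edges` is an undirected adjacency list, `priors` are content-agnostic
          per-node priors, `seeds` maps known nodes to 0 (benign) or 1 (malware)."""
          scores = dict(priors)
          scores.update(seeds)
          for _ in range(iterations):
              updated = {}
              for node, neighbours in edges.items():
                  if node in seeds or not neighbours:
                      updated[node] = scores.get(node, priors.get(node, 0.5))
                      continue
                  neighbour_mean = sum(scores[n] for n in neighbours) / len(neighbours)
                  updated[node] = damping * neighbour_mean + (1 - damping) * priors.get(node, 0.5)
              scores = updated
          return scores

      edges = {"file1": ["url1", "ip1"], "url1": ["file1", "file2"],
               "ip1": ["file1"], "file2": ["url1"]}
      print(propagate_maliciousness(edges, priors={"url1": 0.5, "file2": 0.5},
                                    seeds={"file1": 1.0, "ip1": 0.0}))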

  15. Integrated Risk Information System (IRIS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — EPA?s Integrated Risk Information System (IRIS) is a compilation of electronic reports on specific substances found in the environment and their potential to cause...

  16. Informational segmentation in international capital markets

    OpenAIRE

    Wahl, Jack E.

    1988-01-01

    The economic influence of barriers to international information acquisition and, hence, of informational segmentation in international capital markets depends heavily upon the prevailing level of risk aversion. We find that these barriers are likely to have a second-order economic impact only. Furthermore, improving international informational integration is likely to increase all asset prices when causing less heterogeneity of international subjective probability beliefs.

  17. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  18. Heterogeneous integration of thin film compound semiconductor lasers and SU8 waveguides on SiO2/Si

    Science.gov (United States)

    Palit, Sabarni; Kirch, Jeremy; Mawst, Luke; Kuech, Thomas; Jokerst, Nan Marie

    2010-02-01

    We present the heterogeneous integration of a 3.8 μm thick InGaAs/GaAs edge emitting laser that was metal-metal bonded to SiO2/Si and end-fire coupled into a 2.8 μm thick tapered SU8 polymer waveguide integrated on the same substrate. The system was driven in pulsed mode and the waveguide output was captured on an IR imaging array to characterize the mode. The waveguide output was also coupled into a multimode fiber, and into an optical head and spectrum analyzer, indicating lasing at ~997 nm and a threshold current density of 250 A/cm2.

  19. A service integration platform for collaborative networks

    NARCIS (Netherlands)

    Osorio, A. L.; Afsarmanesh, H.; Camarinha-Matos, L.M.

    2011-01-01

    Integrated manufacturing constitutes a complex system made of heterogeneous information and control subsystems. Those subsystems are not designed for cooperation. Typically, each subsystem automates specific processes and establishes closed application domains; therefore, it is very difficult to

  20. Knowledge and information management for integrated water resource management

    Science.gov (United States)

    Watershed information systems that integrate data and analytical tools are critical enabling technologies to support Integrated Water Resource Management (IWRM) by converting data into information, and information into knowledge. Many factors bring people to the table to participate in an IWRM fra...

  1. Integrated Information Systems Across the Weather-Climate Continuum

    Science.gov (United States)

    Pulwarty, R. S.; Higgins, W.; Nierenberg, C.; Trtanj, J.

    2015-12-01

    The increasing demand for well-organized (integrated) end-to-end research-based information has been highlighted in several National Academy studies, in IPCC Reports (such as the SREX and Fifth Assessment) and by public and private constituents. Such information constitutes a significant component of the "environmental intelligence" needed to address myriad societal needs for early warning and resilience across the weather-climate continuum. The next generation of climate research in service to the nation requires an even more visible, authoritative and robust commitment to scientific integration in support of adaptive information systems that address emergent risks and inform longer-term resilience strategies. A proven mechanism for resourcing such requirements is to demonstrate vision, purpose, support, connection to constituencies, and prototypes of desired capabilities. In this presentation we will discuss efforts at NOAA, and elsewhere, that: (i) improve information on how changes in extremes of key phenomena such as drought, floods, and heat stress impact management decisions for resource planning and disaster risk reduction; and (ii) develop regional integrated information systems to address these emergent challenges, which integrate observations, monitoring and prediction, impacts assessments and scenarios, preparedness and adaptation, and coordination and capacity-building. Such systems, as illustrated through efforts such as NIDIS, have strengthened integration across the foundational research enterprise (through, for instance, RISAs and Modeling Analysis Predictions and Projections) by increasing agility for responding to emergent risks. The recently initiated Climate Services Information System, in support of the WMO Global Framework for Climate Services, draws on the above models and will be introduced during the presentation.

  2. EFFICIENCY INDICATORS INFORMATION MANAGEMENT IN INTEGRATED SECURITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. S. Rodionova

    2014-01-01

    Full Text Available Summary. The introduction of information technology to improve the efficiency of security activity leads to the need to consider a number of negative factors arising as a consequence of using these technologies as a key element of modern security systems. One of the most notable factors is the exposure of information processes in protection systems to security threats. This largely relates to integrated security systems (ISS), which are protection systems with the highest level of informatization of security functions. The significant damage that protected objects could potentially incur as a result of abnormal ISS operation makes the assessment of factors that reduce ISS efficiency, and the justification of ways and methods to improve it, a very pressing problem. Given the nature of threats that block or distort information in the ISS, the parameters of interest are: the volume of undistorted information in the working environment, as a characteristic of data integrity; and the access time to information, as a characteristic of its availability. This in turn leads to the need to use these parameters as performance characteristics of information processes in the ISS: the completeness and timeliness of information processing. The article proposes performance indicators of information processes in integrated security systems in terms of optimal control procedures to protect information from unauthorized access. The proposed set of parameters allows a comprehensive security analysis of integrated security systems and provides recommendations to improve the management of information security procedures in them.

  3. Risk Informed Structural Systems Integrity Management

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber

    2017-01-01

    The present paper is predominantly a conceptual contribution with an appraisal of major developments in risk-informed structural integrity management for offshore installations, together with a discussion of their merits and the challenges which still lie ahead. The starting point is a selected overview of research and development contributions which have formed the basis for Risk Based Inspection Planning (RBI) as we know it today. Thereafter, an outline of the methodical basis for risk-informed structural systems integrity management, i.e. Bayesian decision analysis, is provided in summary. The main focus is directed on RBI for offshore facilities subject to fatigue damage. New ideas and methodical frameworks in the area of robustness and resilience modeling of structural systems are then introduced, and it is outlined how these may adequately be utilized to enhance Structural Integrity

  4. CSIR's new integrated electronic library information-system

    CSIR Research Space (South Africa)

    Michie, A

    1995-08-01

    Full Text Available The CSIR has developed a CD-ROM-based electronic library information system which provides the ability to reproduce and search for published information and colour brochures on the computer screen. The system integrates this information with online...

  5. Mass Spectrometry Imaging for the Investigation of Intratumor Heterogeneity.

    Science.gov (United States)

    Balluff, B; Hanselmann, M; Heeren, R M A

    2017-01-01

    One of the big clinical challenges in the treatment of cancer is the different behavior of cancer patients under guideline therapy. An important determinant for this phenomenon has been identified as inter- and intratumor heterogeneity. While intertumor heterogeneity refers to the differences in cancer characteristics between patients, intratumor heterogeneity refers to the clonal and nongenetic molecular diversity within a patient. The deciphering of intratumor heterogeneity is recognized as key to the development of novel therapeutics or treatment regimens. The investigation of intratumor heterogeneity is challenging since it requires an untargeted molecular analysis technique that accounts for the spatial and temporal dynamics of the tumor. So far, next-generation sequencing has contributed most to the understanding of clonal evolution within a cancer patient. However, it falls short in accounting for the spatial dimension. Mass spectrometry imaging (MSI) is a powerful tool for the untargeted but spatially resolved molecular analysis of biological tissues such as solid tumors. As it provides multidimensional datasets by the parallel acquisition of hundreds of mass channels, multivariate data analysis methods can be applied for the automated annotation of tissues. Moreover, it integrates the histology of the sample, which enables studying the molecular information in a histopathological context. This chapter will illustrate how MSI in combination with statistical methods and histology has been used for the description and discovery of intratumor heterogeneity in different cancers. This will give evidence that MSI constitutes a unique tool for the investigation of intratumor heterogeneity, and could hence become a key technology in cancer research. © 2017 Elsevier Inc. All rights reserved.

  6. Information Security Management - Part Of The Integrated Management System

    Science.gov (United States)

    Manea, Constantin Adrian

    2015-07-01

    The international management standards allow an integrated approach, thereby combining aspects of particular importance to the activity of any organization, from quality management systems or environmental management to information security systems or business continuity management systems. Although there is no national or international regulation, nor a defined standard, for the Integrated Management System, the need to implement an integrated system arises within the organization, which sees the opportunity to integrate the management components into a cohesive system in agreement with its publicly stated purpose and mission. The issues relating to information security in the organization, from the perspective of the management system, pose serious questions for any organization in the current context of electronic information, which is why we consider it not only appropriate but necessary to promote and implement an Integrated Management System Quality - Environment - Health and Operational Security - Information Security

  7. The architecture of enterprise hospital information system.

    Science.gov (United States)

    Lu, Xudong; Duan, Huilong; Li, Haomin; Zhao, Chenhui; An, Jiye

    2005-01-01

    Because of the complexity of the hospital environment, there exist many medical information systems from different vendors with incompatible structures. In order to establish an enterprise hospital information system, the integration among these heterogeneous systems must be considered. Complete integration should cover three aspects: data integration, function integration and workflow integration. However, most previous architecture designs did not accomplish such complete integration. This article offers an architecture design of the enterprise hospital information system based on the concept of a digital neural network system in the hospital. It covers all three aspects of integration, and eventually achieves the target of one virtual data center with an Enterprise Viewer for users of different roles. The initial implementation of the architecture in the 5-year Digital Hospital Project at Huzhou Central Hospital of Zhejiang Province is also described.

  8. Information Integration Architecture Development

    OpenAIRE

    Faulkner, Stéphane; Kolp, Manuel; Nguyen, Duy Thai; Coyette, Adrien; Do, Thanh Tung; 16th International Conference on Software Engineering and Knowledge Engineering

    2004-01-01

    Multi-Agent Systems (MAS) architectures are gaining popularity for building open, distributed, and evolving software required by systems such as information integration applications. Unfortunately, despite considerable work in software architecture during the last decade, few research efforts have aimed at truly defining patterns and languages for designing such multiagent architectures. We propose a modern approach based on organizational structures and architectural description lan...

  9. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Science.gov (United States)

    Hotaling, Jared M; Cohen, Andrew L; Shiffrin, Richard M; Busemeyer, Jerome R

    2015-01-01

    In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.

  10. The Dilution Effect and Information Integration in Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Jared M Hotaling

    Full Text Available In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.
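
    The contrast between optimal and averaging integration can be made concrete with a two-source toy calculation, as sketched below: multiplying likelihood ratios never lowers the verdict when weak supporting evidence is added, whereas averaging dilutes it. The probabilities are invented, and both rules are caricatures of the models compared in the paper.

      def bayes_combine(p_top, p_bottom):
          """Optimal integration: multiply likelihood ratios (add log odds)."""
          lr = (p_top / (1 - p_top)) * (p_bottom / (1 - p_bottom))
          return lr / (1 + lr)

      def average_combine(p_top, p_bottom):
          """Averaging rule: strong evidence is diluted by adding weak evidence."""
          return (p_top + p_bottom) / 2

      strong, weak = 0.90, 0.55   # hypothetical diagnosticity of the two face halves
      print(bayes_combine(strong, weak))    # > 0.90: weak supporting evidence still helps
      print(average_combine(strong, weak))  # < 0.90: the dilution effect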

  11. An integrated healthcare enterprise information portal and healthcare information system framework.

    Science.gov (United States)

    Hsieh, S L; Lai, Feipei; Cheng, P H; Chen, J L; Lee, H H; Tsai, W N; Weng, Y C; Hsieh, S H; Hsu, K P; Ko, L F; Yang, T H; Chen, C H

    2006-01-01

    The paper presents an integrated, distributed Healthcare Enterprise Information Portal (HEIP) and Hospital Information Systems (HIS) framework over a wireless/wired infrastructure at National Taiwan University Hospital (NTUH). A single sign-on solution for the hospital customer relationship management (CRM) in HEIP has been established. The outcomes of the newly developed Outpatient Information Systems (OIS) in HIS are discussed. The future HEIP blueprints with CRM-oriented features: e-Learning, Remote Consultation and Diagnosis (RCD), as well as on-line Vaccination Services, are addressed. Finally, the integrated HEIP and HIS architectures based on middleware technologies are proposed along with the feasible approaches. The preliminary performance of multi-media, time-based data exchanges over the wireless HEIP side is collected to evaluate the efficiency of the architecture.

  12. INTEGRATED INFORMATION SYSTEM ARCHITECTURE PROVIDING BEHAVIORAL FEATURE

    Directory of Open Access Journals (Sweden)

    Vladimir N. Shvedenko

    2016-11-01

    Full Text Available The paper deals with the creation of an integrated information system architecture capable of supporting management decisions using behavioral features. The paper considers the architecture of an information decision support system for production system management. A behavioral feature is given to the information system, and it ensures extraction and processing of information and management decision-making, with both automated and automatic modes of the decision-making subsystem being permitted. Practical implementation of an information system with behavior is based on a service-oriented architecture: there is a set of independent services in the information system that provides data of its subsystems or data processing by a separate application under the chosen variant of problematic-situation settlement. For creation of an integrated information system with behavior we propose an architecture including the following subsystems: data bus, subsystem for interaction with the integrated applications based on metadata, business process management subsystem, subsystem for analysis of the current state of the enterprise and management decision-making, and behavior training subsystem. For each problematic situation a separate logical-layer service is created in the Unified Service Bus handling problematic situations. This architecture reduces system information complexity because, with a constant number of system elements, the number of links decreases, since each layer provides a communication center of responsibility for the resource with the services of corresponding applications. If a similar problematic situation occurs, its resolution is automatically retrieved from the problematic-situation metamodel repository together with the business process metamodel for its settlement. During business process execution, commands are generated to the corresponding centers of responsibility to settle the problematic situation.

  13. Effect of Heterogeneous Interest Similarity on the Spread of Information in Mobile Social Networks

    Science.gov (United States)

    Zhao, Narisa; Sui, Guoqin; Yang, Fan

    2018-06-01

    Mobile social networks (MSNs) are important platforms for spreading news. The fact that individuals usually forward information aligned with their own interests inevitably changes the dynamics of information spread. Therefore, we first present a theoretical model based on a discrete Markov chain and mean field theory to evaluate the effect of interest similarity on information spread in MSNs. Meanwhile, individuals' interests are heterogeneous and vary with time. These two features result in interest shift behavior, and both are considered in our model. A simulation study demonstrates the accuracy of our model. Moreover, the basic reproduction number R0 is determined. Further extensive numerical analyses based on the model indicate that interest similarity has a critical impact on information spread at the early spreading stage. Specifically, the information always spreads more quickly and widely if the interest similarity between an individual and the information is higher. Finally, five actual data sets from Sina Weibo illustrate the validity of the model.
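
    A minimal mean-field sketch of the mechanism described here, assuming an SIR-like forwarding process whose effective rate is the base forwarding probability scaled by interest similarity; the parameter values and the simple clamping are illustrative, not the paper's discrete Markov chain model.

      def simulate_spread(similarity, forward_prob=0.3, recover_prob=0.2,
                          mean_degree=8, steps=50, seed_fraction=0.001):
          """Discrete-time mean-field SIR-like model in which the effective
          forwarding rate is the base rate scaled by interest similarity."""
          beta = forward_prob * similarity
          s, i, r = 1.0 - seed_fraction, seed_fraction, 0.0
          for _ in range(steps):
              new_forwards = min(beta * mean_degree * s * i, s)  # clamp to keep s >= 0
              new_recoveries = recover_prob * i
              s, i, r = s - new_forwards, i + new_forwards - new_recoveries, r + new_recoveries
          r0 = beta * mean_degree / recover_prob        # basic reproduction number
          return round(r0, 2), round(i + r, 3)          # R0 and fraction ever reached

      for similarity in (0.2, 0.5, 0.8):
          print(similarity, simulate_spread(similarity))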

  14. Data integration for plant genomics--exemplars from the integration of Arabidopsis thaliana databases.

    Science.gov (United States)

    Lysenko, Artem; Hindle, Matthew Morritt; Taubert, Jan; Saqi, Mansoor; Rawlings, Christopher John

    2009-11-01

    The development of a systems-based approach to problems in plant sciences requires integration of existing information resources. However, the available information is currently often incomplete and dispersed across many sources, and the syntactic and semantic heterogeneity of the data is a challenge for integration. In this article, we discuss strategies for data integration and we use a graph-based integration method (Ondex) to illustrate some of these challenges with reference to two example problems concerning the integration of (i) metabolic pathway and (ii) protein interaction data for Arabidopsis thaliana. We quantify the degree of overlap for three commonly used pathway and protein interaction information sources. For pathways, we find that the AraCyc database contains the widest coverage of enzyme reactions, and for protein interactions we find that the IntAct database provides the largest unique contribution to the integrated dataset. For both examples, however, we observe a relatively small amount of data common to all three sources. Analysis and visual exploration of the integrated networks was used to identify a number of practical issues relating to the interpretation of these datasets. We demonstrate the utility of these approaches to the analysis of groups of coexpressed genes from an individual microarray experiment, in the context of pathway information and for the combination of coexpression data with an integrated protein interaction network.
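
    Quantifying the degree of overlap between sources reduces to set operations once entities (e.g. enzyme reactions or interacting protein pairs) are mapped to a shared namespace, as in the sketch below; apart from AraCyc, the source names and all identifier sets are invented for illustration.

      # Hypothetical identifier sets from three sources after mapping to a shared
      # namespace; contents are made up.
      sources = {
          "AraCyc":   {"R1", "R2", "R3", "R4", "R5"},
          "sourceB":  {"R2", "R3", "R6"},
          "sourceC":  {"R3", "R4", "R7"},
      }

      all_ids = set().union(*sources.values())
      common_to_all = set.intersection(*sources.values())
      unique = {name: ids - set().union(*(v for k, v in sources.items() if k != name))
                for name, ids in sources.items()}

      print(len(all_ids), "entities in the integrated set")
      print(common_to_all, "shared by all three sources")
      print({k: len(v) for k, v in unique.items()}, "unique contributions")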

  15. Tumor Heterogeneity: Mechanisms and Bases for a Reliable Application of Molecular Marker Design

    Science.gov (United States)

    Diaz-Cano, Salvador J.

    2012-01-01

    Tumor heterogeneity is a confusing finding in the assessment of neoplasms, potentially resulting in inaccurate diagnostic, prognostic and predictive tests. This tumor heterogeneity is not always a random and unpredictable phenomenon, and knowledge of it helps in designing better tests. The biologic reasons for this intratumoral heterogeneity are therefore important for understanding both the natural history of neoplasms and the selection of test samples for reliable analysis. The main factors contributing to intratumoral heterogeneity, by inducing gene abnormalities or modifying gene expression, include: the gradient of ischemic level within neoplasms, the action of the tumor microenvironment (bidirectional interaction between tumor cells and stroma), mechanisms of intercellular transference of genetic information (exosomes), and differential mechanisms of sequence-independent modifications of genetic material and proteins. The intratumoral heterogeneity is at the origin of tumor progression and it is also the byproduct of the selection process during progression. Any analysis of heterogeneity mechanisms must be integrated within the process of segregation of genetic changes in tumor cells during the clonal expansion and progression of neoplasms. The evaluation of these mechanisms must also consider the redundancy and pleiotropism of molecular pathways, for which appropriate surrogate markers would support the presence or not of heterogeneous genetics and the main mechanisms responsible. This knowledge would constitute a solid scientific background for future therapeutic planning. PMID:22408433

  16. Integrated risk information system (IRIS)

    Energy Technology Data Exchange (ETDEWEB)

    Tuxen, L. [Environmental Protection Agency, Washington, DC (United States)]

    1990-12-31

    The Integrated Risk Information System (IRIS) is an electronic information system developed by the US Environmental Protection Agency (EPA) containing information related to health risk assessment. IRIS is the Agency's primary vehicle for communication of chronic health hazard information that represents Agency consensus following comprehensive review by intra-Agency work groups. The original purpose for developing IRIS was to provide guidance to EPA personnel in making risk management decisions. This role has expanded and evolved with wider access and use of the system. IRIS contains chemical-specific information in summary format for approximately 500 chemicals. IRIS is available to the general public on the National Library of Medicine's Toxicology Data Network (TOXNET) and on diskettes through the National Technical Information Service (NTIS).

  17. Information Systems Integration and Enterprise Application Integration (EAI) Adoption: A Case from Financial Services

    Science.gov (United States)

    Lam, Wing

    2007-01-01

    Increasingly, organizations find that they need to integrate a large number of information systems in order to support enterprise-wide business initiatives such as e-business, supply chain management and customer relationship management. To date, organizations have largely tended to address information systems (IS) integration in an ad-hoc manner…

  18. Optimized ECC Implementation for Secure Communication between Heterogeneous IoT Devices

    Directory of Open Access Journals (Sweden)

    Leandro Marin

    2015-08-01

    Full Text Available The Internet of Things is integrating information systems, places, users and billions of constrained devices into one global network. This network requires secure and private means of communication. The building blocks of the Internet of Things are devices manufactured by various producers and designed to fulfil different needs. There would be no common hardware platform that could be applied in every scenario. In such a heterogeneous environment, there is a strong need for the optimization of interoperable security. We present optimized elliptic curve cryptography algorithms that address the security issues in heterogeneous IoT networks. We have combined cryptographic algorithms for the NXP/Jennic 5148- and MSP430-based IoT devices and used them to create a novel key negotiation protocol.

  19. Optimized ECC Implementation for Secure Communication between Heterogeneous IoT Devices.

    Science.gov (United States)

    Marin, Leandro; Pawlowski, Marcin Piotr; Jara, Antonio

    2015-08-28

    The Internet of Things is integrating information systems, places, users and billions of constrained devices into one global network. This network requires secure and private means of communications. The building blocks of the Internet of Things are devices manufactured by various producers and are designed to fulfil different needs. There would be no common hardware platform that could be applied in every scenario. In such a heterogeneous environment, there is a strong need for the optimization of interoperable security. We present optimized elliptic curve cryptography algorithms that address the security issues in heterogeneous IoT networks. We have combined cryptographic algorithms for the NXP/Jennic 5148- and MSP430-based IoT devices and used them to create a novel key negotiation protocol.

  20. Development of an integrated medical supply information system

    Science.gov (United States)

    Xu, Eric; Wermus, Marek; Blythe Bauman, Deborah

    2011-08-01

    The integrated medical supply inventory control system introduced in this study is a hybrid system that is shaped by the nature of medical supply, usage and storage capacity limitations of health care facilities. The system links demand, service provided at the clinic, health care service provider's information, inventory storage data and decision support tools into an integrated information system. The ABC analysis method, economic order quantity model, two-bin method and safety stock concept are applied as decision support models to tackle inventory management issues at health care facilities. In the decision support module, each medical item and storage location has been scrutinised to determine the best-fit inventory control policy. The pilot case study demonstrates that the integrated medical supply information system holds several advantages for inventory managers, since it entails the benefits of deploying enterprise information systems to manage medical supply and supports better patient services.
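
    The abstract names standard inventory models (economic order quantity, safety stock) as the decision-support layer; the following minimal Python sketch shows how those textbook formulas could be wired into such a module. The parameter values and function names are illustrative and are not taken from the paper.

        from math import sqrt

        def economic_order_quantity(annual_demand, order_cost, holding_cost):
            """Classic EOQ: the order size that balances ordering and holding costs."""
            return sqrt(2 * annual_demand * order_cost / holding_cost)

        def safety_stock(z, demand_std_per_day, lead_time_days):
            """Safety stock for a service-level factor z under daily demand variability."""
            return z * demand_std_per_day * sqrt(lead_time_days)

        # Illustrative item: 1200 units/year, $25 per order, $4 holding cost per unit-year,
        # daily demand standard deviation of 2 units, 7-day lead time, z = 1.65 (~95% service).
        eoq = economic_order_quantity(1200, 25, 4)
        ss = safety_stock(1.65, 2, 7)
        reorder_point = (1200 / 365) * 7 + ss
        print(f"EOQ={eoq:.0f}, safety stock={ss:.0f}, reorder point={reorder_point:.0f}")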

  1. An information integration theory of consciousness

    Directory of Open Access Journals (Sweden)

    Tononi Giulio

    2004-11-01

    Full Text Available Abstract Background Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition? Presentation of the hypothesis This paper presents a theory about what consciousness is and how it can be measured. According to the theory, consciousness corresponds to the capacity of a system to integrate information. This claim is motivated by two key phenomenological properties of consciousness: differentiation – the availability of a very large number of conscious experiences; and integration – the unity of each such experience. The theory states that the quantity of consciousness available to a system can be measured as the Φ value of a complex of elements. Φ is the amount of causally effective information that can be integrated across the informational weakest link of a subset of elements. A complex is a subset of elements with Φ>0 that is not part of a subset of higher Φ. The theory also claims that the quality of consciousness is determined by the informational relationships among the elements of a complex, which are specified by the values of effective information among them. Finally, each particular conscious experience is specified by the value, at any given time, of the variables mediating informational interactions among the elements of a complex. Testing the hypothesis The information integration theory accounts, in a principled manner, for several neurobiological observations
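
    As a compact restatement of the quantities described above, the effective information between two parts of a subset and the resulting Φ can be written as follows. The notation is reconstructed from the abstract and the general form of the theory, so details may differ from the paper's exact definitions.

        \[
        \mathrm{EI}(A \rightarrow B) = \mathrm{MI}\!\left(A^{H^{\max}};\, B\right), \qquad
        \mathrm{EI}(A \rightleftarrows B) = \mathrm{EI}(A \rightarrow B) + \mathrm{EI}(B \rightarrow A),
        \]
        \[
        \Phi(S) = \mathrm{EI}\bigl(A^{\mathrm{MIB}} \rightleftarrows B^{\mathrm{MIB}}\bigr),
        \qquad
        \{A^{\mathrm{MIB}}, B^{\mathrm{MIB}}\} =
        \arg\min_{\{A,B\}} \frac{\mathrm{EI}(A \rightleftarrows B)}{\min\bigl\{H^{\max}(A),\, H^{\max}(B)\bigr\}},
        \]

    where the minimum is taken over bipartitions of the subset, so that Φ is the effective information across its informationally weakest link.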

  2. Coordinated Energy Management in Heterogeneous Processors

    Directory of Open Access Journals (Sweden)

    Indrani Paul

    2014-01-01

    Full Text Available This paper examines energy management in a heterogeneous processor consisting of an integrated CPU–GPU for high-performance computing (HPC) applications. Energy management for HPC applications is challenged by their uncompromising performance requirements and complicated by the need for coordinating energy management across distinct core types – a new and less understood problem. We examine the intra-node CPU–GPU frequency sensitivity of HPC applications on tightly coupled CPU–GPU architectures as the first step in understanding power and performance optimization for a heterogeneous multi-node HPC system. The insights from this analysis form the basis of a coordinated energy management scheme, called DynaCo, for integrated CPU–GPU architectures. We implement DynaCo on a modern heterogeneous processor and compare its performance to a state-of-the-art power- and performance-management algorithm. DynaCo improves measured average energy-delay squared (ED2) product by up to 30% with less than 2% average performance loss across several exascale and other HPC workloads.
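
    As a toy illustration of the metric the authors optimize, the sketch below picks, from a set of measured (CPU frequency, GPU frequency) operating points, the one with the lowest energy-delay-squared product. The sample numbers are invented, and DynaCo's actual runtime policy is more sophisticated than this offline selection.

        # Candidate operating points: (cpu_freq_GHz, gpu_freq_MHz, energy_J, runtime_s).
        # All numbers are invented for illustration only.
        operating_points = [
            (3.0, 900, 120.0, 10.0),
            (2.4, 900, 100.0, 11.5),
            (3.0, 600, 95.0, 12.0),
            (2.4, 600, 85.0, 14.0),
        ]

        def ed2(energy, delay):
            """Energy-delay-squared product: weights performance more heavily than E*D."""
            return energy * delay ** 2

        best = min(operating_points, key=lambda p: ed2(p[2], p[3]))
        print("best CPU/GPU setting by ED^2:", best[:2])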

  3. Coding of time-dependent stimuli in homogeneous and heterogeneous neural populations.

    Science.gov (United States)

    Beiran, Manuel; Kruscha, Alexandra; Benda, Jan; Lindner, Benjamin

    2018-04-01

    We compare the information transmission of a time-dependent signal by two types of uncoupled neuron populations that differ in their sources of variability: i) a homogeneous population whose units receive independent noise and ii) a deterministic heterogeneous population, where each unit exhibits a different baseline firing rate ('disorder'). Our criterion for making both sources of variability quantitatively comparable is that the interspike-interval distributions are identical for both systems. Numerical simulations using leaky integrate-and-fire neurons reveal that a non-zero amount of noise or disorder maximizes the encoding efficiency of the homogeneous and heterogeneous system, respectively, as a particular case of suprathreshold stochastic resonance. Our findings thus illustrate that heterogeneity can be as beneficial for neuronal populations as dynamic noise. The optimal noise/disorder depends on the system size and the properties of the stimulus, such as its intensity or cutoff frequency. We find that weak stimuli are better encoded by a noiseless heterogeneous population, whereas for strong stimuli a homogeneous population outperforms an equivalent heterogeneous system up to a moderate noise level. Furthermore, we derive analytical expressions for the coherence function in the cases of very strong noise and of vanishing intrinsic noise or heterogeneity, which predict the existence of an optimal noise intensity. Our results show that, depending on the type of signal, noise as well as heterogeneity can enhance the encoding performance of neuronal populations.
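
    A minimal simulation in the spirit of the comparison described above: one population of identical leaky integrate-and-fire units driven by dynamic noise versus one deterministic population with heterogeneous baseline drives, both encoding the same weak periodic stimulus. All parameter values are placeholders, and the paper's calibration (matched interspike-interval distributions) is not reproduced here.

        import numpy as np

        def simulate_lif_population(n=100, t_max=2.0, dt=1e-4, mu=1.2,
                                    noise_std=0.0, disorder_std=0.0, seed=0):
            """N uncoupled LIF neurons driven by a weak common signal plus either
            dynamic noise (noise_std) or quenched heterogeneity in the baseline
            drive (disorder_std); returns the time-resolved population firing rate."""
            rng = np.random.default_rng(seed)
            steps = int(t_max / dt)
            t = np.arange(steps) * dt
            signal = 0.1 * np.sin(2 * np.pi * 5.0 * t)            # weak 5 Hz stimulus
            base = mu + disorder_std * rng.standard_normal(n)      # heterogeneous baselines
            v = np.zeros(n)
            rate = np.zeros(steps)
            for k in range(steps):
                noise = noise_std * rng.standard_normal(n) / np.sqrt(dt)
                v += dt * (-v + base + signal[k] + noise)          # membrane time constant = 1
                fired = v >= 1.0                                   # threshold 1, reset to 0
                v[fired] = 0.0
                rate[k] = fired.sum() / (n * dt)
            return t, signal, rate

        # Homogeneous noisy population vs. deterministic heterogeneous population.
        t, s, rate_noise = simulate_lif_population(noise_std=0.2)
        _, _, rate_hetero = simulate_lif_population(disorder_std=0.2)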

  4. INEL Waste and Environmental Information Integration Project approach and concepts

    International Nuclear Information System (INIS)

    Dean, L.A.; Fairbourn, P.J.; Randall, V.C.; Riedesel, A.M.

    1994-06-01

    The Idaho National Engineering Laboratory (INEL) Waste and Environmental Information Integration Project (IWEIIP) was established in December 1993 to address issues related to INEL waste and environmental information, including: Data quality; Data redundancy; Data accessibility; Data integration. This effort includes existing information, new development, and acquisition activities. Existing information may not be a database record; it may be an entire document (electronic, scanned, or hard-copy), a video clip, or a file cabinet of information. The IWEIIP will implement an effective integrated information framework to manage INEL waste and environmental information as an asset. This will improve data quality, resolve data redundancy, and increase data accessibility, thereby providing more effective utilization of the dollars spent on waste and environmental information.

  5. Using integrated information systems in supply chain management

    Science.gov (United States)

    Gonzálvez-Gallego, Nicolás; Molina-Castillo, Francisco-Jose; Soto-Acosta, Pedro; Varajao, Joao; Trigo, Antonio

    2015-02-01

    The aim of this paper is to empirically test not only the direct effects of information and communication technology (ICT) capabilities and integrated information systems (IS) on firm performance, but also the moderating role of IS integration along the supply chain in the relationship between external and internal ICT capabilities and business performance. Data collected from 102 large Iberian firms from Spain and Portugal are used to test the research model. Hierarchical multiple regression analysis is employed to test the direct effects and the moderating relationships proposed. Results show that external and internal ICT capabilities are important drivers of firm performance, while merely having integrated IS does not lead to better firm performance. In addition, a moderating effect of IS integration on the relationship between ICT capabilities and business performance is found, although this integration only contributes to firm performance when it is directed at connecting with suppliers or customers rather than at integrating the whole supply chain.
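
    To make the analysis strategy concrete, here is a small synthetic example of hierarchical (moderated) regression of the kind the abstract describes: main effects are entered first, then an interaction term whose contribution to R² indicates moderation. The data, coefficients and variable names are invented and do not come from the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 102                                    # same sample size as the study's 102 firms
        ict = rng.normal(size=n)                   # ICT capabilities (synthetic, standardized)
        is_int = rng.normal(size=n)                # degree of IS integration with partners
        perf = 0.5 * ict + 0.2 * is_int + 0.3 * ict * is_int + rng.normal(scale=0.5, size=n)

        def r_squared(X, y):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1 - resid.var() / y.var()

        # Hierarchical steps: main effects first, then the interaction (moderation) term.
        X_main = np.column_stack([np.ones(n), ict, is_int])
        X_full = np.column_stack([X_main, ict * is_int])
        print("R2 main effects:", round(r_squared(X_main, perf), 3))
        print("R2 with interaction:", round(r_squared(X_full, perf), 3))  # gain signals moderation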

  6. Standards to support information systems integration in anatomic pathology.

    Science.gov (United States)

    Daniel, Christel; García Rojo, Marcial; Bourquard, Karima; Henin, Dominique; Schrader, Thomas; Della Mea, Vincenzo; Gilbertson, John; Beckwith, Bruce A

    2009-11-01

    Integrating anatomic pathology information - text and images - into electronic health care records is a key challenge for enhancing clinical information exchange between anatomic pathologists and clinicians. The aim of the Integrating the Healthcare Enterprise (IHE) international initiative is precisely to ensure interoperability of clinical information systems by using existing widespread industry standards such as Digital Imaging and Communication in Medicine (DICOM) and Health Level Seven (HL7). To define standard-based informatics transactions to integrate anatomic pathology information to the Healthcare Enterprise. We used the methodology of the IHE initiative. Working groups from IHE, HL7, and DICOM, with special interest in anatomic pathology, defined consensual technical solutions to provide end-users with improved access to consistent information across multiple information systems. The IHE anatomic pathology technical framework describes a first integration profile, "Anatomic Pathology Workflow," dedicated to the diagnostic process including basic image acquisition and reporting solutions. This integration profile relies on 10 transactions based on HL7 or DICOM standards. A common specimen model was defined to consistently identify and describe specimens in both HL7 and DICOM transactions. The IHE anatomic pathology working group has defined standard-based informatics transactions to support the basic diagnostic workflow in anatomic pathology laboratories. In further stages, the technical framework will be completed to manage whole-slide images and semantically rich structured reports in the diagnostic workflow and to integrate systems used for patient care and those used for research activities (such as tissue bank databases or tissue microarrayers).

  7. [Research on medical instrument information integration technology based on IHE PCD].

    Science.gov (United States)

    Zheng, Jianli; Liao, Yun; Yang, Yongyong

    2014-06-01

    Integrating medical instruments with medical information systems is becoming more and more important in the healthcare industry. To give medical instruments without a standard communication interface the capability of interoperating and sharing information with medical information systems, we developed a medical instrument integration gateway based on the Integrating the Healthcare Enterprise Patient Care Device (IHE PCD) integration profiles in this research. The core component is an integration engine implemented according to the integration profiles and Health Level Seven (HL7) messages defined in IHE PCD. Working with instrument-specific JavaScript, the engine transforms medical instrument data into HL7 ORU messages. This research enables medical instruments to interoperate and exchange medical data with information systems in a standardized way, and is valuable for medical instrument integration, especially for traditional instruments.
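
    For orientation, the sketch below assembles the kind of HL7 v2 ORU^R01 message such a gateway might emit from device readings. The segment and field usage, identifiers and observation codes are illustrative placeholders and do not reproduce the exact IHE PCD profile.

        from datetime import datetime

        def build_oru_message(patient_id, device_id, observations):
            """Assemble a minimal pipe-delimited HL7 v2 ORU^R01 message from device
            readings; segment and field usage here is illustrative only and not the
            exact profile defined by IHE PCD."""
            ts = datetime.now().strftime("%Y%m%d%H%M%S")
            segments = [
                f"MSH|^~\\&|GATEWAY|{device_id}|HIS|HOSPITAL|{ts}||ORU^R01|{ts}|P|2.6",
                f"PID|1||{patient_id}",
                "OBR|1|||MONITORING",
            ]
            for i, (code, value, unit) in enumerate(observations, start=1):
                segments.append(f"OBX|{i}|NM|{code}||{value}|{unit}|||||F")
            return "\r".join(segments)

        msg = build_oru_message("123456", "ECG-01",
                                [("8867-4^HeartRate", 72, "bpm"),
                                 ("8480-6^SystolicBP", 118, "mmHg")])
        print(msg.replace("\r", "\n"))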

  8. Integrating Information & Communications Technologies into the Classroom

    Science.gov (United States)

    Tomei, Lawrence, Ed.

    2007-01-01

    "Integrating Information & Communications Technologies Into the Classroom" examines topics critical to business, computer science, and information technology education, such as: school improvement and reform, standards-based technology education programs, data-driven decision making, and strategic technology education planning. This book also…

  9. Meta-path based heterogeneous combat network link prediction

    Science.gov (United States)

    Li, Jichao; Ge, Bingfeng; Yang, Kewei; Chen, Yingwu; Tan, Yuejin

    2017-09-01

    The combat system-of-systems in high-tech informative warfare, composed of many interconnected combat systems of different types, can be regarded as a type of complex heterogeneous network. Link prediction for heterogeneous combat networks (HCNs) is of significant military value, as it facilitates reconfiguring combat networks to represent the complex real-world network topology as appropriate with observed information. This paper proposes a novel integrated methodology framework called HCNMP (HCN link prediction based on meta-path) to predict multiple types of links simultaneously for an HCN. More specifically, the concept of HCN meta-paths is introduced, through which the HCNMP can accumulate information by extracting different features of HCN links for all the six defined types. Next, an HCN link prediction model, based on meta-path features, is built to predict all types of links of the HCN simultaneously. Then, the solution algorithm for the HCN link prediction model is proposed, in which the prediction results are obtained by iteratively updating with the newly predicted results until the results in the HCN converge or reach a certain maximum iteration number. Finally, numerical experiments on the dataset of a real HCN are conducted to demonstrate the feasibility and effectiveness of the proposed HCNMP, in comparison with 30 baseline methods. The results show that the performance of the HCNMP is superior to those of the baseline methods.
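
    As a rough illustration of the meta-path idea underlying HCNMP, the sketch below counts instances of a given meta-path (a sequence of node types) between two nodes of a toy heterogeneous network; such counts can serve as link features for a downstream prediction model. The node types, edges and helper function are invented for the example and do not follow the paper's notation.

        # Toy heterogeneous network: sensor (S), decider (D) and influencer (I) nodes.
        edges = {("S1", "D1"), ("S2", "D1"), ("D1", "I1"), ("D1", "I2"), ("S1", "D2")}
        node_type = {"S1": "S", "S2": "S", "D1": "D", "D2": "D", "I1": "I", "I2": "I"}

        def metapath_count(src, dst, path_types):
            """Count instances of a meta-path (sequence of node types) from src to dst."""
            frontier = {src: 1}
            for t in path_types[1:]:
                nxt = {}
                for node, cnt in frontier.items():
                    for u, v in edges:
                        if u == node and node_type[v] == t:
                            nxt[v] = nxt.get(v, 0) + cnt
                frontier = nxt
            return frontier.get(dst, 0)

        # Feature for the candidate link (S1, I2): number of S->D->I paths connecting them.
        print(metapath_count("S1", "I2", ["S", "D", "I"]))   # -> 1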

  10. Broad knowledge of information technologies: a prerequisite for the effective management of the integrated information system

    Energy Technology Data Exchange (ETDEWEB)

    Landau, H.B.

    1980-09-01

    There is a trend towards the bringing together of various information technologies into integrated information systems. The managers of these total systems therefore must be familiar with each of the component technologies and how they may be combined into a total information system. To accomplish this, the effective manager should first define the overall system as an integrated flow of information with each step identified; then, the alternate technologies applicable to each step may be selected. Methods of becoming technologically aware are suggested and examples of integrated systems are discussed.

  11. H∞ Consensus for Multiagent Systems with Heterogeneous Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    Beibei Wang

    2013-01-01

    Full Text Available We apply the linear matrix inequality method to consensus and H∞ consensus problems of the single-integrator multiagent system with heterogeneous delays in directed networks. To overcome the difficulty caused by heterogeneous time-varying delays, we rewrite the multiagent system into a partially reduced-order system and an integral system. As a result, a particular Lyapunov function is constructed to derive sufficient conditions for consensus of multiagent systems with fixed (switched) topologies. We also apply this method to the H∞ consensus of multiagent systems with disturbances and heterogeneous delays. Numerical examples are given to illustrate the theoretical results.
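
    For orientation, a single-integrator consensus protocol with heterogeneous time-varying communication delays is commonly written as below; this is a generic textbook form consistent with the abstract, not necessarily the exact protocol or LMI conditions analyzed in the paper.

        \[
        \dot{x}_i(t) = \sum_{j \in \mathcal{N}_i} a_{ij}\,\bigl(x_j(t - \tau_{ij}(t)) - x_i(t)\bigr),
        \qquad 0 \le \tau_{ij}(t) \le \bar{\tau}_{ij},
        \]

    where $a_{ij}$ are the weights of the directed communication graph, $\tau_{ij}(t)$ is the heterogeneous delay on the link from agent $j$ to agent $i$, and consensus requires $x_i(t) - x_j(t) \to 0$ for all $i, j$.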

  12. Influence of neighbourhood information on 'Local Climate Zone' mapping in heterogeneous cities

    Science.gov (United States)

    Verdonck, Marie-Leen; Okujeni, Akpona; van der Linden, Sebastian; Demuzere, Matthias; De Wulf, Robert; Van Coillie, Frieke

    2017-10-01

    Local climate zone (LCZ) mapping is an emerging field in urban climate research. LCZs potentially provide an objective framework to assess urban form and function worldwide. The scheme is currently being used to globally map LCZs as a part of the World Urban Database and Access Portal Tools (WUDAPT) initiative. So far, most of the LCZ maps lack proper quantitative assessment, challenging the generic character of the WUDAPT workflow. Using the standard method introduced by the WUDAPT community, difficulties arose concerning the built zones due to high levels of heterogeneity. To overcome this problem, a contextual classifier is adopted in the mapping process. This paper quantitatively assesses the influence of neighbourhood information on the LCZ mapping result of three cities in Belgium: Antwerp, Brussels and Ghent. Overall accuracies for the maps were 85.7 ± 0.5, 79.6 ± 0.9 and 90.2 ± 0.4%, respectively. The approach presented here results in overall accuracies of 93.6 ± 0.2, 92.6 ± 0.3 and 95.6 ± 0.3% for Antwerp, Brussels and Ghent. The results thus indicate a positive influence of neighbourhood information for all study areas, with an increase in overall accuracies of 7.9, 13.0 and 5.4%. This paper reaches two main conclusions. Firstly, evidence was introduced on the relevance of a quantitative accuracy assessment in LCZ mapping, showing that the accuracies reported in previous papers are not easily achieved. Secondly, the method presented in this paper proves to be highly effective in Belgian cities, and given its open character shows promise for application in other heterogeneous cities worldwide.
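
    The core idea, adding neighbourhood (contextual) information to a per-pixel classifier, can be sketched as follows on synthetic data: each pixel's spectral features are augmented with a moving-window mean before training a random forest. This only illustrates the principle; the paper's actual contextual classifier, input features and training data differ.

        import numpy as np
        from scipy.ndimage import uniform_filter
        from sklearn.ensemble import RandomForestClassifier

        # Synthetic stand-in for an image stack (bands x rows x cols) and class labels.
        rng = np.random.default_rng(0)
        bands = rng.random((6, 200, 200))
        # Placeholder "LCZ" labels that depend on the local neighbourhood of band 0,
        # so contextual features genuinely carry information in this toy example.
        labels = (uniform_filter(bands[0], size=11) > 0.5).astype(int)

        pixel_feats = bands.reshape(6, -1).T
        context_feats = np.stack([uniform_filter(b, size=11) for b in bands]).reshape(6, -1).T

        y = labels.ravel()
        train = rng.random(y.size) < 0.01        # sparse training areas, as in WUDAPT practice
        test = ~train

        for name, X in [("pixel only", pixel_feats),
                        ("pixel + neighbourhood", np.hstack([pixel_feats, context_feats]))]:
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X[train], y[train])
            print(name, "accuracy:", round(clf.score(X[test], y[test]), 3))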

  13. Principles and core functions of integrated child health information systems.

    Science.gov (United States)

    Hinman, Alan R; Atkinson, Delton; Diehn, Tonya Norvell; Eichwald, John; Heberer, Jennifer; Hoyle, Therese; King, Pam; Kossack, Robert E; Williams, Donna C; Zimmerman, Amy

    2004-11-01

    Infants undergo a series of preventive and therapeutic health interventions and activities. Typically, each activity includes collection and submission of data to a dedicated information system. Subsequently, health care providers, families, and health programs must query each information system to determine the child's status in a given area. Efforts are underway to integrate information in these separate information systems. This requires specifying the core functions that integrated information systems must perform.

  14. CLASS-PAIR-GUIDED MULTIPLE KERNEL LEARNING OF INTEGRATING HETEROGENEOUS FEATURES FOR CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    Q. Wang

    2017-10-01

    Full Text Available In recent years, many studies on remote sensing image classification have shown that using multiple features from different data sources can effectively improve the classification accuracy. As a very powerful means of learning, multiple kernel learning (MKL) can conveniently be embedded in a variety of characteristics. The conventional combined kernel learned by MKL can be regarded as the compromise of all basic kernels for all classes in classification. It is the best of the whole, but not optimal for each specific class. For this problem, this paper proposes a class-pair-guided MKL method to integrate the heterogeneous features (HFs) from multispectral image (MSI) and light detection and ranging (LiDAR) data. In particular, the one-against-one strategy is adopted, which converts the multiclass classification problem into a plurality of two-class classification problems. Then, we select the best kernel from a pre-constructed set of basic kernels for each class pair by kernel alignment (KA) in the process of classification. The advantage of the proposed method is that only the best kernel for the classification of any two classes is retained, which leads to greatly enhanced discriminability. Experiments are conducted on two real data sets, and the experimental results show that the proposed method achieves the best performance in terms of classification accuracies in integrating the HFs for classification when compared with several state-of-the-art algorithms.
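
    A small numpy sketch of the kernel-alignment selection step described above: for a given two-class subproblem, each pre-computed base kernel is scored by its alignment with the ideal kernel yyᵀ and the best-aligned one is kept. The toy kernels and labels are invented; the paper's construction of base kernels from MSI and LiDAR features is not shown.

        import numpy as np

        def kernel_alignment(K1, K2):
            """Empirical alignment A(K1, K2) = <K1, K2>_F / (||K1||_F * ||K2||_F)."""
            return np.sum(K1 * K2) / (np.linalg.norm(K1) * np.linalg.norm(K2))

        def best_kernel_for_pair(kernels, y_pair):
            """Pick, from pre-computed base kernels, the one best aligned with the
            ideal kernel y y^T of a two-class (one-against-one) subproblem."""
            ideal = np.outer(y_pair, y_pair).astype(float)      # y in {-1, +1}
            scores = [kernel_alignment(K, ideal) for K in kernels]
            return int(np.argmax(scores))

        # Toy usage: two base kernels (say, one from MSI features, one from LiDAR features).
        y = np.array([1, 1, -1, -1])
        K_a = np.outer(y, y).astype(float)                      # perfectly aligned by construction
        K_b = np.eye(4)
        print(best_kernel_for_pair([K_a, K_b], y))              # -> 0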

  15. Curriculum integrated information literacy: a challenge

    DEFF Research Database (Denmark)

    Bønløkke, Mette; Kobow, Else; Kristensen, Anne-Kirstine Østergaard

    2012-01-01

    Information literacy is a competence needed for students and for practitioners in the nursing profession. A curriculum integrated intervention was qualitatively evaluated by focus group interviews of students, lecturers and the university librarian. Information literacy makes sense for students when it is linked to assignments, timed right, prepared, systematic and continuous. Support is needed to help students understand the meaning of seeking information, to focus their problem and to make them reflect on their search and its results. Feedback on materials used is also asked for...

  16. Repeat immigration: A previously unobserved source of heterogeneity?

    Science.gov (United States)

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  17. Tumor Heterogeneity: Mechanisms and Bases for a Reliable Application of Molecular Marker Design

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2012-02-01

    Full Text Available Tumor heterogeneity is a confusing finding in the assessment of neoplasms, potentially resulting in inaccurate diagnostic, prognostic and predictive tests. This tumor heterogeneity is not always a random and unpredictable phenomenon; knowledge of its mechanisms helps in designing better tests. The biologic reasons for this intratumoral heterogeneity would then be important for understanding both the natural history of neoplasms and the selection of test samples for reliable analysis. The main factors contributing to intratumoral heterogeneity by inducing gene abnormalities or modifying gene expression include: the gradient ischemic level within neoplasms, the action of the tumor microenvironment (bidirectional interaction between tumor cells and stroma), mechanisms of intercellular transference of genetic information (exosomes), and differential mechanisms of sequence-independent modifications of genetic material and proteins. Intratumoral heterogeneity is at the origin of tumor progression and is also the byproduct of the selection process during progression. Any analysis of heterogeneity mechanisms must be integrated within the process of segregation of genetic changes in tumor cells during the clonal expansion and progression of neoplasms. The evaluation of these mechanisms must also consider the redundancy and pleiotropism of molecular pathways, for which appropriate surrogate markers would support the presence or absence of heterogeneous genetics and the main mechanisms responsible. This knowledge would constitute a solid scientific background for future therapeutic planning.

  18. Linking transcriptional and genetic tumor heterogeneity through allele analysis of single-cell RNA-seq data.

    Science.gov (United States)

    Fan, Jean; Lee, Hae-Ock; Lee, Soohyun; Ryu, Da-Eun; Lee, Semin; Xue, Catherine; Kim, Seok Jin; Kim, Kihyun; Barkas, Nikolas; Park, Peter J; Park, Woong-Yang; Kharchenko, Peter V

    2018-06-13

    Characterization of intratumoral heterogeneity is critical to cancer therapy, as presence of phenotypically diverse cell populations commonly fuels relapse and resistance to treatment. Although genetic variation is a well-studied source of intratumoral heterogeneity, the functional impact of most genetic alterations remains unclear. Even less understood is the relative importance of other factors influencing heterogeneity, such as epigenetic state or tumor microenvironment. To investigate the relationship between genetic and transcriptional heterogeneity in a context of cancer progression, we devised a computational approach called HoneyBADGER to identify copy number variation and loss-of-heterozygosity in individual cells from single-cell RNA-sequencing data. By integrating allele and normalized expression information, HoneyBADGER is able to identify and infer the presence of subclone-specific alterations in individual cells and reconstruct the underlying subclonal architecture. Examining several tumor types, we show that HoneyBADGER is effective at identifying deletions, amplifications, and copy-neutral loss-of-heterozygosity events, and is capable of robustly identifying subclonal focal alterations as small as 10 megabases. We further apply HoneyBADGER to analyze single cells from a progressive multiple myeloma patient to identify major genetic subclones that exhibit distinct transcriptional signatures relevant to cancer progression. Surprisingly, other prominent transcriptional subpopulations within these tumors did not line up with the genetic subclonal structure, and were likely driven by alternative, non-clonal mechanisms. These results highlight the need for integrative analysis to understand the molecular and phenotypic heterogeneity in cancer. Published by Cold Spring Harbor Laboratory Press.

  19. The tsunami service bus, an integration platform for heterogeneous sensor systems

    Science.gov (United States)

    Haener, R.; Waechter, J.; Kriegel, U.; Fleischer, J.; Mueller, S.

    2009-04-01

    components remain unchanged, components can be maintained and evolved independently of each other, and service functionality as a whole can be reused. In GITEWS the functional integration pattern was adopted by applying the principles of an Enterprise Service Bus (ESB) as a backbone. Four services provided by the so-called Tsunami Service Bus (TSB), which are essential for early warning systems, are realized compliant to services specified within the Sensor Web Enablement (SWE) initiative of the Open Geospatial Consortium (OGC).
    3. ARCHITECTURE
    The integration platform was developed to access proprietary, heterogeneous sensor data and to provide them in a uniform manner for further use. Its core, the TSB, provides both a messaging backbone and messaging interfaces on the basis of a Java Messaging Service (JMS). The logical architecture of GITEWS consists of four independent layers:
    • A resource layer where physical or virtual sensors as well as data or model storages provide relevant measurement, event and analysis data. Utilizable for the TSB are any kind of data; in addition to sensors, databases, model data and processing applications are adopted. SWE specifies encodings both to access and to describe these data in a comprehensive way:
      1. Sensor Model Language (SensorML): standardized description of sensors and sensor data
      2. Observations and Measurements (O&M): model and encoding of sensor measurements
    • A service layer to collect and conduct data from heterogeneous and proprietary resources and provide them via standardized interfaces. The TSB enables interaction with sensors via the following services:
      1. Sensor Observation Service (SOS): standardized access to sensor data
      2. Sensor Planning Service (SPS): controlling of sensors and sensor networks
      3. Sensor Alert Service (SAS): active sending of data if defined events occur
      4. Web Notification Service (WNS): conduction of asynchronous dialogues between services
    • An orchestration layer where atomic services are composed and
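
    To illustrate the kind of standardized access the SOS provides, the snippet below issues an OGC SOS 1.0 GetObservation request over HTTP using key-value-pair parameters. The endpoint URL and the offering/property identifiers are placeholders, not the actual GITEWS identifiers.

        import requests

        # Placeholder SOS endpoint; parameter names follow the OGC SOS 1.0 KVP binding,
        # but the URL and the offering/property identifiers are not GITEWS identifiers.
        SOS_URL = "https://example.org/tsunami-sos/sos"
        params = {
            "service": "SOS",
            "version": "1.0.0",
            "request": "GetObservation",
            "offering": "SEA_LEVEL_STATIONS",
            "observedProperty": "urn:ogc:def:property:sea_level",
            "responseFormat": 'text/xml;subtype="om/1.0.0"',
        }
        response = requests.get(SOS_URL, params=params, timeout=30)
        print(response.status_code)    # the body would be an O&M observation document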

  20. Integrated design of intelligent surveillance systems and their user interface

    NARCIS (Netherlands)

    Toet, A.

    2005-01-01

    Modern complex surveillance systems consisting of multiple and heterogeneous sensors, automatic information registration and data analysis techniques, and decision support tools should provide the human operator with an integrated, transparent and easily comprehensible view of the surveyed scene.

  1. Heterogeneous patterns enhancing static and dynamic texture classification

    International Nuclear Information System (INIS)

    Silva, Núbia Rosa da; Martinez Bruno, Odemir

    2013-01-01

    Some mixtures, such as colloids like milk, blood, and gelatin, have a homogeneous appearance when viewed with the naked eye; however, observing them at the nanoscale makes it possible to understand the heterogeneity of their components. The same phenomenon can occur in pattern recognition, in which it is possible to see heterogeneous patterns in texture images. However, current methods of texture analysis cannot adequately describe such heterogeneous patterns. Common methods used by researchers analyse the image information in a global way, taking all its features in an integrated manner. Furthermore, multi-scale analysis verifies the patterns at different scales, but still preserves the homogeneous analysis. On the other hand, various methods use textons to represent the texture, breaking texture down into its smallest unit. To tackle this problem, we propose a method to identify texture patterns, not as small as textons, at distinct scales, enhancing the separability among different types of texture. We find sub-patterns of texture according to the scale and then group similar patterns for a more refined analysis. Tests were performed on four static texture databases and one dynamic one. Results show that our method provides better classification rates compared with conventional approaches for both static and dynamic textures.

  2. Association of Informal Clinical Integration of Physicians With Cardiac Surgery Payments.

    Science.gov (United States)

    Funk, Russell J; Owen-Smith, Jason; Kaufman, Samuel A; Nallamothu, Brahmajee K; Hollingsworth, John M

    2018-05-01

    To reduce inefficiency and waste associated with care fragmentation, many current programs target greater clinical integration among physicians. However, these programs have led to only modest Medicare spending reductions. Most programs focus on formal integration, which often bears little resemblance to actual physician interaction patterns. To examine how physician interaction patterns vary between health systems and to assess whether variation in informal integration is associated with care delivery payments. National Medicare data from January 1, 2008, through December 31, 2011, identified 253 545 Medicare beneficiaries (aged ≥66 years) from 1186 health systems where Medicare beneficiaries underwent coronary artery bypass grafting (CABG) procedures. Interactions were mapped between all physicians who treated these patients - including primary care physicians and surgical and medical specialists - within a health system during their surgical episode. The level of informal integration was measured in these networks of interacting physicians. Multivariate regression models were fitted to evaluate associations between payments for each surgical episode made on a beneficiary's behalf and the level of informal integration in the health system where the patient was treated. The informal integration level of a health system. Price-standardized total surgical episode and component payments. The total of 253 545 study participants included 175 520 men (69.2%; mean [SD] age, 74.51 [5.75] years) and 78 024 women (34.3%; 75.67 [5.91] years). One beneficiary of the 253 545 participants did not have sex information. The low level of informal clinical integration included 84 598 patients (33.4%; mean [SD] age, 75.00 [5.93] years); medium level, 84 442 (33.30%; 74.94 [5.87] years); and high level, 84 505 (33.34%; 74.66 [5.72] years). Integration levels varied across health systems. After adjusting for patient, health-system, and community factors, higher levels

  3. Development Trends of Cartography and Geographic Information Engineering

    Directory of Open Access Journals (Sweden)

    WANG Jiayao

    2010-04-01

    Full Text Available Aimed at the problems of cartography and geographic information engineering and the increasing demands of national and military informationization construction, the paper proposes six hotspots for future research in cartography and geographic information engineering, based on an analysis of the development track of cartography: heterogeneous geospatial data assimilation; transferring the emphasis from geographic information gaining to user-oriented geographic information deep processing; web or grid geographic information services; intelligent spatial data generalization; integration of GIS and VGE; and a cartography and geographic information engineering theory system with multi-mode (Map, GIS, VGE) spatial-temporal integrated cognition as the core. It also discusses the necessity, existing groundwork and research contents of studying these hotspots.

  4. Spatial and molecular resolution of diffuse malignant mesothelioma heterogeneity by integrating label-free FTIR imaging, laser capture microdissection and proteomics

    Science.gov (United States)

    Großerueschkamp, Frederik; Bracht, Thilo; Diehl, Hanna C.; Kuepper, Claus; Ahrens, Maike; Kallenbach-Thieltges, Angela; Mosig, Axel; Eisenacher, Martin; Marcus, Katrin; Behrens, Thomas; Brüning, Thomas; Theegarten, Dirk; Sitek, Barbara; Gerwert, Klaus

    2017-03-01

    Diffuse malignant mesothelioma (DMM) is a heterogeneous malignant neoplasia manifesting with three subtypes: epithelioid, sarcomatoid and biphasic. DMM exhibits a high degree of spatial heterogeneity that complicates a thorough understanding of the underlying different molecular processes in each subtype. We present a novel approach to spatially resolve the heterogeneity of a tumour in a label-free manner by integrating FTIR imaging and laser capture microdissection (LCM). Subsequent proteome analysis of the dissected homogeneous samples provides, in addition, molecular resolution. FTIR imaging resolves tumour subtypes within tissue thin-sections in an automated and label-free manner with an accuracy of about 85% for DMM subtypes. Even in highly heterogeneous tissue structures, our label-free approach can identify small regions of interest, which can be dissected as homogeneous samples using LCM. Subsequent proteome analysis provides a location-specific molecular characterization. Applied to DMM subtypes, we identify 142 differentially expressed proteins, including five protein biomarkers commonly used in DMM immunohistochemistry panels. Thus, FTIR imaging resolves not only morphological alterations within tissue but also alterations at the level of single proteins in tumour subtypes. Our fully automated workflow, FTIR-guided LCM, opens new avenues for collecting homogeneous samples for precise and predictive biomarkers from omics studies.

  5. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Directory of Open Access Journals (Sweden)

    David Balduzzi

    2008-06-01

    Full Text Available This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks

  6. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    Science.gov (United States)

    Balduzzi, David; Tononi, Giulio

    2008-06-13

    This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. (iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized

  7. InterMine: a flexible data warehouse system for the integration and analysis of heterogeneous biological data.

    Science.gov (United States)

    Smith, Richard N; Aleksic, Jelena; Butano, Daniela; Carr, Adrian; Contrino, Sergio; Hu, Fengyuan; Lyne, Mike; Lyne, Rachel; Kalderimis, Alex; Rutherford, Kim; Stepan, Radek; Sullivan, Julie; Wakeling, Matthew; Watkins, Xavier; Micklem, Gos

    2012-12-01

    InterMine is an open-source data warehouse system that facilitates the building of databases with complex data integration requirements and a need for a fast customizable query facility. Using InterMine, large biological databases can be created from a range of heterogeneous data sources, and the extensible data model allows for easy integration of new data types. The analysis tools include a flexible query builder, genomic region search and a library of 'widgets' performing various statistical analyses. The results can be exported in many commonly used formats. InterMine is a fully extensible framework where developers can add new tools and functionality. Additionally, there is a comprehensive set of web services, for which client libraries are provided in five commonly used programming languages. Freely available from http://www.intermine.org under the LGPL license. g.micklem@gen.cam.ac.uk Supplementary data are available at Bioinformatics online.
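
    For orientation, the snippet below shows the kind of HTTP call that the InterMine web services expose and that the client libraries wrap. The mine URL is a placeholder, and the exact endpoint path, parameters and response fields are assumptions here; consult the InterMine web-service documentation rather than this sketch.

        import requests

        # Placeholder mine root; a real deployment exposes its web services under ".../service".
        MINE = "https://example.org/somemine/service"

        # Keyword search endpoint (path assumed; check the InterMine docs for specifics).
        resp = requests.get(f"{MINE}/search", params={"q": "zen", "format": "json"}, timeout=30)
        resp.raise_for_status()
        for hit in resp.json().get("results", [])[:5]:
            print(hit)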

  8. Earth science information: Planning for the integration and use of global change information

    Science.gov (United States)

    Lousma, Jack R.

    1992-01-01

    Activities and accomplishments of the first six months of the Consortium for International Earth Science Information Network's (CIESIN) 1992 technical program have focused on four main missions: (1) the development and implementation of plans for initiation of the Socioeconomic Data and Applications Center (SEDAC) as part of the EOSDIS Program; (2) the pursuit and development of a broad-based global change information cooperative by providing systems analysis and integration between natural science and social science data bases held by numerous federal agencies and other sources; (3) the fostering of scientific research into the human dimensions of global change and providing integration between natural science and social science data and information; and (4) the serving of CIESIN as a gateway for global change data and information distribution through development of the Global Change Research Information Office and other comprehensive knowledge sharing systems.

  9. Information delivery manuals to integrate building product information into design

    DEFF Research Database (Denmark)

    Berard, Ole Bengt; Karlshøj, Jan

    2011-01-01

    Despite continuing BIM progress, professionals in the AEC industry often lack the information they need to perform their work. Although this problem could be alleviated by information systems similar to those in other industries, companies struggle to model processes and information needs ... them in information systems. BIM implies that objects are bearers of information and logic. The present study has three main aims: (1) to explore the IDM's capability to capture all four perspectives, (2) to determine whether an IDM's collaborative methodology is valid for developing standardized processes ..., and (3) to ascertain whether IDM's business rules can support the development of information and logic-bearing BIM objects. The research is based on a case study of re-engineering the bidding process for a design-build project to integrate building product manufacturers, subcontractors...

  10. Integrated plant information technology design support functionality

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Kim, Dae Jin; Barber, P. W.; Goland, D.

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, which was carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. From 1987, AECL had endeavoured to change its engineering work process from a paper-based work process to a computer-based work process through the CANDU 3 project. Even though AECL obtained a lot of good results from computerizing Process Engineering, Instrumentation Control and Electrical Engineering, Mechanical Engineering, Computer Aided Design and Drafting, and the Document Management System, the problem of information isolation and integration remained. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer aided engineering tasks and work flow, investigating requirements for implementing integrated computer aided engineering, and describing Korean requirements for future CANDU designs including CANDU 9. 6 figs. (Author)

  11. Integrated plant information technology design support functionality

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon Seung; Kim, Dae Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Barber, P W; Goland, D [Atomic Energy Canada Ltd., (Canada)

    1996-06-01

    This technical report was written as a result of the Integrated Plant Information System (IPIS) feasibility study on the CANDU 9 project, which was carried out from January 1994 to March 1994 at AECL (Atomic Energy of Canada Limited) in Canada. From 1987, AECL had endeavoured to change its engineering work process from a paper-based work process to a computer-based work process through the CANDU 3 project. Even though AECL obtained a lot of good results from computerizing Process Engineering, Instrumentation Control and Electrical Engineering, Mechanical Engineering, Computer Aided Design and Drafting, and the Document Management System, the problem of information isolation and integration remained. In this feasibility study, an IPIS design support functionality guideline was suggested by evaluating current AECL CAE tools, analyzing computer aided engineering tasks and work flow, investigating requirements for implementing integrated computer aided engineering, and describing Korean requirements for future CANDU designs including CANDU 9. 6 figs. (Author).

  12. Green heterogeneous wireless networks

    CERN Document Server

    Ismail, Muhammad; Nee, Hans-Peter; Qaraqe, Khalid A; Serpedin, Erchin

    2016-01-01

    This book focuses on the emerging research topic "green (energy efficient) wireless networks", which has recently drawn huge attention from both academia and industry. This topic is highly motivated by important environmental, financial, and quality-of-experience (QoE) considerations. Specifically, the high energy consumption of wireless networks is reflected in approximately 2% of all CO2 emissions worldwide. This book presents the authors' visions and solutions for the deployment of energy efficient (green) heterogeneous wireless communication networks. The book consists of three major parts. The first part provides an introduction to the "green networks" concept, the second part targets the green multi-homing resource allocation problem, and the third part presents a novel deployment of device-to-device (D2D) communications and its successful integration in Heterogeneous Networks (HetNets). The book is novel in that it specifically targets green networking in a heterogeneous wireless medium, which re...

  13. Integrated occupational radiation exposure information system

    International Nuclear Information System (INIS)

    Hunt, H.W.

    1983-06-01

    The integrated Occupational Radiation Exposure (ORE) database information system has many advantages. Radiation exposure information is available to operating management in a more timely manner and in a more flexible mode. The ORE system has permitted the integration of scattered files and data to be stored using a more cost-effective method that permits easy and simultaneous access by a variety of users with different data needs. The external storage needs of the radiation exposure source documents are several orders of magnitude less through the use of the computer assisted retrieval techniques employed in the ORE system. Groundwork is being laid to automate the historical files, which are maintained to help describe the radiation protection programs and policies at any one point in time. The file unit will be microfilmed for topical indexing on the ORE database.

  14. A link prediction method for heterogeneous networks based on BP neural network

    Science.gov (United States)

    Li, Ji-chao; Zhao, Dan-ling; Ge, Bing-Feng; Yang, Ke-Wei; Chen, Ying-Wu

    2018-04-01

    Most real-world systems, composed of different types of objects connected via many interconnections, can be abstracted as various complex heterogeneous networks. Link prediction for heterogeneous networks is of great significance for mining missing links and reconfiguring networks according to observed information, with considerable applications in, for example, friend and location recommendations and disease-gene candidate detection. In this paper, we put forward a novel integrated framework, called MPBP (Meta-Path feature-based BP neural network model), to predict multiple types of links for heterogeneous networks. More specifically, the concept of meta-path is introduced, followed by the extraction of meta-path features for heterogeneous networks. Next, based on the extracted meta-path features, a supervised link prediction model is built with a three-layer BP neural network. Then, the solution algorithm of the proposed link prediction model is put forward to obtain predicted results by iteratively training the network. Last, numerical experiments on datasets of a gene-disease network and a combat network are conducted to verify the effectiveness and feasibility of the proposed MPBP. The results show that the MPBP performs very well and is superior to the baseline methods.
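
    A compact sketch of the supervised step described above: meta-path counts for candidate node pairs are fed to a three-layer (one hidden layer) back-propagation network. The synthetic features, labels and hyperparameters are invented for illustration and do not reproduce the paper's setup.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        # Rows: candidate node pairs; columns: counts of different meta-path types between them.
        X = rng.poisson(lam=2.0, size=(500, 6)).astype(float)
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=500) > 3.0).astype(int)

        # One hidden layer gives the input-hidden-output (three-layer) BP network of the paper.
        model = MLPClassifier(hidden_layer_sizes=(16,), activation="logistic",
                              max_iter=2000, random_state=0)
        model.fit(X[:400], y[:400])
        print("held-out accuracy:", round(model.score(X[400:], y[400:]), 3))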

  15. 48 CFR 9.104-6 - Federal Awardee Performance and Integrity Information System.

    Science.gov (United States)

    2010-10-01

    ... Performance and Integrity Information System. 9.104-6 Section 9.104-6 Federal Acquisition Regulations System... Contractors 9.104-6 Federal Awardee Performance and Integrity Information System. (a) Before awarding a... Federal Awardee Performance and Integrity Information System (FAPIIS), (available at www.ppirs.gov, then...

  16. Assessment of Integrated Information System (IIS) in organization ...

    African Journals Online (AJOL)

    Assessment of Integrated Information System (IIS) in organization. ... to enable the Information System (IS) managers, as well as top management to understand the ... since organisational and strategic aspects in IIS should also be considered.

  17. Design of the Hospital Integrated Information Management System Based on Cloud Platform.

    Science.gov (United States)

    Aijing, L; Jin, Y

    2015-12-01

    At present, the outdated information management style cannot meet the needs of hospital management and has become a bottleneck for hospitals' management and development. In order to improve the integrated management of information, hospitals have increased their investment in integrated information management systems. On account of a lack of reasonable and scientific design, some hospital integrated information management systems have common problems, such as an unfriendly interface, poor portability and maintainability, low security and efficiency, and a lack of interactivity and information sharing. To solve these problems, this paper carries out the research and design of a hospital information management system based on a cloud platform, which can realize the optimized integration of hospital information resources and reduce costs.

  18. Development of Integrated Information System for Travel Bureau Company

    Science.gov (United States)

    Karma, I. G. M.; Susanti, J.

    2018-01-01

    Related to the effectiveness of decision-making by the management of a travel bureau company, especially by managers, information is frequently delayed or incomplete. Although already computer-assisted, each existing application handles only one particular activity and is not integrated with the others. This research is intended to produce an integrated information system that handles the overall operational activities of the company. By applying an object-oriented system development approach, the system is built with the Visual Basic .NET programming language and the MySQL database package. The result is a system that consists of 4 (four) separate program packages, including a Reservation System, AR System, AP System and Accounting System. Based on the output, we can conclude that this system is able to produce integrated information related to reservation, operational and financial matters, providing up-to-date information to support operational activities and the decision-making process by related parties.

  19. Heterogeneous computing with OpenCL

    CERN Document Server

    2013-01-01

    Heterogeneous Computing with OpenCL teaches OpenCL and parallel programming for complex systems that may include a variety of device architectures: multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units (APUs) such as AMD Fusion technology. Designed to work on multiple platforms and with wide industry support, OpenCL will help you more effectively program for a heterogeneous future. Written by leaders in the parallel computing and OpenCL communities, this book will give you hands-on OpenCL experience to address a range of fundamental parallel algorithms. The authors explore memory spaces, optimization techniques, graphics interoperability, extensions, and debugging and profiling. Intended to support a parallel programming course, Heterogeneous Computing with OpenCL includes detailed examples throughout, plus additional online exercises and other supporting materials.

  20. Proof-of-Concept of a Millimeter-Wave Integrated Heterogeneous Network for 5G Cellular.

    Science.gov (United States)

    Okasaka, Shozo; Weiler, Richard J; Keusgen, Wilhelm; Pudeyev, Andrey; Maltsev, Alexander; Karls, Ingolf; Sakaguchi, Kei

    2016-08-25

    The fifth-generation mobile networks (5G) will not only enhance mobile broadband services, but also enable connectivity for a massive number of Internet-of-Things devices, such as wireless sensors, meters or actuators. Thus, 5G is expected to achieve a 1000-fold or more increase in capacity over 4G. The use of the millimeter-wave (mmWave) spectrum is a key enabler for allowing 5G to achieve such an enhancement in capacity. To fully utilize the mmWave spectrum, 5G is expected to adopt a heterogeneous network (HetNet) architecture, wherein mmWave small cells are overlaid onto a conventional macro-cellular network. In the mmWave-integrated HetNet, splitting of the control plane (CP) and user plane (UP) will allow continuous connectivity and increase the capacity of the mmWave small cells. mmWave communication can be used not only for access linking, but also for wireless backhaul linking, which will facilitate the installation of mmWave small cells. In this study, a proof-of-concept (PoC) was conducted to demonstrate the practicality of a prototype mmWave-integrated HetNet, using mmWave technologies for both backhaul and access.

  1. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.
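
    The sparsity-inducing penalization mentioned above (used when features far outnumber observations) can be illustrated with a short sketch. The data, feature count, and penalty strength below are hypothetical and only demonstrate the general technique, not the methods of the reviewed publications.

      # Minimal sketch: L1-penalized (sparse) regression when p >> n,
      # as commonly used when integrating many omics features.
      # Synthetic data; not from the reviewed studies.
      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      n_samples, n_features = 40, 500          # far more features than observations
      X = rng.normal(size=(n_samples, n_features))
      true_coef = np.zeros(n_features)
      true_coef[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]   # only a few features matter
      y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)

      model = Lasso(alpha=0.1)                 # L1 penalty drives most coefficients to zero
      model.fit(X, y)
      selected = np.flatnonzero(model.coef_)
      print("features kept by the L1 penalty:", selected)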

  2. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    OpenAIRE

    Seok-Hyoung Lee; Hwan-Min Kim; Ho-Seop Choe

    2012-01-01

    While science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. Results of technologists' research, such as journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to ach...

  3. Nuclear plants gain integrated information systems

    International Nuclear Information System (INIS)

    Villavicencio-Ramirez, A.; Rodriquez-Alvarez, J.M.

    1994-01-01

    With the objective of simplifying the complex mesh of computing devices employed within nuclear power plants, modern technology and integration techniques are being used to form centralized (but backed up) databases and distributed processing and display networks. Benefits are immediate as a result of the integration and the use of standards. The use of a unique data acquisition and database subsystem optimizes the high costs of engineering, as this task is done only once for the life span of the system. This also contributes towards a uniform user interface and allows for graceful expansion and maintenance. This article features an integrated information system, Sistema Integral de Informacion de Proceso (SIIP). The development of this system enabled the Laguna Verde Nuclear Power plant to fully use the already existing universe of signals and its related engineering during all plant conditions, namely, start up, normal operation, transient analysis, and emergency operation. Integrated systems offer many advantages over segregated systems, and this experience should benefit similar development efforts in other electric power utilities, not only for nuclear but also for other types of generating plants

  4. The impact of IAIMS on the work of information experts. Integrated Advanced Information Management Systems.

    Science.gov (United States)

    Ash, J

    1995-10-01

    Integrated Advanced Information Management Systems (IAIMS) programs differ but have certain characteristics in common. Technological and organizational integration are universal goals. As integration takes place, what happens to those implementing the vision? A survey of 125 staff members, or information experts, involved in information or informatics at an IAIMS-funded institution was conducted during the last year of the implementation phase. The purpose was to measure the impact of IAIMS on the jobs of those in the library and related service units, and the computing, telecommunications, and health informatics divisions. The researchers used newly developed scales measuring levels of integration (knowledge of and involvement with other departments), customer orientation (focus on the user), and informatedness (changes in the nature of work beyond automation of former routines). Ninety-four percent of respondents indicated that their jobs had changed a great deal; the changes were similar regardless of division. To further investigate the impact of IAIMS on librarians in particular, a separate skills survey was conducted. The IAIMS librarians indicated that technology and training skills are especially needed in the new, integrated environment.

  5. Information Science and integrative Science. A systemic approach to information units

    Directory of Open Access Journals (Sweden)

    Rita Dolores Santaella Ruiz

    2006-01-01

    Full Text Available Structured in two parts, Documentation as an integrating science and a systems approach to documentary units, this work understands Documentation from an integrating perspective derived from the kinship implied by a shared modus operandi across information systems through the use of communication technologies. Drawing on the General Theory of Systems, the present work interprets this multidisciplinary science as a system formed by technical subsystems, elements and individuals.

  6. Equilibria, information and frustration in heterogeneous network games with conflicting preferences

    Science.gov (United States)

    Mazzoli, M.; Sánchez, A.

    2017-11-01

    Interactions between people are the basis on which the structure of our society arises as a complex system and, at the same time, are the starting point of any physical description of it. In the last few years, much theoretical research has addressed this issue by combining the physics of complex networks with a description of interactions in terms of evolutionary game theory. We here take this research a step further by introducing a most salient societal factor such as the individuals’ preferences, a characteristic that is key to understanding much of the social phenomenology these days. We consider a heterogeneous, agent-based model in which agents interact strategically with their neighbors, but their preferences and payoffs for the possible actions differ. We study how such a heterogeneous network behaves under evolutionary dynamics and different strategic interactions, namely coordination games and best shot games. With this model we study the emergence of the equilibria predicted analytically in random graphs under best response dynamics, and we extend this test to unexplored contexts like proportional imitation and scale free networks. We show that some theoretically predicted equilibria do not arise in simulations with incomplete information, and we demonstrate the importance of the graph topology and the payoff function parameters for some games. Finally, we discuss our results with the available experimental evidence on coordination games, showing that our model agrees better with the experiment than standard economic theories, and draw hints as to how to maximize social efficiency in situations of conflicting preferences.
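
    As a rough illustration of the best-response dynamics on a network discussed above, the following sketch runs a two-action coordination game on a random graph. The payoff rule, graph size, and update scheme are illustrative assumptions, not the authors' exact model (which also covers heterogeneous preferences, proportional imitation, and best shot games).

      # Sketch: synchronous best-response dynamics for a pure coordination game
      # on an Erdos-Renyi random graph. Parameters are illustrative only.
      import random
      import networkx as nx

      random.seed(1)
      G = nx.erdos_renyi_graph(n=50, p=0.1, seed=1)
      action = {v: random.choice([0, 1]) for v in G.nodes}   # initial actions

      def best_response(v):
          """Pick the action matching the majority of v's neighbours (coordination payoff = number of matches)."""
          nbrs = list(G.neighbors(v))
          if not nbrs:
              return action[v]
          ones = sum(action[u] for u in nbrs)
          return 1 if ones > len(nbrs) / 2 else 0 if ones < len(nbrs) / 2 else action[v]

      for _ in range(20):                                    # iterate towards a (possible) equilibrium
          action = {v: best_response(v) for v in G.nodes}

      print("fraction playing action 1:", sum(action.values()) / G.number_of_nodes())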

  7. The integrated approach methodology for operator information evaluation

    International Nuclear Information System (INIS)

    Stroube, K.; Modarres, M.; Roush, M.; Hunt, N.; Pearce, R.

    1986-01-01

    The Integrated Approach has developed a complete method for evaluating the relative importance of operator information improvements. By use of decision trees, the impact of information on the success probability of a function or system can be evaluated. This approach couples goal trees and human success likelihoods to estimate the anticipated consequences of a given information system.

  8. Using web services for linking genomic data to medical information systems.

    Science.gov (United States)

    Maojo, V; Crespo, J; de la Calle, G; Barreiro, J; Garcia-Remesal, M

    2007-01-01

    To develop a new perspective for biomedical information systems, regarding the introduction of ideas, methods and tools related to the new scenario of genomic medicine. Technological aspects related to the analysis and integration of heterogeneous clinical and genomic data include mapping clinical and genetic concepts, potential future standards or the development of integrated biomedical ontologies. In this clinicomics scenario, we describe the use of Web services technologies to improve access to and integrate different information sources. We give a concrete example of the use of Web services technologies: the OntoFusion project. Web services provide new biomedical informatics (BMI) approaches related to genomic medicine. Customized workflows will aid research tasks by linking heterogeneous Web services. Two significant examples of these European Commission-funded efforts are the INFOBIOMED Network of Excellence and the Advancing Clinico-Genomic Trials on Cancer (ACGT) integrated project. Supplying medical researchers and practitioners with omics data and biologists with clinical datasets can help to develop genomic medicine. BMI is contributing by providing the informatics methods and technological infrastructure needed for these collaborative efforts.

  9. Conceptual information processing: A robust approach to KBS-DBMS integration

    Science.gov (United States)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  10. Development of the Integrated Information Technology System

    National Research Council Canada - National Science Library

    2005-01-01

    The Integrated Medical Information Technology System (IMITS) Program is focused on implementation of advanced technology solutions that eliminate inefficiencies, increase utilization and improve quality of care for active duty forces...

  11. Management of information in development projects – a proposed integrated model

    Directory of Open Access Journals (Sweden)

    C. Bester

    2008-11-01

    Full Text Available The first section of the article focuses on the need for development in Africa and the specific challenges of development operations. It describes the need for a holistic and integrated information management model as part of the project management body of knowledge aimed at managing the information flow between communities and development project teams. It is argued that information, and access to information, is crucial in development projects and can therefore be seen as a critical success factor in any development project. In the second section of the article, the three information areas of the holistic and integrated information management model are described. In the section thereafter we suggest roles and actions for information managers to facilitate information processes integral to the model. These processes seek to create a developing information community that aligns itself with the development project, and supports and sustains it.

  12. Adsorption of gas mixtures on heterogeneous solid surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Jaroniec, M; Rudzinski, W

    1977-01-01

    A review of theoretical studies on the physical adsorption from gas mixtures on heterogeneous solid surfaces, mainly by Jaroniec and coworkers, covers the vector notation used in the calculations; adsorption isotherms for multicomponent gases; the generalized integral equation for adsorption of gas mixtures, its numerical and analytical solutions, applied, e.g., to interpreting the experimental adsorption isotherms of ethane/ethylene on Nuxit-AL; thermodynamic relations, applied, e.g., to calculating isosteric adsorption heats from experimental parameters for the adsorption of propylene from propane/propylene mixtures on Nuxit-AL; and the derivation and use of a simplified integral equation for describing the adsorption from gas mixtures on heterogeneous surfaces. 75 references.
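
    The integral-equation formalism referred to here is, in the standard treatment of adsorption on energetically heterogeneous surfaces, an averaging of a local isotherm over the adsorption-energy distribution. A generic single-component form (with notation chosen for illustration, not the authors' exact multicomponent equation) is:

      \[
      \theta_t(p,T) \;=\; \int_{\Omega} \theta_\ell(p,T,\varepsilon)\,\chi(\varepsilon)\,\mathrm{d}\varepsilon,
      \qquad
      \int_{\Omega} \chi(\varepsilon)\,\mathrm{d}\varepsilon = 1,
      \]

    where theta_t is the overall surface coverage, theta_l the local isotherm on sites of adsorption energy epsilon, and chi the energy distribution over the domain Omega; for gas mixtures the local isotherm and the distribution become multicomponent functions of the partial pressures and a vector of adsorption energies.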

  13. FY1995 study on three-dimensional integrated information environment toward human media; 1995 nendo human media e muketa sanjigen togo joho kankyo no kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    In the next generation media environment, it is required to remove the boundary between virtual and real environments. The integration of these heterogeneous environments will enhance the applicability and availability of the human media. The aim of this work is to pioneer the new technology of a 3-D integrated information environment in which both virtual and real environments are embedded, and to give a guide to the construction of human media. Our results consist of three parts as follows: (1) As a benchmark of the 3-D integrated information environment, the immersive television which has surrounding multi-projection displays was investigated. (2) A new method to synthesize arbitrary 3-D viewpoint images from 2-D real images was developed. On the other hand, a new concept of ray data description was introduced to represent the whole visual data of 3-D real space. In the new concept, the whole visual data is treated as a set of ray data. New methods for superimposing and handling ray data were proposed. The potential applicability of the methods was clarified. (3) In order to enhance the reality of operations under the virtual environment, quantitative analysis was performed assuming that the HMD (Head Mounted Display) was used for displaying 3-D space information. (NEDO)

  14. High-Throughput Multiple Dies-to-Wafer Bonding Technology and III/V-on-Si Hybrid Lasers for Heterogeneous Integration of Optoelectronic Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Xianshu eLuo

    2015-04-01

    Full Text Available An integrated optical light source on silicon is one of the key building blocks for optical interconnect technology. Great research efforts have been devoted worldwide to exploring various approaches to integrating optical light sources onto silicon substrates. The achievements so far include the successful demonstration of III/V-on-Si hybrid lasers through bonding of III/V gain material to silicon wafers. However, for potential large-scale integration that leverages mature silicon complementary metal oxide semiconductor (CMOS) fabrication technology and infrastructure, a more effective bonding scheme with high bonding yield is in great demand from a manufacturing perspective. In this paper, we propose and demonstrate a high-throughput multiple dies-to-wafer (D2W) bonding technology, which is then applied to the demonstration of hybrid silicon lasers. By temporarily bonding III/V dies to a handle silicon wafer for simultaneous batch processing, a practically unlimited number of III/V dies is expected to be bonded to a silicon device wafer with high yield. As proof-of-concept, bonding of more than 100 III/V dies to a 200 mm silicon wafer is demonstrated. The high quality of the bonding interface is examined with various characterization techniques. Repeatable demonstrations of bonding 16 III/V dies to pre-patterned 200 mm silicon wafers have been performed for various hybrid silicon lasers, with a device library including a Fabry-Perot (FP) laser, a lateral-coupled distributed feedback (LC-DFB) laser with sidewall gratings, and a mode-locked laser (MLL). From these results, the presented multiple D2W bonding technology can be a key enabler towards the large-scale heterogeneous integration of optoelectronic integrated circuits (H-OEIC).

  15. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that can collect data, such as images and voice, from their surroundings. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  16. Results of the Collaborative Energy and Water Cycle Information Services (CEWIS) Workshop on Heterogeneous Dataset Analysis Preparation

    Science.gov (United States)

    Kempler, Steven; Teng, William; Acker, James; Belvedere, Deborah; Liu, Zhong; Leptoukh, Gregory

    2010-01-01

    In support of the NASA Energy and Water Cycle Study (NEWS), the Collaborative Energy and Water Cycle Information Services (CEWIS), sponsored by NEWS Program Manager Jared Entin, was initiated to develop an evolving set of community-based data and information services that would facilitate users to locate, access, and bring together multiple distributed heterogeneous energy and water cycle datasets. The CEWIS workshop, June 15-16, 2010, at NASA/GSFC, was the initial step of the process, starting with identifying and scoping the issues, as defined by the community.

  17. A RuleML Study on Integrating Geographical and Health Information

    DEFF Research Database (Denmark)

    Gao, Sheng; Mioc, Darka; Boley, Harold

    2008-01-01

    To facilitate health surveillance, flexible ways to represent, integrate, and deduce health information become increasingly important. In this paper, an ontology is used to support the semantic definition of spatial, temporal and thematic factors of health information. The ontology is realized as an interchangeable RuleML knowledge base, consisting of facts and rules. Rules are also used for integrating geographical and health information. The implemented eHealthGeo system uses the OO jDREW reasoning engine to deduce implicit information such as spatial relationships. The system combines this with spatial...

  18. Research on monitoring and management information integration technique in waste treatment and management

    International Nuclear Information System (INIS)

    Kong Jinsong; Yu Ren; Mao Wei

    2013-01-01

    Integrating waste treatment process and device-status monitoring information with management information is a key problem to be solved in the information integration of waste treatment and management. The main content of monitoring and management information integration is discussed in the paper. Data exchange techniques based on OPC, FTP and data-push technology are applied to the different monitoring systems, according to their development platforms, to realize the integration of waste treatment process and device-status monitoring information with management information in a waste treatment center. (authors)
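
    Of the exchange mechanisms named above, FTP-based transfer is the simplest to sketch. The host name, credentials, and file names below are placeholders, and the snippet only illustrates pushing a monitoring-data export to a central server, not the authors' actual integration code.

      # Sketch: pushing a process-monitoring export to a central server over FTP.
      # Host, credentials and paths are hypothetical placeholders.
      from ftplib import FTP

      def push_monitoring_file(local_path: str, remote_name: str) -> None:
          with FTP("ftp.example.org") as ftp:                  # placeholder host
              ftp.login(user="monitor", passwd="secret")       # placeholder credentials
              with open(local_path, "rb") as fh:
                  ftp.storbinary(f"STOR {remote_name}", fh)    # upload the export file

      # push_monitoring_file("device_status_2013-06-01.csv", "device_status.csv")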

  19. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    Directory of Open Access Journals (Sweden)

    Mohsen Shafiei Nikabadi

    2012-03-01

    Full Text Available The major purpose of this article was to examine how information security management affects supply chain integration, and the effect of implementing an "information security management system" on enhancing supply chain integration. In this respect, the research sought a combined overview of these two approaches (information security management and the integration of organizational processes through an enterprise resource planning system) and then determined the factors of these two issues by factor analysis. The researchers gathered a series of opinions from automotive experts (production planning, management and supply chain experts) and from car makers and their first- and second-tier suppliers. In this way, the impact of information security management processes on enterprise supply chain integration was examined with the help of statistical correlation analysis. The results indicated an effect of the various dimensions of an "information security management system", namely coordination of information, prevention of human and hardware errors, accuracy of information, and education of users, on the two dimensions of internal and external integration of business processes in the supply chain, and showed that it can increase the integration of business processes in the supply chain. Owing to these results, deployment of an "information security management system" increases the integration of organizational processes in the supply chain. This can be demonstrated by considering the relation of organizational process integration to the level of information coordination, error prevention and information accuracy throughout the supply chain.

  20. ETANA-DL: Managing Complex Information Applications - an Archaeology Digital Library

    OpenAIRE

    Ravindranathan, Unni; Shen, Rao; Goncalves, Marcos A.; Fan, Weiguo; Fox, Edward A.; Flanagan, James

    2004-01-01

    Archaeological research results in the generation of large quantities of heterogeneous information managed by different projects using custom information systems. We will demonstrate a prototype Digital Library (DL) for integrating and managing archaeological data and providing services useful to various user communities. ETANA-DL is a model-based, componentized, extensible, archaeological DL that manages complex information sources using the client-server paradigm of the Open Archives Initia...

  1. A semantic data dictionary method for database schema integration in CIESIN

    Science.gov (United States)

    Hinds, N.; Huang, Y.; Ravishankar, C.

    1993-08-01

    CIESIN (Consortium for International Earth Science Information Network) is funded by NASA to investigate the technology necessary to integrate and facilitate the interdisciplinary use of Global Change information. A clear part of this mission includes providing a link between the various global change data sets, in particular the physical sciences and the human (social) sciences. The typical scientist using the CIESIN system will want to know how phenomena in an outside field affect his/her work. For example, a medical researcher might ask: how does air quality affect emphysema? This and many similar questions will require sophisticated semantic data integration. The researcher who raised the question may be familiar with medical data sets containing emphysema occurrences. But this same investigator may know little, if anything, about the existence or location of air-quality data. It is easy to envision a system which would allow that investigator to locate and perform a "join" on two data sets, one containing emphysema cases and the other containing air-quality levels. No such system exists today. One major obstacle to providing such a system will be overcoming the heterogeneity, which falls into two broad categories. "Database system" heterogeneity involves differences in data models and packages. "Data semantic" heterogeneity involves differences in terminology between disciplines, which translates into data semantic issues, and varying levels of data refinement, from raw to summary. Our work investigates a global data dictionary mechanism to facilitate a merged data service. Specifically, we propose using a semantic tree during schema definition to aid in locating and integrating heterogeneous databases.
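
    A toy sketch of the semantic-tree idea described above: a global dictionary tree whose nodes carry discipline concepts and point to the data sets that use them, so a term from one discipline can be resolved to related data sets in another. The tree contents and data set names are invented for illustration only.

      # Toy sketch of a semantic data dictionary tree used to locate related data sets
      # across disciplines. Concepts and data set names are illustrative only.
      SEMANTIC_TREE = {
          "environment": {
              "datasets": [],
              "children": {
                  "air quality": {"datasets": ["epa_air_quality"], "children": {}},
              },
          },
          "health": {
              "datasets": [],
              "children": {
                  "emphysema": {"datasets": ["emphysema_cases"], "children": {}},
              },
          },
      }

      def find_datasets(tree, term, path=()):
          """Depth-first search of the concept tree; return (concept path, data sets) for matching concepts."""
          hits = []
          for concept, node in tree.items():
              here = path + (concept,)
              if term in concept:
                  hits.append((here, node["datasets"]))
              hits += find_datasets(node["children"], term, here)
          return hits

      print(find_datasets(SEMANTIC_TREE, "emphysema"))
      print(find_datasets(SEMANTIC_TREE, "air"))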

  2. The use of agents and objects to integrate virtual enterprises

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-01-01

    The manufacturing complex for the Department of Energy (DOE) is distributed: design laboratories, manufacturing facilities, and industrial partners. Designers must have a concurrent engineering environment to support all aspects of the cradle-to-grave product realization process across the distributed sites. Engineers must be able to analyze and simulate processes, retrieve and process heterogeneous information, both archived and current, and access multiple databases. Manufacturers must be able to coordinate activities of various manufacturing centers, which may involve a negotiation process. Furthermore, Sandia must be able to export manufacturing capabilities, such as on-machine acceptance, to outside suppliers. A key element to making this a reality is a flexible information architecture. The DOE information architecture must support a wide-area virtual enterprise, with distributed intelligent software components. The architecture must provide for asynchronous communication; multiple programming languages and operating systems; incorporation of geographically distributed manufacturing services; various hardware platforms; and heterogeneous workstations, PCs, machine tool controllers, and special-purpose compute engines. Further, it is critical that manufacturing facilities are not isolated from design, planning, and other business activities and that information flows easily and bidirectionally between these activities. To accomplish this seamlessly, heterogeneous knowledge must be exchanged across both domain and organizational boundaries. Distributed object and software agent technologies are two methods for connecting such engineering and manufacturing systems. The two technologies have overlapping goals - interoperability and architectural support for integrating software components - though to date little or no integration of the two technologies has been made.

  3. Information Security and Integrity Systems

    Science.gov (United States)

    1990-01-01

    Viewgraphs from the Information Security and Integrity Systems seminar held at the University of Houston-Clear Lake on May 15-16, 1990 are presented. A tutorial on computer security is presented. The goals of this tutorial are the following: to review security requirements imposed by government and by common sense; to examine risk analysis methods to help keep sight of forest while in trees; to discuss the current hot topic of viruses (which will stay hot); to examine network security, now and in the next year to 30 years; to give a brief overview of encryption; to review protection methods in operating systems; to review database security problems; to review the Trusted Computer System Evaluation Criteria (Orange Book); to comment on formal verification methods; to consider new approaches (like intrusion detection and biometrics); to review the old, low tech, and still good solutions; and to give pointers to the literature and to where to get help. Other topics covered include security in software applications and development; risk management; trust: formal methods and associated techniques; secure distributed operating system and verification; trusted Ada; a conceptual model for supporting a B3+ dynamic multilevel security and integrity in the Ada runtime environment; and information intelligence sciences.

  4. Pervasive Sensing: Addressing the Heterogeneity Problem

    International Nuclear Information System (INIS)

    O'Grady, Michael J; Murdoch, Olga; Kroon, Barnard; Lillis, David; Carr, Dominic; Collier, Rem W; O'Hare, Gregory M P

    2013-01-01

    Pervasive sensing is characterized by heterogeneity across a number of dimensions. This raises significant problems for those designing, implementing and deploying sensor networks, irrespective of application domain. Such problems include, for example, issues of data provenance and integrity, security, and privacy, amongst others. Thus engineering a network that is fit for purpose represents a significant challenge. In this paper, the issue of heterogeneity is explored from the perspective of those who seek to harness a pervasive sensing element in their applications. An initial solution based on a middleware construct is proposed.

  5. An Extended VIKOR-Based Approach for Pumped Hydro Energy Storage Plant Site Selection with Heterogeneous Information

    Directory of Open Access Journals (Sweden)

    Yunna Wu

    2017-08-01

    Full Text Available The selection of a desirable site for constructing a pumped hydro energy storage plant (PHESP) plays a vitally important role over the whole life cycle. However, little research has been done on PHESP site selection, which hinders the rapid development of PHESP. Therefore, this paper aims to select the most suitable PHESP site from numerous candidate alternatives using the multi-criteria decision-making (MCDM) technique. Firstly, a comprehensive evaluation criteria system is established for the first time. Then, considering that quantitative and qualitative criteria coexist in this system, multiple types of representations, including crisp numerical values (CNVs), triangular intuitionistic fuzzy numbers (TIFNs), and 2-dimension uncertain linguistic variables (2DULVs), are employed to handle the heterogeneous criteria information. To determine the criteria weights and fully take the preferences of the decision makers (DMs) into account, the analytic hierarchy process (AHP) method is applied for criteria weighting. After that, an extended Vlsekriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method is utilized to provide compromise solutions for the PHESP site under such heterogeneous information. At last, the proposed model is applied in a case study of Zhejiang province, China to illustrate its practicality and efficiency. The result shows that Changlongshan should be selected as the optimal PHESP site.
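
    For reference, the ranking step of the classical, crisp-valued VIKOR method computes a group-utility measure, an individual-regret measure, and a compromise index for each alternative; the extended method in this record generalizes these to the heterogeneous value types listed above. A standard formulation (notation chosen here for illustration) is:

      \[
      S_i = \sum_{j=1}^{n} w_j\,\frac{f_j^{*}-f_{ij}}{f_j^{*}-f_j^{-}}, \qquad
      R_i = \max_{j}\left[\, w_j\,\frac{f_j^{*}-f_{ij}}{f_j^{*}-f_j^{-}} \,\right],
      \]
      \[
      Q_i = v\,\frac{S_i-S^{*}}{S^{-}-S^{*}} + (1-v)\,\frac{R_i-R^{*}}{R^{-}-R^{*}},
      \]

    where f_ij is the value of alternative i on criterion j, f_j* and f_j- the best and worst values of criterion j, w_j the (here AHP-derived) criterion weight, S*, S-, R*, R- the extreme values of S_i and R_i over the alternatives, and v the weight of the "majority of criteria" strategy; alternatives are ranked by Q, S and R to identify the compromise solution.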

  6. Integrated Reporting and Assurance of Sustainability Information: An Experimental Study on Professional Investors’ Information Processing

    NARCIS (Netherlands)

    Reimsbach, D.; Hahn, R.; Gürtürk, A.

    2018-01-01

    Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information

  7. Modeling and interoperability of heterogeneous genomic big data for integrative processing and querying.

    Science.gov (United States)

    Masseroli, Marco; Kaitoua, Abdulrahman; Pinoli, Pietro; Ceri, Stefano

    2016-12-01

    While a huge amount of (epi)genomic data of multiple types is becoming available by using Next Generation Sequencing (NGS) technologies, the most important emerging problem is the so-called tertiary analysis, concerned with sense making, e.g., discovering how different (epi)genomic regions and their products interact and cooperate with each other. We propose a paradigm shift in tertiary analysis, based on the use of the Genomic Data Model (GDM), a simple data model which links genomic feature data to their associated experimental, biological and clinical metadata. GDM encompasses all the data formats which have been produced for feature extraction from (epi)genomic datasets. We specifically describe the mapping to GDM of SAM (Sequence Alignment/Map), VCF (Variant Call Format), NARROWPEAK (for called peaks produced by NGS ChIP-seq or DNase-seq methods), and BED (Browser Extensible Data) formats, but GDM supports as well all the formats describing experimental datasets (e.g., including copy number variations, DNA somatic mutations, or gene expressions) and annotations (e.g., regarding transcription start sites, genes, enhancers or CpG islands). We downloaded and integrated samples of all the above-mentioned data types and formats from multiple sources. The GDM is able to homogeneously describe semantically heterogeneous data and lays the groundwork for data interoperability, e.g., as achieved through the GenoMetric Query Language (GMQL), a high-level, declarative query language for genomic big data. The combined use of the data model and the query language allows comprehensive processing of multiple heterogeneous data, and supports the development of domain-specific data-driven computations and bio-molecular knowledge discovery. Copyright © 2016 Elsevier Inc. All rights reserved.
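
    A minimal sketch of the idea described above: each sample pairs (i) region-based genomic feature data with (ii) metadata attributes, so heterogeneous formats such as BED can be mapped onto a common region schema. The field names and the simplified BED parsing below are illustrative assumptions, not the published GDM schema.

      # Sketch of the "genomic feature data + metadata" pairing behind a GDM-like model.
      # Field names and the BED parsing are simplified assumptions for illustration.
      from dataclasses import dataclass, field
      from typing import Dict, List

      @dataclass
      class Region:
          chrom: str
          start: int
          stop: int
          strand: str = "*"
          attributes: Dict[str, str] = field(default_factory=dict)   # format-specific extras

      @dataclass
      class Sample:
          regions: List[Region]
          metadata: Dict[str, str]          # experimental / biological / clinical metadata

      def regions_from_bed(lines: List[str]) -> List[Region]:
          """Map minimal BED records (chrom, start, end[, name, score, strand]) to the common region schema."""
          out = []
          for line in lines:
              f = line.rstrip().split("\t")
              extras = {"name": f[3]} if len(f) > 3 else {}
              strand = f[5] if len(f) > 5 else "*"
              out.append(Region(f[0], int(f[1]), int(f[2]), strand, extras))
          return out

      sample = Sample(
          regions=regions_from_bed(["chr1\t100\t500\tpeak1\t.\t+", "chr2\t40\t90"]),
          metadata={"assay": "ChIP-seq", "cell_line": "K562"},     # hypothetical metadata
      )
      print(len(sample.regions), sample.metadata["assay"])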

  8. Proof-of-Concept of a Millimeter-Wave Integrated Heterogeneous Network for 5G Cellular

    Directory of Open Access Journals (Sweden)

    Shozo Okasaka

    2016-08-01

    Full Text Available The fifth-generation mobile networks (5G) will not only enhance mobile broadband services, but also enable connectivity for a massive number of Internet-of-Things devices, such as wireless sensors, meters or actuators. Thus, 5G is expected to achieve a 1000-fold or more increase in capacity over 4G. The use of the millimeter-wave (mmWave) spectrum is a key enabler to allowing 5G to achieve such enhancement in capacity. To fully utilize the mmWave spectrum, 5G is expected to adopt a heterogeneous network (HetNet) architecture, wherein mmWave small cells are overlaid onto a conventional macro-cellular network. In the mmWave-integrated HetNet, splitting of the control plane (CP) and user plane (UP) will allow continuous connectivity and increase the capacity of the mmWave small cells. mmWave communication can be used not only for access linking, but also for wireless backhaul linking, which will facilitate the installation of mmWave small cells. In this study, a proof-of-concept (PoC) was conducted to demonstrate the practicality of a prototype mmWave-integrated HetNet, using mmWave technologies for both backhaul and access.

  9. Understanding Information Systems Integration Deficiencies in Mergers and Acquisitions

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Kettinger, William J.

    2017-01-01

    Information systems (IS) integration is a critical challenge for value-creating mergers and acquisitions. Appropriate design and implementation of IS integration is typically a precondition for enabling a majority of the anticipated business benefits of a combined organization. Often...

  10. Comparison of information-theoretic to statistical methods for gene-gene interactions in the presence of genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Sucheston Lara

    2010-09-01

    Full Text Available Abstract Background Multifactorial diseases such as cancer and cardiovascular diseases are caused by the complex interplay between genes and environment. The detection of these interactions remains challenging due to computational limitations. Information theoretic approaches use computationally efficient directed search strategies and thus provide a feasible solution to this problem. However, the power of information theoretic methods for interaction analysis has not been systematically evaluated. In this work, we compare power and Type I error of an information-theoretic approach to existing interaction analysis methods. Methods The k-way interaction information (KWII) metric for identifying variable combinations involved in gene-gene interactions (GGI) was assessed using several simulated data sets under models of genetic heterogeneity driven by susceptibility-increasing loci with varying allele frequency, penetrance values and heritability. The power and proportion of false positives of the KWII were compared to multifactor dimensionality reduction (MDR), the restricted partitioning method (RPM) and logistic regression. Results The power of the KWII was considerably greater than that of MDR on all six simulation models examined. For a given disease prevalence at high values of heritability, the power of both RPM and KWII was greater than 95%. For models with low heritability and/or genetic heterogeneity, the power of the KWII was consistently greater than that of RPM; the improvements in power of the KWII over RPM ranged from 4.7% to 14.2% at α = 0.001 in the three models with the lowest heritability values examined. The KWII performed similarly to logistic regression. Conclusions Information theoretic models are flexible and have excellent power to detect GGI under a variety of conditions that characterize complex diseases.
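
    For context, the k-way interaction information used in such studies is commonly written as an alternating sum of joint entropies over all subsets of the k variables (with the phenotype included as one of the variables); in a generic notation:

      \[
      \mathrm{KWII}(X_1,\dots,X_{k-1},Y) \;=\; -\sum_{T \subseteq S} (-1)^{|S|-|T|}\, H(T),
      \qquad S = \{X_1,\dots,X_{k-1},Y\},
      \]

    where H(T) is the joint entropy of the variable subset T and Y is the phenotype; for k = 2 the expression reduces to the mutual information H(X_1) + H(Y) - H(X_1, Y), and positive higher-order values flag variable combinations that carry information about the phenotype beyond their lower-order subsets.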

  11. Integrated Information System for Higher Education Qualifications

    Directory of Open Access Journals (Sweden)

    Catalin Ionut SILVESTRU

    2012-10-01

    Full Text Available In the present article we aim to study thoroughly and in detail aspects related to architectures specific to e-learning and to the management of human resources training interconnected with the management of qualifications. In addition, we consider combining e-learning architectures with software in an e-learning system interconnected with the National Registry of Qualifications of Higher Education, with a view to developing an information system that correlates the educational supply of higher education in Romania with labor market demands through qualifications. The scientific endeavor consists of original architectural solutions to integrate data, systems, processes and services from various sources and to use them in the proposed system. The practical result of the scientific endeavor is represented by the design of the architectures required for developing an e-learning system interconnected with the National Registry of Qualifications of Romania, which in the first stage involves the qualifications provided by higher education. The proposed innovative solution consists in the fact that the proposed information system combines the advantages of a content management system (CMS) with a learning content management system (LCMS) and with reusable learning objects (RLO). Thus, the architecture proposed in the research ensures the integration of a content management system with a portal for information, guidance and support in building a professional project. The integration enables correlation of competences with content areas and specific items from various teaching subjects, thus evaluating the usefulness of this registry from a learning/educational perspective. Using the proposed information system enables correlation among qualifications, the content of educational programs and continuous self-evaluation opportunities, which facilitates monitoring of progress and adjustment of learning content.

  12. Visualization and Integrated Data Mining of Disparate Information

    Energy Technology Data Exchange (ETDEWEB)

    Saffer, Jeffrey D.(OMNIVIZ, INC); Albright, Cory L.(BATTELLE (PACIFIC NW LAB)); Calapristi, Augustin J.(BATTELLE (PACIFIC NW LAB)); Chen, Guang (OMNIVIZ, INC); Crow, Vernon L.(BATTELLE (PACIFIC NW LAB)); Decker, Scott D.(BATTELLE (PACIFIC NW LAB)); Groch, Kevin M.(BATTELLE (PACIFIC NW LAB)); Havre, Susan L.(BATTELLE (PACIFIC NW LAB)); Malard, Joel (BATTELLE (PACIFIC NW LAB)); Martin, Tonya J.(BATTELLE (PACIFIC NW LAB)); Miller, Nancy E.(BATTELLE (PACIFIC NW LAB)); Monroe, Philip J.(OMNIVIZ, INC); Nowell, Lucy T.(BATTELLE (PACIFIC NW LAB)); Payne, Deborah A.(BATTELLE (PACIFIC NW LAB)); Reyes Spindola, Jorge F.(BATTELLE (PACIFIC NW LAB)); Scarberry, Randall E.(OMNIVIZ, INC); Sofia, Heidi J.(BATTELLE (PACIFIC NW LAB)); Stillwell, Lisa C.(OMNIVIZ, INC); Thomas, Gregory S.(BATTELLE (PACIFIC NW LAB)); Thurston, Sarah J.(OMNIVIZ, INC); Williams, Leigh K.(BATTELLE (PACIFIC NW LAB)); Zabriskie, Sean J.(OMNIVIZ, INC); MG Hicks

    2001-05-11

    The volumes and diversity of information in the discovery, development, and business processes within the chemical and life sciences industries require new approaches for analysis. Traditional list- or spreadsheet-based methods are easily overwhelmed by large amounts of data. Furthermore, generating strong hypotheses and, just as importantly, ruling out weak ones, requires integration across different experimental and informational sources. We have developed a framework for this integration, including common conceptual data models for multiple data types and linked visualizations that provide an overview of the entire data set, a measure of how each data record is related to every other record, and an assessment of the associations within the data set.

  13. Testing can counteract proactive interference by integrating competing information

    Science.gov (United States)

    Wahlheim, Christopher N.

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A–B, A–D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A–B, A–B), control pairs appeared in List 2 only (A–B, C–D), and changed pairs appeared with the same cue in both lists but with different responses (A–B, A–D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information. PMID:25120241

  14. Testing can counteract proactive interference by integrating competing information.

    Science.gov (United States)

    Wahlheim, Christopher N

    2015-01-01

    Testing initially learned information before presenting new information has been shown to counteract the deleterious effects of proactive interference by segregating competing sources of information. The present experiments were conducted to demonstrate that testing can also have its effects in part by integrating competing information. Variations of classic A-B, A-D paired-associate learning paradigms were employed that included two lists of word pairs and a cued-recall test. Repeated pairs appeared in both lists (A-B, A-B), control pairs appeared in List 2 only (A-B, C-D), and changed pairs appeared with the same cue in both lists but with different responses (A-B, A-D). The critical manipulation was whether pairs were tested or restudied in an interpolated phase that occurred between Lists 1 and 2. On a final cued-recall test, participants recalled List 2 responses and then indicated when they recollected that responses had earlier changed between lists. The change recollection measure indexed the extent to which competing responses were integrated during List 2. Change was recollected more often for tested than for restudied pairs. Proactive facilitation was obtained in cued recall when change was recollected, whereas proactive interference was obtained when change was not recollected. These results provide evidence that testing counteracted proactive interference in part by making List 1 responses more accessible during List 2, thus promoting integration and increasing later recollection of change. These results have theoretical implications because they show that testing can counteract proactive interference by integrating or segregating competing information.

  15. The OXL format for the exchange of integrated datasets

    Directory of Open Access Journals (Sweden)

    Taubert Jan

    2007-12-01

    Full Text Available A prerequisite for systems biology is the integration and analysis of heterogeneous experimental data stored in hundreds of life-science databases and millions of scientific publications. Several standardised formats for the exchange of specific kinds of biological information exist. Such exchange languages facilitate the integration process; however they are not designed to transport integrated datasets. A format for exchanging integrated datasets needs to (i) cover data from a broad range of application domains, (ii) be flexible and extensible to combine many different complex data structures, (iii) include metadata and semantic definitions, (iv) include inferred information, (v) identify the original data source for integrated entities and (vi) transport large integrated datasets. Unfortunately, none of the exchange formats from the biological domain (e.g. BioPAX, MAGE-ML, PSI-MI, SBML) or the generic approaches (RDF, OWL) fulfil these requirements in a systematic way.

  16. Integrating information systems : linking global business goals to local database applications

    NARCIS (Netherlands)

    Dignum, F.P.M.; Houben, G.J.P.M.

    1999-01-01

    This paper describes a new approach to design modern information systems that offer an integrated access to the data and knowledge that is available in local applications. By integrating the local data management activities into one transparent information distribution process, modern organizations

  17. Digital Microdroplet Ejection Technology-Based Heterogeneous Objects Prototyping

    OpenAIRE

    Li, Na; Yang, Jiquan; Feng, Chunmei; Yang, Jianfei; Zhu, Liya; Guo, Aiqing

    2016-01-01

    An integrated fabrication framework is presented to build heterogeneous objects (HEO) using digital microdroplet jetting technology and rapid prototyping. A design and manufacturing method for heterogeneous-material parts, covering both structure and material, was used to change the traditional process. The net-node method was used for digital modeling and can configure multiple materials in time. The relationship between material, color, and jetting nozzle was established. The main important contributions are to combi...

  18. Monte Carlo-narrow resonance calculational techniques for treating double-heterogeneity effects

    International Nuclear Information System (INIS)

    Gelbard, E.M.; Chen, I.J.

    1986-01-01

    Reliable methods already exist for computing resonance integrals (RIs) in regular lattices, but lattice structures always contain irregularities, and the effects of such irregularities have been called "double-heterogeneity" effects. Two methods for computing double-heterogeneity effects on RIs are reviewed and evaluated. 2 refs., 1 tab
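
    For context, the dilute-limit resonance integral that such methods evaluate is conventionally defined over a 1/E slowing-down flux (a standard definition, not specific to this record):

      \[
      I \;=\; \int_{E_{\min}}^{E_{\max}} \sigma_a(E)\,\frac{\mathrm{d}E}{E},
      \]

    where sigma_a(E) is the absorption cross-section over the resonance energy range. In an actual lattice the 1/E weight is replaced by the self-shielded flux seen by the resonance absorber, and double-heterogeneity corrections account for how lattice irregularities further perturb that flux.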

  19. Information-integration category learning and the human uncertainty response.

    Science.gov (United States)

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  20. Multicriterion problem of allocation of resources in the heterogeneous distributed information processing systems

    Science.gov (United States)

    Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.

    2018-05-01

    This study reviews the problem of resource allocation in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. Algorithms for solving this problem involve searching for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, the procedure of checking a system of linear algebraic inequalities for consistency can be sped up significantly, owing to their reducibility to stream models or to the application of other solution schemes (for strongly connected structures) that take into account the specifics of the hierarchies under consideration.
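
    For illustration, a two-index instance of a multicriterion problem with transport-type linear constraints can be written as follows (the paper's multi-index, hierarchical formulation generalizes this; the symbols here are generic, not the authors' notation):

      \[
      \min_{x}\;\Bigl(\textstyle\sum_{i,j} c^{(1)}_{ij}x_{ij},\;\dots,\;\sum_{i,j} c^{(m)}_{ij}x_{ij}\Bigr)
      \quad\text{s.t.}\quad
      \sum_{j} x_{ij}=a_i,\;\;\sum_{i} x_{ij}=b_j,\;\;x_{ij}\ge 0,
      \]

    where x_ij is the amount of work of type i allocated to processing node j, a_i and b_j are the demands and capacities, c^(k)_ij the cost coefficients of criterion k, and the vector objective is understood in the Pareto sense.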

  1. Spatial Data Integration Using Ontology-Based Approach

    Science.gov (United States)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the need for spatial data in various organizations is becoming so crucial that many of these organizations have begun to produce spatial data themselves. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To address this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps. The first step is identification of the objects in a relational database; the semantic relationships between them are then modelled and, subsequently, the ontology of each database is created. In the second step, the corresponding ontology is inserted into the database and the relationships of each ontology class are inserted into newly created columns in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data. This is done by using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy while the data remain unchanged, thus taking advantage of the legacy applications provided.

  2. SPATIAL DATA INTEGRATION USING ONTOLOGY-BASED APPROACH

    Directory of Open Access Journals (Sweden)

    S. Hasani

    2015-12-01

    Full Text Available In today's world, the need for spatial data in various organizations is becoming so crucial that many of these organizations have begun to produce spatial data themselves. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To address this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps. The first step is identification of the objects in a relational database; the semantic relationships between them are then modelled and, subsequently, the ontology of each database is created. In the second step, the corresponding ontology is inserted into the database and the relationships of each ontology class are inserted into newly created columns in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data. This is done by using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy while the data remain unchanged, thus taking advantage of the legacy applications provided.
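
    A minimal sketch of the ontology-mapping step described above, using rdflib: two per-database ontologies are declared and their corresponding classes linked with owl:equivalentClass, so that queries against one vocabulary can be rewritten against the other. The namespace URIs and class names are invented placeholders, not the ontologies of the paper.

      # Sketch: linking corresponding classes of two database ontologies with owl:equivalentClass.
      # Namespaces and class names are hypothetical placeholders.
      from rdflib import Graph, Namespace
      from rdflib.namespace import OWL, RDF

      DB_A = Namespace("http://example.org/ontology/dbA#")
      DB_B = Namespace("http://example.org/ontology/dbB#")

      g = Graph()
      g.bind("dbA", DB_A)
      g.bind("dbB", DB_B)

      # Each database exposes its own class for the same real-world concept.
      g.add((DB_A.Road, RDF.type, OWL.Class))
      g.add((DB_B.Street, RDF.type, OWL.Class))

      # The mapping: the two classes denote the same concept.
      g.add((DB_A.Road, OWL.equivalentClass, DB_B.Street))

      print(g.serialize(format="turtle"))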

  3. Heterogeneous inflation expectations, learning, and market outcomes

    OpenAIRE

    Madeira, Carlos; Zafar, Basit

    2012-01-01

    Using the panel component of the Michigan Survey of Consumers, we show that individuals, in particular women and ethnic minorities, are highly heterogeneous in their expectations of inflation. We estimate a model of inflation expectations based on learning from experience that also allows for heterogeneity in both private information and updating. Our model vastly outperforms existing models of inflation expectations in explaining the heterogeneity in the data. We find that women, ethnic mino...

  4. QUANTITATIVE CHARACTERISTICS OF COMPLEMENTARY INTEGRATED HEALTH CARE SYSTEM AND INTEGRATED MEDICATION MANAGEMENT INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    L. Yu. Babintseva

    2015-05-01

    important elements of state regulation of the pharmaceutical sector of health care. For the first time, the creation of two information systems, an integrated medication management information system and an integrated health care system within an integrated medical information area, operating on the principle of complementarity, was justified. Global and technological coefficients of these systems' functioning were introduced.

  5. Characterizing heterogeneous cellular responses to perturbations.

    Science.gov (United States)

    Slack, Michael D; Martinez, Elisabeth D; Wu, Lani F; Altschuler, Steven J

    2008-12-09

    Cellular populations have been widely observed to respond heterogeneously to perturbation. However, interpreting the observed heterogeneity is an extremely challenging problem because of the complexity of possible cellular phenotypes, the large dimension of potential perturbations, and the lack of methods for separating meaningful biological information from noise. Here, we develop an image-based approach to characterize cellular phenotypes based on patterns of signaling marker colocalization. Heterogeneous cellular populations are characterized as mixtures of phenotypically distinct subpopulations, and responses to perturbations are summarized succinctly as probabilistic redistributions of these mixtures. We apply our method to characterize the heterogeneous responses of cancer cells to a panel of drugs. We find that cells treated with drugs of (dis-)similar mechanism exhibit (dis-)similar patterns of heterogeneity. Despite the observed phenotypic diversity of cells observed within our data, low-complexity models of heterogeneity were sufficient to distinguish most classes of drug mechanism. Our approach offers a computational framework for assessing the complexity of cellular heterogeneity, investigating the degree to which perturbations induce redistributions of a limited, but nontrivial, repertoire of underlying states and revealing functional significance contained within distinct patterns of heterogeneous responses.
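
    The "mixtures of phenotypically distinct subpopulations" idea above can be sketched with a standard Gaussian mixture: each cell is a feature vector, the mixture defines a reference repertoire of subpopulations, and a perturbation is summarized by the redistribution of cells across mixture components. The data and component count below are synthetic assumptions, not the paper's image-derived features.

      # Sketch: summarizing a perturbation as a redistribution over phenotype subpopulations.
      # Synthetic 2-D "cell features"; the real study uses image-derived colocalization features.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      control = np.vstack([rng.normal([0, 0], 0.5, (200, 2)),
                           rng.normal([3, 3], 0.5, (100, 2))])
      treated = np.vstack([rng.normal([0, 0], 0.5, (80, 2)),
                           rng.normal([3, 3], 0.5, (220, 2))])

      gmm = GaussianMixture(n_components=2, random_state=0).fit(control)   # reference subpopulations

      def subpopulation_profile(cells):
          """Fraction of cells assigned to each reference subpopulation."""
          labels = gmm.predict(cells)
          return np.bincount(labels, minlength=gmm.n_components) / len(cells)

      print("control profile:", subpopulation_profile(control))
      print("treated profile:", subpopulation_profile(treated))   # the perturbation shifts the mixture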

  6. Survival differences of CIMP subtypes integrated with CNA information in human breast cancer.

    Science.gov (United States)

    Wang, Huihan; Yan, Weili; Zhang, Shumei; Gu, Yue; Wang, Yihan; Wei, Yanjun; Liu, Hongbo; Wang, Fang; Wu, Qiong; Zhang, Yan

    2017-07-25

    CpG island methylator phenotype (CIMP) of breast cancer is associated with widespread aberrant methylation at specific CpG islands and distinct patient outcomes. However, how copy number contributes to the prognosis of tumors with different CpG island methylator phenotypes is still unclear. We analyzed both genetic (copy number) and epigenetic alterations in 765 breast cancers from The Cancer Genome Atlas data portal and obtained a panel of 15 biomarkers for evaluating copy number and methylation status. The gene panel identified two groups corresponding to distinct copy number profiles. For patients whose biomarker panel showed only copy number loss, risk was greater when they also presented a higher CpG island methylation pattern. In contrast, for samples showing only copy number gain, a higher CpG island methylation level was associated with improved survival. Overall, integrating copy number alteration and methylation information enhanced the classification power on prognosis. Moreover, we found that the molecular subtypes of breast cancer were distributed differently across the two CpG island methylation phenotypes. Generated from the same set of human methylation 450K data, the additional copy number information could provide insights into survival prediction for cancers with less heterogeneity and might help to determine biomarkers for diagnosis and treatment of breast cancer patients in a more personalized approach.

  7. A heterogeneous graph-based recommendation simulator

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Yeonchan [Seoul National University]; Park, Sungchan [Seoul National University]; Lee, Matt Sangkeun [ORNL]; Lee, Sang-goo [Seoul National University]

    2013-01-01

    Heterogeneous graph-based recommendation frameworks have flexibility in that they can incorporate various recommendation algorithms and various kinds of information to produce better results. In this demonstration, we present a heterogeneous graph-based recommendation simulator which enables participants to experience the flexibility of a heterogeneous graph-based recommendation method. With our system, participants can simulate various recommendation semantics by expressing the semantics via meaningful paths like User-Movie-User-Movie. The simulator then returns the recommendation results on the fly based on the user-customized semantics using a fast Monte Carlo algorithm.
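
    A toy sketch of the underlying idea (hypothetical data, not the demonstration system itself): recommendations under the User-Movie-User-Movie semantics can be approximated by Monte Carlo random walks that follow that path and count where they land.

```python
# Sketch of meta-path based recommendation (hypothetical data, not the demo
# system): score movies for a user by running Monte Carlo random walks that
# follow the path User-Movie-User-Movie.
import random
from collections import Counter

user_movies = {           # user -> movies rated/watched
    "alice": ["matrix", "inception"],
    "bob": ["matrix", "memento"],
    "carol": ["inception", "memento", "arrival"],
}
movie_users = {}           # reverse edges of the bipartite graph
for u, movies in user_movies.items():
    for m in movies:
        movie_users.setdefault(m, []).append(u)

def recommend(user, walks=10000, seed=0):
    random.seed(seed)
    counts = Counter()
    for _ in range(walks):
        m1 = random.choice(user_movies[user])        # User -> Movie
        u2 = random.choice(movie_users[m1])          # Movie -> User
        m2 = random.choice(user_movies[u2])          # User -> Movie
        if m2 not in user_movies[user]:
            counts[m2] += 1                          # candidate recommendation
    return counts.most_common()

print(recommend("alice"))   # movies reached often via similar users rank higher
```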

  8. Construction of an ortholog database using the semantic web technology for integrative analysis of genomic data.

    Science.gov (United States)

    Chiba, Hirokazu; Nishide, Hiroyo; Uchiyama, Ikuo

    2015-01-01

    Recently, various types of biological data, including genomic sequences, have been rapidly accumulating. To discover biological knowledge from such growing heterogeneous data, a flexible framework for data integration is necessary. Ortholog information is a central resource for interlinking corresponding genes among different organisms, and the Semantic Web provides a key technology for the flexible integration of heterogeneous data. We have constructed an ortholog database using the Semantic Web technology, aiming at the integration of numerous genomic data and various types of biological information. To formalize the structure of the ortholog information in the Semantic Web, we have constructed the Ortholog Ontology (OrthO). While the OrthO is a compact ontology for general use, it is designed to be extended to the description of database-specific concepts. On the basis of OrthO, we described the ortholog information from our Microbial Genome Database for Comparative Analysis (MBGD) in the form of Resource Description Framework (RDF) and made it available through the SPARQL endpoint, which accepts arbitrary queries specified by users. In this framework based on the OrthO, the biological data of different organisms can be integrated using the ortholog information as a hub. Besides, the ortholog information from different data sources can be compared with each other using the OrthO as a shared ontology. Here we show some examples demonstrating that the ortholog information described in RDF can be used to link various biological data such as taxonomy information and Gene Ontology. Thus, the ortholog database using the Semantic Web technology can contribute to biological knowledge discovery through integrative data analysis.
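
    The sketch below illustrates the general pattern in miniature; the class and property names are hypothetical stand-ins rather than the actual OrthO vocabulary, and the MBGD data are replaced by two toy genes. Ortholog groups expressed as RDF act as a hub, so a SPARQL query can carry an annotation from a gene in one organism to its ortholog in another.

```python
# Hedged sketch: the property and class names below are hypothetical stand-ins,
# not the actual OrthO vocabulary; the point is that ortholog groups expressed
# as RDF can be joined with other annotations through a SPARQL query.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/ortho#")
g = Graph()
g.bind("ex", EX)

# One ortholog group linking genes from two organisms, plus a GO-style annotation.
g.add((EX.group1, RDF.type, EX.OrthologGroup))
g.add((EX.group1, EX.member, EX.gene_ecoli_recA))
g.add((EX.group1, EX.member, EX.gene_bsub_recA))
g.add((EX.gene_ecoli_recA, EX.goAnnotation, Literal("DNA repair")))

# Use the ortholog group as a hub: propagate the annotation to the other member.
q = """
PREFIX ex: <http://example.org/ortho#>
SELECT ?other ?annot WHERE {
  ?grp a ex:OrthologGroup ;
       ex:member ?annotated, ?other .
  ?annotated ex:goAnnotation ?annot .
  FILTER (?annotated != ?other)
}
"""
for row in g.query(q):
    print(row.other, row.annot)
```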

  9. Designing Domain-Specific Heterogeneous Architectures from Dataflow Programs

    Directory of Open Access Journals (Sweden)

    Süleyman Savas

    2018-04-01

    Full Text Available The last ten years have seen performance and power requirements pushing computer architectures using only a single core towards so-called manycore systems with hundreds of cores on a single chip. To further increase performance and energy efficiency, we are now seeing the development of heterogeneous architectures with specialized and accelerated cores. However, designing these heterogeneous systems is a challenging task due to their inherent complexity. We proposed an approach for designing domain-specific heterogeneous architectures based on instruction augmentation through the integration of hardware accelerators into simple cores. These hardware accelerators were determined based on their common use among applications within a certain domain. The objective was to generate heterogeneous architectures by integrating many of these accelerated cores and connecting them with a network-on-chip. The proposed approach aimed to ease the design of heterogeneous manycore architectures, and consequently the exploration of the design space, by automating the design steps. To evaluate our approach, we enhanced our software tool chain with a tool that can generate accelerated cores from dataflow programs. This new tool chain was evaluated with the aid of two use cases: radar signal processing and mobile baseband processing. We could achieve an approximately 4x improvement in performance while executing complete applications on the augmented cores, with a small impact (2.5–13%) on area usage. The generated accelerators are competitive, achieving more than 90% of the performance of hand-written implementations.

  10. Heterogeneous Wireless Networks for Smart Grid Distribution Systems: Advantages and Limitations.

    Science.gov (United States)

    Khalifa, Tarek; Abdrabou, Atef; Shaban, Khaled; Gaouda, A M

    2018-05-11

    Supporting a conventional power grid with advanced communication capabilities is a cornerstone of transforming it into a smart grid. A reliable communication infrastructure with a high throughput can lay the foundation towards the ultimate objective of a fully automated power grid with self-healing capabilities. In order to realize this objective, the communication infrastructure of a power distribution network needs to be extended to cover all substations, including medium/low voltage ones. This shall enable information exchange among substations for a variety of system automation purposes with a low latency that suits time-critical applications. This paper proposes the integration of two heterogeneous wireless technologies (such as WiFi and cellular 3G/4G) to provide reliable and fast communication among primary and secondary distribution substations. This integration allows the transmission of different data packets (not packet replicas) over two radio interfaces, making these interfaces act like one data pipe. Thus, the paper investigates the applicability and effectiveness of employing heterogeneous wireless networks (HWNs) in achieving the desired reliability and timeliness requirements of future smart grids. We study the performance of HWNs in a realistic scenario under different data transfer loads and packet loss ratios. Our findings reveal that HWNs can be a viable data transfer option for smart grids.

  11. Heterogeneous Wireless Networks for Smart Grid Distribution Systems: Advantages and Limitations

    Directory of Open Access Journals (Sweden)

    Tarek Khalifa

    2018-05-01

    Full Text Available Supporting a conventional power grid with advanced communication capabilities is a cornerstone of transforming it into a smart grid. A reliable communication infrastructure with a high throughput can lay the foundation towards the ultimate objective of a fully automated power grid with self-healing capabilities. In order to realize this objective, the communication infrastructure of a power distribution network needs to be extended to cover all substations, including medium/low voltage ones. This shall enable information exchange among substations for a variety of system automation purposes with a low latency that suits time-critical applications. This paper proposes the integration of two heterogeneous wireless technologies (such as WiFi and cellular 3G/4G) to provide reliable and fast communication among primary and secondary distribution substations. This integration allows the transmission of different data packets (not packet replicas) over two radio interfaces, making these interfaces act like one data pipe. Thus, the paper investigates the applicability and effectiveness of employing heterogeneous wireless networks (HWNs) in achieving the desired reliability and timeliness requirements of future smart grids. We study the performance of HWNs in a realistic scenario under different data transfer loads and packet loss ratios. Our findings reveal that HWNs can be a viable data transfer option for smart grids.
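
    A toy sketch of the "one data pipe" idea (not the paper's actual protocol or scheduler): distinct packets, rather than replicas, are dispatched across two interfaces, here modelled simply as queues with different assumed service rates.

```python
# Toy sketch of the "one data pipe" idea (not the paper's protocol): distinct
# packets, rather than replicas, are dispatched over two heterogeneous
# interfaces, here modelled as queues with different assumed service rates.
from collections import deque

class Interface:
    def __init__(self, name, rate_pkts_per_slot):
        self.name, self.rate, self.queue = name, rate_pkts_per_slot, deque()
    def backlog(self):
        return len(self.queue) / self.rate      # rough time-to-drain estimate

wifi = Interface("wifi", rate_pkts_per_slot=5)
cellular = Interface("cellular", rate_pkts_per_slot=2)

def dispatch(packet_id):
    # Send each new packet on the interface expected to deliver it soonest.
    iface = min((wifi, cellular), key=Interface.backlog)
    iface.queue.append(packet_id)
    return iface.name

assignments = [dispatch(i) for i in range(20)]
print(assignments)   # packets interleave across both interfaces
```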

  12. THE IMPORTANCE OF THE IMPLEMENTATION OF INTEGRATED INFORMATION SYSTEMS IN THE RESTRUCTURING AND EUROPEAN INTEGRATION PROCESS OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Steliac Nela

    2010-12-01

    Full Text Available Many of the organizations in the public and private sectors in Romania have reached the stage in which their existing information systems can no longer meet the requirements of users. They are therefore compelled to adopt integrated information systems that can handle all kinds of data, allow access to them, and ensure the coherence and consistency of the stored information. Managers must be aware of the importance of implementing integrated information systems against the background of organizational restructuring, so that the organization can become consistent and competitive with its European Union counterparts and the integration process becomes a real and achievable one.

  13. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Information or data fusion of data from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves the integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical-biological sensors, acoustic sensors, optical warning systems, and many other sources. Many methodologies are used in the data integration process, ranging from classical and Bayesian methods to evidence-based expert systems. The implementation of data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.

  14. Integrated modeling and up-scaling of landfill processes and heterogeneity using stochastic approach

    NARCIS (Netherlands)

    Bun, A.; Heimovaara, T.J.; Baviskar, S.M.; van Turnhout, A.G.; Konstantaki, L.A.

    2012-01-01

    Municipal solid waste landfills are very complex and heterogeneous systems. The waste in a landfill body is a heterogeneous mixture of a wide range of materials containing high levels of organic matter, high amounts of salts and a wide range of different organic and inorganic substances, such as

  15. A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration

    National Research Council Canada - National Science Library

    Jean, Buddy H; Younker, John; Hung, Chih-Cheng

    2004-01-01

    .... This new technology will integrate multi-sensor information and extract integrated multi-sensor information to detect, track and identify multiple targets at any time, in any place under all weather conditions...

  16. Integrating SAP to Information Systems Curriculum: Design and Delivery

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    Information Systems (IS) education is being transformed from segmented applications toward the integrated enterprise-wide system software of Enterprise Resource Planning (ERP). ERP is a platform that integrates all business functions through a centralized data repository shared by all business operations in the enterprise. This tremendous…

  17. Integrated system of production information processing for surface mines

    Energy Technology Data Exchange (ETDEWEB)

    Li, K.; Wang, S.; Zeng, Z.; Wei, J.; Ren, Z. [China University of Mining and Technology, Xuzhou (China). Dept of Mining Engineering

    2000-09-01

    Based on the concepts of geostatistics, mathematical programming, conditional simulation and systems engineering, and on the features and duties of each main department in surface mine production, an integrated system for surface mine production information was studied systematically and developed using data warehousing, CAD, object-oriented and system integration technologies, leading to the systematization and automation of information management, data processing, optimization computing and plotting. In this paper, its overall objective, system design, structure and functions, and some key techniques are described. 2 refs., 3 figs.

  18. The integration of Information and Communication Technology into nursing.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Torrent, Joan; Ficapal, Pilar

    2011-02-01

    To identify and characterise different profiles of nurses' utilization of Information and Communication Technology (ICT) and the Internet, and to identify factors that can enhance or inhibit the use of these technologies within nursing. An online survey of the 13,588 members of the Nurses Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and a binomial logit model were undertaken. Although most of the nurses (76.70%) use the Internet within their daily work, multivariate statistical analysis revealed two profiles of ICT adoption. The first profile (4.58%) represents those nurses who value ICT and the Internet so that it forms an integral part of their practice. This group is thus referred to as 'integrated nurses'. The second profile (95.42%) represents those nurses who place less emphasis on ICT and the Internet and are consequently labelled 'non-integrated nurses'. From the statistical modelling, it was observed that undertaking research activities, an emphasis on international information, and a belief that health information available on the Internet is 'very relevant' play a positive and significant role in the probability of being an integrated nurse. The emerging world of the 'integrated nurse' cannot be adequately understood without examining how nurses make use of ICT and the Internet within nursing practice and the way this is shaped by institutional, technical and professional opportunities and constraints. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Integration of Landscape Metrics and Variograms to Characterize and Quantify the Spatial Heterogeneity Change of Vegetation Induced by the 2008 Wenchuan Earthquake

    Directory of Open Access Journals (Sweden)

    Ling Wang

    2017-06-01

    Full Text Available The quantification of spatial heterogeneity can be used to examine the structure of ecological systems. The 2008 Wenchuan earthquake caused severe vegetation damage. In addition to simply detecting change, the magnitude of changes must also be examined. Remote sensing and geographic information system techniques were used to produce landscape maps before and after the earthquake and analyze the spatial-temporal change of the vegetation pattern. Landscape metrics were selected to quantify the spatial heterogeneity in a categorical map at both the class and landscape levels. The results reveal that the Wenchuan earthquake greatly increased the heterogeneity in the study area. In particular, forests experienced the most fragmentation among all of the landscape types. In addition, spatial heterogeneity in a numerical map was studied by using variogram analysis of normalized difference vegetation indices derived from Landsat images. In comparison to before the earthquake, the spatial variability after the earthquake had doubled. The structure of the spatial heterogeneity represented by the range of normalized difference vegetation index (NDVI variograms also changed due to the earthquake. Moreover, the results of the NDVI variogram analysis of three contrasting landscapes, which were farmland, broadleaved forest, and coniferous forest, confirm that the earthquake produced spatial variability and changed the structure of the landscapes. Regardless of before or after the earthquake, farmland sites are the most heterogeneous among the three landscapes studied.
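
    The sketch below shows, on synthetic data, the kind of empirical variogram computation the NDVI analysis relies on: semivariance is estimated over binned pixel-pair separations, and the lag at which it levels off (the range) summarizes the spatial structure. Bin widths and the random-pair sampling are illustrative choices, not those of the study.

```python
# Minimal sketch of an empirical (isotropic) variogram for an NDVI raster;
# synthetic data stands in for the Landsat-derived NDVI used in the study.
import numpy as np

rng = np.random.default_rng(1)
ndvi = rng.normal(0.5, 0.1, size=(60, 60))          # hypothetical NDVI grid

# Sample pixel pairs and bin half squared differences by separation distance.
rows, cols = np.indices(ndvi.shape)
coords = np.column_stack([rows.ravel(), cols.ravel()])
values = ndvi.ravel()
idx = rng.choice(len(values), size=(5000, 2))        # random pixel pairs
d = np.linalg.norm(coords[idx[:, 0]] - coords[idx[:, 1]], axis=1)
sq = 0.5 * (values[idx[:, 0]] - values[idx[:, 1]]) ** 2

bins = np.arange(1, 31, 2)
gamma = [sq[(d >= b) & (d < b + 2)].mean() for b in bins]
for lag, g in zip(bins + 1, gamma):
    print(f"lag {lag:2d} px  semivariance {g:.4f}")
# The lag at which the semivariance levels off (the range) summarizes the
# spatial structure; comparing pre- and post-earthquake variograms quantifies
# how the earthquake changed landscape heterogeneity.
```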

  20. GeneNotes – A novel information management software for biologists

    Directory of Open Access Journals (Sweden)

    Wong Wing H

    2005-02-01

    Full Text Available Abstract Background Collecting and managing information is a challenging task in a genome-wide profiling research project. Most databases and online computational tools require direct human involvement. Information and computational results are presented in various multimedia formats (e.g., text, image, PDF, Word files, etc.), many of which cannot be automatically processed by computers in biologically meaningful ways. In addition, the quality of computational results is far from perfect and requires nontrivial manual examination. The timely selection, integration and interpretation of heterogeneous biological information still heavily rely on the sensibility of biologists. Biologists often feel overwhelmed by the huge amount and the great diversity of distributed heterogeneous biological information. Description We developed an information management application called GeneNotes. GeneNotes is the first application that allows users to collect and manage multimedia biological information about genes/ESTs. GeneNotes provides an integrated environment for users to surf the Internet, collect notes for genes/ESTs, and retrieve notes. GeneNotes is supported by a server that integrates gene annotations from many major databases (e.g., HGNC, MGI, etc.). GeneNotes uses the integrated gene annotations to (a) identify genes given various types of gene IDs (e.g., RefSeq ID, GenBank ID, etc.), and (b) provide quick views of genes. GeneNotes is free for academic usage. The program and the tutorials are available at: http://bayes.fas.harvard.edu/genenotes/. Conclusions GeneNotes provides a novel human-computer interface to assist researchers to collect and manage biological information. It also provides a platform for studying how users behave when they manipulate biological information. The results of such study can lead to innovation of more intelligent human-computer interfaces that greatly shorten the cycle of biology research.

  1. Information Technology Integration in Higher Education: A Novel Approach for Impact Assessment

    Directory of Open Access Journals (Sweden)

    Abdulkareem Al-Alwani

    2014-12-01

    Full Text Available In the current technological world of information services, academic systems are also in the process of adopting information technology solutions. Information systems vary for different applications, and in the academic domain specifically, a range of information systems is available to different institutions worldwide. Integration of e-learning can optimize the implementation of computer-based and computer-assisted educational processes at all levels. It is therefore imperative to assess and evaluate the integration of these information systems, because they have a serious impact on e-learning processes. In this study, a survey instrument is presented for evaluating the integration of information technology systems and practices in an educational environment. The survey is constructed using descriptive questions related to information technology tools to assess the qualitative impact and usage of such tools. Critical feedback, analysis and suggestions from 25 educationists played a pivotal role in finalizing the proposed survey questionnaire. A subsequent test evaluation by teachers and students was also carried out to assess the utilization of information systems at Yanbu University College. The results showed that feedback from this survey can help in identifying technological gaps and facilitate effective integration of information technology in an educational environment. The survey instrument proposed in this research can greatly enhance the integration of IT tools, as it can identify shortcomings by collecting statistical data from the feedback of both faculty and students. Solutions to these problems are deterministic and can easily be implemented to optimize the overall performance of e-learning systems.

  2. Quality and integration of public health information systems: A systematic review focused on immunization and vital records systems.

    Science.gov (United States)

    Vest, Joshua R; Kirk, Hilary M; Issel, L Michele

    2012-01-01

    Public health professionals rely on quantitative data for the daily practice of public health as well as organizational decision making and planning. However, several factors work against effective data sharing among public health agencies in the US. This review characterizes the reported barriers and enablers of effective use of public health IS from an informatics perspective. A systematic review of the English language literature for 2005 to 2011 followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) format. The review focused on immunization information systems (IIS) and vital records information systems (VRIS). Systems were described according to the structural aspects of IS integration and data quality. Articles describing IIS documented issues pertaining to the distribution of the system, the autonomy of the data providers, the heterogeneous nature of information sharing as well as the quality of the data. Articles describing VRIS were focused much more heavily on data quality, particularly whether or not the data were free from errors. For state and local practitioners to effectively utilize data, public health IS will have to overcome the challenges posed by a large number of autonomous data providers utilizing a variety of technologies.

  3. Heterogeneous Gossip

    Science.gov (United States)

    Frey, Davide; Guerraoui, Rachid; Kermarrec, Anne-Marie; Koldehofe, Boris; Mogensen, Martin; Monod, Maxime; Quéma, Vivien

    Gossip-based information dissemination protocols are considered easy to deploy, scalable and resilient to network dynamics. Load-balancing is inherent in these protocols as the dissemination work is evenly spread among all nodes. Yet, large-scale distributed systems are usually heterogeneous with respect to network capabilities such as bandwidth. In practice, a blind load-balancing strategy might significantly hamper the performance of the gossip dissemination.

  4. How to measure genetic heterogeneity

    International Nuclear Information System (INIS)

    Yamada, Ryo

    2009-01-01

    Genetic information of organisms is coded as a string of four letters, A, T, G and C, a sequence in macromolecules called deoxyribonucleic acid (DNA). The DNA sequence offers the blueprint of organisms, and its heterogeneity determines the identity and variation of species. The quantification of this genetic heterogeneity is fundamental to understanding biology. We compared three previously reported measures: the covariance-matrix expression of a list of loci (pairwise r²), the most popular index in genetics; its multi-dimensional form, Ψ; and an entropy-based index, ε. We then proposed two methods so that we could handle diplotypic heterogeneity and quantify conditions where the number of DNA sequence samples is much smaller than the number of possible variants.
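
    As a hedged illustration of two of the compared quantities (with made-up haplotype counts, and not necessarily the exact definitions used in the paper), the snippet below computes the pairwise linkage-disequilibrium r² between two biallelic loci and a normalized entropy over haplotype frequencies.

```python
# Hedged sketch of two heterogeneity measures (hypothetical data): pairwise
# linkage-disequilibrium r^2 between two biallelic loci, and a normalized
# entropy over haplotype frequencies as a heterogeneity index.
import numpy as np

# Haplotype counts for two loci with alleles 0/1: order 00, 01, 10, 11.
counts = np.array([40, 10, 5, 45], dtype=float)
p = counts / counts.sum()

pA = p[2] + p[3]            # freq of allele 1 at locus 1
pB = p[1] + p[3]            # freq of allele 1 at locus 2
D = p[3] - pA * pB          # LD coefficient
r2 = D**2 / (pA * (1 - pA) * pB * (1 - pB))

nonzero = p[p > 0]
entropy = -(nonzero * np.log(nonzero)).sum()
entropy_index = entropy / np.log(len(p))   # 0 = one haplotype, 1 = uniform

print(f"pairwise r^2 = {r2:.3f}, entropy-based index = {entropy_index:.3f}")
```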

  5. Functional Heterogeneity and Senior Management Team Effectiveness

    Science.gov (United States)

    Benoliel, Pascale; Somech, Anit

    2016-01-01

    Purpose: There has been an increasing trend toward the creation of senior management teams (SMTs) which are characterized by a high degree of functional heterogeneity. Although such teams may create better linkages to information, along with the benefits of functional heterogeneity comes the potential for conflicts that stem from the value…

  6. Upscaling of permeability heterogeneities in reservoir rocks; an integrated approach

    NARCIS (Netherlands)

    Mikes, D.

    2002-01-01

    This thesis presents a hierarchical and geologically constrained deterministic approach to incorporate small-scale heterogeneities into reservoir flow simulators. We use a hierarchical structure to encompass all scales from laminae to an entire depositional system. For the geological models under

  7. The Integrated Information System for Natural Disaster Mitigation

    Directory of Open Access Journals (Sweden)

    Junxiu Wu

    2007-08-01

    Full Text Available Supported by the World Bank, the Integrated Information System for Natural Disaster Mitigation (ISNDM), including the operational service system and network telecommunication system, has been in development for three years in the Center of Disaster Reduction, Chinese Academy of Sciences, based on the platform of the GIS software Arcview. It has five main modules: disaster background information, socio-economic information, disaster-induced factors database, disaster scenarios database, and disaster assessment. ISNDM has several significant functions, which include information collection, information processing, data storage, and information distribution. It is a simple but comprehensive demonstration system for our national center for natural disaster reduction.

  8. Automated granularity to integrate digital information: the "Antarctic Treaty Searchable Database" case study

    Directory of Open Access Journals (Sweden)

    Paul Arthur Berkman

    2006-06-01

    Full Text Available Access to information is necessary, but not sufficient in our digital era. The challenge is to objectively integrate digital resources based on user-defined objectives for the purpose of discovering information relationships that facilitate interpretations and decision making. The Antarctic Treaty Searchable Database (http://aspire.nvi.net), which is in its sixth edition, provides an example of digital integration based on the automated generation of information granules that can be dynamically combined to reveal objective relationships within and between digital information resources. This case study further demonstrates that automated granularity and dynamic integration can be accomplished simply by utilizing the inherent structure of the digital information resources. Such information integration is relevant to library and archival programs that require long-term preservation of authentic digital resources.

  9. Assessing Extinction Risk: Integrating Genetic Information

    Directory of Open Access Journals (Sweden)

    Jason Dunham

    1999-06-01

    Full Text Available Risks of population extinction have been estimated using a variety of methods incorporating information from different spatial and temporal scales. We briefly consider how several broad classes of extinction risk assessments, including population viability analysis, incidence functions, and ranking methods, integrate information on different temporal and spatial scales. In many circumstances, data from surveys of neutral genetic variability within and among populations can provide information useful for assessing extinction risk. Patterns of genetic variability resulting from past and present ecological and demographic events can indicate risks of extinction that are otherwise difficult to infer from ecological and demographic analyses alone. We provide examples of how patterns of neutral genetic variability, both within and among populations, can be used to corroborate and complement extinction risk assessments.

  10. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received a great deal of attention over the last two decades. Many researchers have tackled these areas individually. However, information extraction systems should be integrated with data integration

  11. A Geospatial Information Grid Framework for Geological Survey

    OpenAIRE

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of ...

  12. Joint sparsity based heterogeneous data-level fusion for target detection and estimation

    Science.gov (United States)

    Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe

    2017-05-01

    Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.
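
    The following is a deliberately simplified sketch of the data-level fusion idea, not the paper's joint-sparse recovery algorithm: both sensors are modelled as observing linear functions of the same sparse occupancy vector over a discretized grid, so stacking their raw measurements and solving a single sparse recovery problem (here with an ordinary l1 solver) fuses them at the data level. All measurement matrices are synthetic.

```python
# Simplified sketch of data-level fusion on a discretized target grid: both
# sensors observe linear functions of the same sparse occupancy vector, so
# stacking their measurement models and solving one sparse recovery problem
# fuses them at the data level. (An l1 solver stands in for the paper's
# joint-sparse recovery algorithm; all matrices here are synthetic.)
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_cells = 200                      # discretized target state space
x_true = np.zeros(n_cells)
x_true[[37, 151]] = 1.0            # two targets occupy two grid cells

A_rf = rng.normal(size=(40, n_cells))      # RF sensor measurement model
A_cam = rng.normal(size=(60, n_cells))     # camera measurement model
y_rf = A_rf @ x_true + 0.05 * rng.normal(size=40)
y_cam = A_cam @ x_true + 0.05 * rng.normal(size=60)

# Data-level fusion: stack raw measurements from both sensors.
A = np.vstack([A_rf, A_cam])
y = np.concatenate([y_rf, y_cam])
x_hat = Lasso(alpha=0.05, max_iter=50000).fit(A, y).coef_

print("true cells:", np.flatnonzero(x_true))
print("top estimated cells:", np.argsort(-np.abs(x_hat))[:2])
```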

  13. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    Science.gov (United States)

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of a semantic data integration environment which is part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.

  14. A Framework for Understanding Post-Merger Information Systems Integration

    DEFF Research Database (Denmark)

    Alaranta, Maria; Kautz, Karlheinz

    2012-01-01

    This paper develops a theoretical framework for the integration of information systems (IS) after a merger or an acquisition. The framework integrates three perspectives: a structuralist, an individualist, and an interactive process perspective to analyze and understand such integrations. The framework is applied to a longitudinal case study of a manufacturing company that grew through an acquisition. The management decided to integrate the production control IS via tailoring a new system that blends together features of existing IS. The application of the framework in the case study confirms several known impediments to IS integrations. It also identifies a number of new inhibitors, as well as known and new facilitators that can bring post-merger IS integration to a success. Our findings provide relevant insights to researching and managing post-merger IS integrations. They emphasize…

  15. Monitoring technology and firm boundaries: physician-hospital integration and technology utilization.

    Science.gov (United States)

    McCullough, Jeffrey S; Snir, Eli M

    2010-05-01

    We study the relationship between physician-hospital integration and monitoring IT utilization. We develop a theoretical model in which monitoring IT may complement or substitute for integration and test these relationships using a novel data source. Physician labor market heterogeneity identifies the empirical model. We find that monitoring IT utilization is increasing in integration, implying that expanded firm boundaries complement monitoring IT adoption. We argue that the relationship between monitoring IT and firm boundaries depends upon the contractibility of the monitored information.

  16. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    OpenAIRE

    Mohsen Shafiei Nikabadi; Ahmad Jafarian; Azam Jalili Bolhasani

    2012-01-01

    The major purpose of this article was to examine how information security management affects supply chain integration, and the effect of implementing an "information security management system" on enhancing supply chain integration. In this respect, the current research sought a combined overview of these two approaches (information security management, and organizational process integration through an enterprise resource planning system) and subsequently determined the factors of these two import...

  17. Multiscale Characterization of Structural Compositional and Textural Heterogeneity of Nano-porous Geomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Hongkyu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geomechanics Dept.

    2017-09-01

    The purpose of the project was to perform multiscale characterization of low permeability rocks to determine the effect of physical and chemical heterogeneity on the poromechanical and flow responses of shales and carbonate rocks with a broad range of physical and chemical heterogeneity. An integrated multiscale imaging of shale and carbonate rocks from nanometer to centimeter scales includes dual focused ion beam-scanning electron microscopy (FIB-SEM), micro computed tomography (micro-CT), optical and confocal microscopy, and 2D and 3D energy dispersive spectroscopy (EDS). In addition, mineralogical mapping and backscattered imaging with nanoindentation testing advanced the quantitative evaluation of the relationship between material heterogeneity and mechanical behavior. The spatial distribution of compositional heterogeneity, anisotropic bedding patterns, and mechanical anisotropy were employed as inputs for brittle fracture simulations using a phase field model. Comparison of experimental and numerical simulations revealed that proper incorporation of additional material information, such as bedding layer thickness and other geometrical attributes of the microstructures, can yield improvements on the numerical prediction of the mesoscale fracture patterns and hence the macroscopic effective toughness. Overall, a comprehensive framework to evaluate the relationship between mechanical response and micro-lithofacial features can allow us to make more accurate prediction of reservoir performance by developing a multi-scale understanding of poromechanical response to coupled chemical and mechanical interactions for subsurface energy related activities.

  18. An entropy approach to size and variance heterogeneity

    NARCIS (Netherlands)

    Balasubramanyan, L.; Stefanou, S.E.; Stokes, J.R.

    2012-01-01

    In this paper, we investigate the effect of bank size differences on cost efficiency heterogeneity using a heteroskedastic stochastic frontier model. This model is implemented by using an information theoretic maximum entropy approach. We explicitly model both bank size and variance heterogeneity

  19. Heterogeneous slip and rupture models of the San Andreas fault zone based upon three-dimensional earthquake tomography

    Energy Technology Data Exchange (ETDEWEB)

    Foxall, William [Univ. of California, Berkeley, CA (United States)

    1992-11-01

    Crustal fault zones exhibit spatially heterogeneous slip behavior at all scales, slip being partitioned between stable frictional sliding, or fault creep, and unstable earthquake rupture. An understanding of the mechanisms underlying slip segmentation is fundamental to research into fault dynamics and the physics of earthquake generation. This thesis investigates the influence that large-scale along-strike heterogeneity in fault zone lithology has on slip segmentation. Large-scale transitions from the stable block sliding of the central creeping section of the San Andreas fault to the locked 1906 and 1857 earthquake segments take place along the Loma Prieta and Parkfield sections of the fault, respectively, the transitions being accomplished in part by the generation of earthquakes in the magnitude range 6 (Parkfield) to 7 (Loma Prieta). Information on sub-surface lithology interpreted from the Loma Prieta and Parkfield three-dimensional crustal velocity models computed by Michelini (1991) is integrated with information on slip behavior provided by the distributions of earthquakes located using the three-dimensional models and by surface creep data to study the relationships between large-scale lithological heterogeneity and slip segmentation along these two sections of the fault zone.

  20. Analysis of Factors Affect to Organizational Performance In Using Accounting Information Systems Through Users Satisfaction and Integration Information Systems

    Directory of Open Access Journals (Sweden)

    Anton Arisman

    2017-09-01

    Full Text Available The aim of this research is to investigate the factors affecting organizational performance in using accounting information systems, through user satisfaction and integration of information systems. The research respondents were 447 companies listed on the Indonesian Stock Exchange. The data were gathered using a census method, and in total there are 176 responses with complete data. Structural Equation Modeling (SEM) is used in analyzing the data, and systems theory is utilized in this research. The results show that knowledge management systems and management control systems have a significant influence on user satisfaction and integration of information systems. Integration of information systems and user satisfaction have a positive, significant effect on organizational performance.

  1. A Regression-based K nearest neighbor algorithm for gene function prediction from heterogeneous data

    Directory of Open Access Journals (Sweden)

    Ruzzo Walter L

    2006-03-01

    Full Text Available Abstract Background As a variety of functional genomic and proteomic techniques become available, there is an increasing need for functional analysis methodologies that integrate heterogeneous data sources. Methods In this paper, we address this issue by proposing a general framework for gene function prediction based on the k-nearest-neighbor (KNN) algorithm. The choice of KNN is motivated by its simplicity, flexibility to incorporate different data types and adaptability to irregular feature spaces. A weakness of traditional KNN methods, especially when handling heterogeneous data, is that performance is subject to the often ad hoc choice of similarity metric. To address this weakness, we apply regression methods to infer a similarity metric as a weighted combination of a set of base similarity measures, which helps to locate the neighbors that are most likely to be in the same class as the target gene. We also suggest a novel voting scheme to generate confidence scores that estimate the accuracy of predictions. The method gracefully extends to multi-way classification problems. Results We apply this technique to gene function prediction according to three well-known Escherichia coli classification schemes suggested by biologists, using information derived from microarray and genome sequencing data. We demonstrate that our algorithm dramatically outperforms naive KNN methods and is competitive with support vector machine (SVM) algorithms for integrating heterogeneous data. We also show that by combining different data sources, prediction accuracy can improve significantly. Conclusion Our extension of KNN with automatic feature weighting, multi-class prediction, and probabilistic inference enhances prediction accuracy significantly while remaining efficient, intuitive and flexible. This general framework can also be applied to similar classification problems involving heterogeneous datasets.
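
    The sketch below reproduces the general recipe on synthetic data (it is not the paper's E. coli pipeline, and the regression and voting details are simplified): weights for several base similarity measures are learned by regressing a same-class indicator on them over gene pairs, and a gene is then classified by a k-nearest-neighbor vote under the learned combined similarity.

```python
# Sketch of the general recipe (hypothetical data, not the paper's E. coli
# pipeline): learn weights for base similarity measures by regressing a
# "same functional class" indicator on them over gene pairs, then classify a
# gene by a k-nearest-neighbor vote under the learned combined similarity.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n_genes, n_classes = 60, 3
labels = rng.integers(0, n_classes, size=n_genes)

# Two base similarity matrices, e.g. expression correlation and sequence
# similarity; here synthetic, with the first one more informative.
def noisy_similarity(informative):
    s = (labels[:, None] == labels[None, :]).astype(float)
    s = s + rng.normal(0, 0.3 if informative else 1.0, size=s.shape)
    return (s + s.T) / 2

S = [noisy_similarity(True), noisy_similarity(False)]

# Regress the same-class indicator on the base similarities over gene pairs.
pairs = list(combinations(range(n_genes), 2))
X = np.array([[Sk[i, j] for Sk in S] for i, j in pairs])
y = np.array([float(labels[i] == labels[j]) for i, j in pairs])
w, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)
weights = w[1:]                       # learned weight per base similarity

combined = sum(wk * Sk for wk, Sk in zip(weights, S))

def knn_predict(gene, k=5):
    order = np.argsort(-combined[gene])
    neighbors = [g for g in order if g != gene][:k]
    votes = np.bincount(labels[neighbors], minlength=n_classes)
    return votes.argmax()

acc = np.mean([knn_predict(g) == labels[g] for g in range(n_genes)])
print(f"learned weights {weights.round(2)}, leave-in KNN accuracy {acc:.2f}")
```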

  2. On the effects of multimodal information integration in multitasking.

    Science.gov (United States)

    Stock, Ann-Kathrin; Gohil, Krutika; Huster, René J; Beste, Christian

    2017-07-07

    There have recently been considerable advances in our understanding of the neuronal mechanisms underlying multitasking, but the role of multimodal integration for this faculty has remained rather unclear. We examined this issue by comparing different modality combinations in a multitasking (stop-change) paradigm. In-depth neurophysiological analyses of event-related potentials (ERPs) were conducted to complement the obtained behavioral data. Specifically, we applied signal decomposition using second order blind identification (SOBI) to the multi-subject ERP data and source localization. We found that both general multimodal information integration and modality-specific aspects (potentially related to task difficulty) modulate behavioral performance and associated neurophysiological correlates. Simultaneous multimodal input generally increased early attentional processing of visual stimuli (i.e. P1 and N1 amplitudes) as well as measures of cognitive effort and conflict (i.e. central P3 amplitudes). Yet, tactile-visual input caused larger impairments in multitasking than audio-visual input. General aspects of multimodal information integration modulated the activity in the premotor cortex (BA 6) as well as different visual association areas concerned with the integration of visual information with input from other modalities (BA 19, BA 21, BA 37). On top of this, differences in the specific combination of modalities also affected performance and measures of conflict/effort originating in prefrontal regions (BA 6).

  3. FEMA's Integrated Emergency Management Information System (IEMIS)

    International Nuclear Information System (INIS)

    Jaske, R.T.; Meitzler, W.

    1987-01-01

    FEMA is implementing a computerized system for use in optimizing planning, and for supporting exercises of these plans. Called the Integrated Emergency Management Information System (IEMIS), it consists of a base geographic information system upon which analytical models are superimposed in order to load data and report results analytically. At present, it supports FEMA's work in offsite preparedness around nuclear power stations, but is being developed to deal with a full range of natural and technological accident hazards for which emergency evacuation or population movement is required

  4. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/Corba technolgy.

  5. Identifying and quantifying heterogeneity in high content analysis: application of heterogeneity indices to drug discovery.

    Directory of Open Access Journals (Sweden)

    Albert H Gough

    Full Text Available One of the greatest challenges in biomedical research, drug discovery and diagnostics is understanding how seemingly identical cells can respond differently to perturbagens, including drugs for disease treatment. Although heterogeneity has become an accepted characteristic of a population of cells, in drug discovery it is not routinely evaluated or reported. The standard practice for cell-based, high content assays has been to assume a normal distribution and to report a well-to-well average value with a standard deviation. To address this important issue, we sought to define a method that could be readily implemented to identify, quantify and characterize heterogeneity in cellular and small organism assays to guide decisions during drug discovery and experimental cell/tissue profiling. Our study revealed that heterogeneity can be effectively identified and quantified with three indices that indicate diversity, non-normality and percent outliers. The indices were evaluated using the induction and inhibition of STAT3 activation in five cell lines where the system's response, including sample preparation and instrument performance, was well characterized and controlled. These heterogeneity indices provide a standardized method that can easily be integrated into small and large scale screening or profiling projects to guide interpretation of the biology, as well as the development of therapeutics and diagnostics. Understanding the heterogeneity in the response to perturbagens will become a critical factor in designing strategies for the development of therapeutics, including targeted polypharmacology.
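
    The snippet below gives one plausible reading of the three kinds of indices (the paper's exact definitions may differ) computed on a synthetic per-well distribution of a single-cell readout: a diversity index from a normalized histogram entropy, a non-normality index from a Kolmogorov-Smirnov distance to a fitted normal, and a percent-outlier index from an interquartile-range rule.

```python
# Plausible sketch of the three kinds of indices (diversity, non-normality,
# percent outliers) computed on a per-well distribution of a single-cell
# readout; the exact definitions in the paper may differ, and the data here
# are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# A bimodal well: two subpopulations with different STAT3 activation levels.
cells = np.concatenate([rng.normal(1.0, 0.2, 700), rng.normal(2.5, 0.3, 300)])

# 1) Diversity: normalized Shannon entropy of a histogram of the readout.
hist, _ = np.histogram(cells, bins=20)
p = hist[hist > 0] / hist.sum()
diversity = -(p * np.log(p)).sum() / np.log(len(hist))

# 2) Non-normality: Kolmogorov-Smirnov distance from a fitted normal.
ks_stat, _ = stats.kstest(cells, "norm", args=(cells.mean(), cells.std()))

# 3) Percent outliers: fraction of cells beyond 1.5 * IQR of the quartiles.
q1, q3 = np.percentile(cells, [25, 75])
iqr = q3 - q1
outliers = np.mean((cells < q1 - 1.5 * iqr) | (cells > q3 + 1.5 * iqr)) * 100

print(f"diversity {diversity:.2f}, KS non-normality {ks_stat:.2f}, outliers {outliers:.1f}%")
```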

  6. Heterogeneous inflation expectations and learning

    OpenAIRE

    Madeira, Carlos; Zafar, Basit

    2012-01-01

    Using the panel component of the Michigan Survey of Consumers, we estimate a learning model of inflation expectations, allowing for heterogeneous use of both private information and lifetime inflation experience. “Life-experience inflation” has a significant impact on individual expectations, but only for one-year-ahead inflation. Public information is substantially more relevant for longer-horizon expectations. Even controlling for life-experience inflation and public information, idiosyncra...
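
    A stylized sketch of the learning-from-experience mechanism (not the authors' estimated model): each cohort updates its expectation with a gain that depends on how many inflation observations it has lived through, so cohorts exposed to different inflation histories end up with heterogeneous expectations.

```python
# Stylized sketch of learning-from-experience updating (not the authors'
# estimated model): each cohort updates its inflation expectation with a gain
# that shrinks with lived experience, so young and old respondents end up with
# different expectations even when they see the same recent data.
import numpy as np

rng = np.random.default_rng(6)
inflation = np.concatenate([rng.normal(8, 1, 20), rng.normal(2, 1, 20)])  # high then low regime

def experience_expectation(birth_period):
    exp_pi, n = 0.0, 0
    for t in range(birth_period, len(inflation)):
        n += 1
        gain = 1.0 / n                      # decreasing gain over lived experience
        exp_pi += gain * (inflation[t] - exp_pi)
    return exp_pi

print("older cohort:", round(experience_expectation(0), 2))    # lived both regimes
print("younger cohort:", round(experience_expectation(25), 2)) # lived only low inflation
```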

  7. Exploring heterogeneous market hypothesis using realized volatility

    Science.gov (United States)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining the market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
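
    The heterogeneous autoregressive (HAR-RV) regression referred to above can be sketched as follows on simulated data: next-day realized volatility is regressed on daily, weekly (5-day) and monthly (22-day) average realized volatility, which proxy for traders operating at different horizons. The simulated series and coefficient values are illustrative only.

```python
# Sketch of the heterogeneous autoregressive (HAR-RV) regression motivated by
# the heterogeneous market hypothesis: tomorrow's realized volatility is
# regressed on daily, weekly (5-day) and monthly (22-day) average RV, proxies
# for traders with different horizons. Data here are simulated.
import numpy as np

rng = np.random.default_rng(5)
T = 1500
rv = np.abs(rng.normal(1.0, 0.3, T))
for t in range(22, T):   # inject persistence so the HAR terms matter
    rv[t] = (0.3 * rv[t - 1] + 0.4 * rv[t - 5:t].mean()
             + 0.2 * rv[t - 22:t].mean() + 0.1 * np.abs(rng.normal(1.0, 0.3)))

rows = []
for t in range(22, T - 1):
    rows.append([1.0, rv[t], rv[t - 4:t + 1].mean(), rv[t - 21:t + 1].mean(), rv[t + 1]])
rows = np.array(rows)
X, y = rows[:, :4], rows[:, 4]

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("HAR coefficients (const, daily, weekly, monthly):", beta.round(3))
```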

  8. Resource externalities and the persistence of heterogeneous pricing behavior in an energy commodity market

    International Nuclear Information System (INIS)

    Bunn, Derek; Koc, Veli; Sapio, Alessandro

    2015-01-01

    In competitive product markets, repeated interaction among producers with similar economic characteristics would be expected to result in convergence of their behaviors. If convergence does not occur, it raises fundamental questions related to the sustainability of heterogeneous competitive strategies. This paper examines the prices submitted to the British wholesale electricity market by four coal-fired plants, separately owned, approximately of the same age, size and efficiency, and located in the same transmission network zone. Due to the repetitive nature of the spot market, one would expect convergence in strategies. Yet, we find evidence of persistent price dispersion and heterogeneous strategies. We consider several propositions for these effects including market power, company size, forward commitments, vertical integration and the management of interrelated assets. - Highlights: • Time series models of offer prices from 4 companies, UK electricity spot market • Focus on coal-fired plants of similar size, efficiency, age, same network zone • Low, less volatile offers by small, not vertically integrated, only-coal company • Operational risks of nuclear plants in a portfolio imply finer tracking of PX prices • Market leadership from private information on the Anglo-French interconnector flows

  9. ABOUT APPROACHES OF CREATION OF INTEGRATED INFORMATION SYSTEM PDM-ERP

    Directory of Open Access Journals (Sweden)

    V. G. Mikhailov

    2016-01-01

    Full Text Available The problems that have accumulated in the field of creating PDM systems and integrating them with ERP are considered. The reasons for the low efficiency of existing PDM systems are analyzed: insufficient primary information entered into the PDM unit, database structures, entry of designations in a single field, and the referential approach to maintaining product composition, which lowers functionality and creates problems with ERP integration. It is shown that enterprises with a full production cycle need a unified integrated information system built on common databases, one that uses the part-BOM-unit card as its primary document instead of a file. Unlike existing databases, this requires a general-purpose structure into which any information can be entered. The implementation of a new system, CDRP, which unites PDM and ERP functionality and covers the enterprise's basic needs, is proposed.

  10. Unifying Kohlberg with Information Integration: The Moral Algebra of Recompense and of Kohlbergian Moral Informers

    Directory of Open Access Journals (Sweden)

    Wilfried Hommers

    2010-01-01

    Full Text Available In order to unify two major theories of moral judgment, a novel task is employed which combines elements of Kohlberg's stage theory and of the theory of information integration. In contrast to the format of Kohlberg's moral judgment interview, a nonverbal and quantitative response which makes low demands on verbal facility was used. Moral informers differing in value, i.e. high and low, are presented. The differences in effect of those two pieces of information should be substantial for a person at that specific moral stage, but small for a person at a different stage. Hence, these differences may diagnose the person's moral stage in the simplest possible way, as the two levels of each of the thoughts were about typical content of the four Kohlbergian preconventional and conventional stages. The novel task additionally allowed measuring the influence of another moral concept, the non-Kohlbergian concept of recompense. After a training phase, pairs of those thoughts were presented to allow for the study of integration and individual differences. German and Korean children, 8, 10, and 12 years in age, judged deserved punishment. The patterns of means, correlations and factor loadings showed that elements of both theories can be unified, but also produced unexpected results. Additive integration appeared for each of the two pairs of moral informers, either with two Kohlbergian moral informers or with another Kohlbergian moral informer in combination with information about recompense. Cultural independence as well as dependence, developmental changes between 8 and 10 years, and an outstanding moral impact of recompense in size and distinctiveness were also observed.

  11. A computational method based on the integration of heterogeneous networks for predicting disease-gene associations.

    Directory of Open Access Journals (Sweden)

    Xingli Guo

    Full Text Available The identification of disease-causing genes is a fundamental challenge in human health and of great importance in improving medical care, and provides a better understanding of gene functions. Recent computational approaches based on the interactions among human proteins and disease similarities have shown their power in tackling the issue. In this paper, a novel systematic and global method that integrates two heterogeneous networks for prioritizing candidate disease-causing genes is provided, based on the observation that genes causing the same or similar diseases tend to lie close to one another in a network of protein-protein interactions. In this method, the association score function between a query disease and a candidate gene is defined as the weighted sum of all the association scores between similar diseases and neighbouring genes. Moreover, the topological correlation of these two heterogeneous networks can be incorporated into the definition of the score function, and finally an iterative algorithm is designed for this issue. This method was tested with 10-fold cross-validation on all 1,126 diseases that have at least a known causal gene, and it ranked the correct gene as one of the top ten in 622 of all the 1,428 cases, significantly outperforming a state-of-the-art method called PRINCE. The results brought about by this method were applied to study three multi-factorial disorders: breast cancer, Alzheimer disease and diabetes mellitus type 2, and some suggestions of novel causal genes and candidate disease-causing subnetworks were provided for further investigation.
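
    The snippet below is a hedged sketch of the kind of iterative scoring the abstract describes (the authors' exact update rule and normalization may differ): gene-disease association scores are repeatedly refreshed from neighbouring genes in a protein-protein interaction network and from similar diseases, anchored to the known associations, until they stabilize. The networks here are tiny synthetic examples.

```python
# Hedged sketch of iterative score propagation over two heterogeneous networks
# (not the authors' exact update rule): gene-disease scores are repeatedly
# refreshed from neighbouring genes (PPI network) and similar diseases,
# anchored to the known associations. All matrices are tiny synthetic examples.
import numpy as np

G = np.array([[0, 1, 1, 0],        # gene-gene (PPI) adjacency, 4 genes
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], float)
D = np.array([[1.0, 0.6],          # disease-disease similarity, 2 diseases
              [0.6, 1.0]])
A0 = np.array([[1.0, 0.0],         # known gene-disease associations (seeds)
               [0.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])

def row_normalize(M):
    s = M.sum(axis=1, keepdims=True)
    return np.divide(M, s, out=np.zeros_like(M), where=s > 0)

Gn, Dn = row_normalize(G), row_normalize(D)

alpha, A = 0.6, A0.copy()
for _ in range(50):
    # Each gene-disease score is a weighted sum of the scores of neighbouring
    # genes and of similar diseases, anchored to the known associations.
    A = alpha * Gn @ A @ Dn.T + (1 - alpha) * A0

print(np.round(A, 3))   # unseeded genes 1 and 3 now receive nonzero scores
```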

  12. Heterogeneous Information about the Term Structure of Interest rates, Least-Squares Learning and Optimal Interest Rate Rules for Inflation Forecast Targeting

    NARCIS (Netherlands)

    Schaling, E.; Eijffinger, S.C.W.; Tesfaselassie, M.F.

    2004-01-01

    In this paper we incorporate the term structure of interest rates in a standard inflation forecast targeting framework. Learning about the transmission process of monetary policy is introduced by having heterogeneous agents - i.e. the central bank and private agents - who have different information

  13. Accounting for response behavior heterogeneity in the measurement of attitudes: an application to demand for electric vehicles

    OpenAIRE

    Glerum, Aurélie; Bierlaire, Michel

    2012-01-01

    Hybrid choice models have proved to be a powerful framework that integrates attitudinal and perceptional data into discrete choice models. However, the measurement component of such a framework often fails to exploit individual-specific information that might affect the way subjects answer indicators of opinion. In this paper we propose an HCM with a measurement model that takes into account heterogeneity in response behavior. Specifically, we capture effects of exaggeration in answers to ...

  14. A framework for interactive visual analysis of heterogeneous marine data in an integrated problem solving environment

    Science.gov (United States)

    Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei

    2017-07-01

    This paper presents a novel integrated marine visualization framework which focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. In addition, a global multi-resolution virtual terrain environment is needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to interactively demonstrate marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the designed framework, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.

  15. Realizing Relevance: The Influence of Domain-Specific Information on Generation of New Knowledge through Integration in 4- to 8-Year-Old Children

    Science.gov (United States)

    Bauer, Patricia J.; Larkina, Marina

    2017-01-01

    In accumulating knowledge, direct modes of learning are complemented by productive processes, including self-generation based on integration of separate episodes. Effects of the number of potentially relevant episodes on integration were examined in 4- to 8-year-olds (N = 121; racially/ethnically heterogeneous sample, English speakers, from large…

  16. Assessment of the integrity of degraded steam generator tube by the use of heterogeneous finite element method

    International Nuclear Information System (INIS)

    Duan, X.; Kozluk, M.; Pagan, S.; Mills, B.

    2006-01-01

    Steam generator tubes at Ontario Power Generation (OPG) have been experiencing a variety of degradations such as pitting, fretting wear, erosion-corrosion, thinning and denting. To assist with steam generator life cycle management, OPG has developed Fitness-For-Service Guidelines (FFSG) for steam generator tubes. The FFSG are intended to provide standard acceptance criteria and evaluation procedures for assessing the condition of steam generator tubes for structural integrity, operational leak rate, and consequential leakage during an upset or abnormal event. Based on inspection results in conjunction with representative, postulated distributions of flaws in the un-inspected tubes, the FFSG provide an acceptable method of satisfying the intent of CSA-N285.4 and justifying the continued operation of degraded steam generator tubes. Some non-mandatory empirical axial and circumferential flaw models are also provided in the FFSG for structural integrity assessments. The test data from the OPG Steam Generator Tube Test Program (SGTTP) showed that the FFSG axial flaw model is conservative for a wide range of defect morphologies. A defect-specific axial flaw model was proposed for lattice-bar fret defects in I800 tubes by utilizing the SGTTP database of extensive test results. A defect-specific flaw model for outer diameter (OD) pitting and inner diameter (ID) intergranular attack in Monel 400 tubes was also developed using the SGTTP test data. More tests have been scheduled to support the development of defect specific models for axial flaws (OD cracks or ID laps) in Monel 400 and to supplement the database for Monel 400 pits. This paper explores the use of simulated testing for use in developing defect specific flaw models to reduce the amount of expensive tests. The Heterogeneous Finite Element Model (HFEM) has been developed and successfully applied to predict the failure behaviour of ductile metals under various deformation modes, i.e. plane stress, plane strain and

  17. Heterogeneous Multicore Parallel Programming for Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Francois Bodin

    2009-01-01

    Full Text Available Hybrid parallel multicore architectures based on graphics processing units (GPUs) can provide tremendous computing power. Current NVIDIA and AMD Graphics Product Group hardware display a peak performance of hundreds of gigaflops. However, exploiting GPUs from existing applications is a difficult task that requires non-portable rewriting of the code. In this paper, we present HMPP, a Heterogeneous Multicore Parallel Programming workbench with compilers, developed by CAPS entreprise, that allows the integration of heterogeneous hardware accelerators in an unintrusive manner while preserving the legacy code.

  18. Research on Heterogeneous Data Exchange based on XML

    Science.gov (United States)

    Li, Huanqin; Liu, Jinfeng

    Integration of multiple data sources is becoming increasingly important for enterprises that cooperate closely with their partners for e-commerce. OLAP enables analysts and decision makers fast access to various materialized views from data warehouses. However, many corporations have internal business applications deployed on different platforms. This paper introduces a model for heterogeneous data exchange based on XML. The system can exchange and share the data among the different sources. The method used to realize the heterogeneous data exchange is given in this paper.
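
    As a rough illustration of the XML-based exchange the abstract describes, the sketch below maps records from two heterogeneous sources onto a shared XML vocabulary. The field names, mapping tables and element names are invented for the example; the paper's actual exchange model is not reproduced here.

```python
import xml.etree.ElementTree as ET

# Records from two heterogeneous sources with different field names (illustrative).
source_a = {"cust_id": "A-17", "cust_name": "Acme GmbH", "balance": "1200.50"}
source_b = {"id": "B-03", "name": "Borealis Ltd", "amount_due": "980.00"}

# Per-source mapping onto a shared vocabulary used in the exchange documents.
MAPPINGS = {
    "A": {"cust_id": "customer_id", "cust_name": "customer_name", "balance": "open_amount"},
    "B": {"id": "customer_id", "name": "customer_name", "amount_due": "open_amount"},
}

def to_exchange_xml(record, source):
    """Convert a source-specific record into the common XML exchange format."""
    root = ET.Element("customer", attrib={"source": source})
    for local_field, common_field in MAPPINGS[source].items():
        ET.SubElement(root, common_field).text = record[local_field]
    return ET.tostring(root, encoding="unicode")

# Both sources now produce structurally identical XML that other systems can consume.
print(to_exchange_xml(source_a, "A"))
print(to_exchange_xml(source_b, "B"))
```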

  19. Specification of an integrated information architecture for a mobile teleoperated robot for home telecare.

    Science.gov (United States)

    Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois

    2016-12-01

    The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot providing remote assistance in the delivery of home health care. Three role classes related to the deployment of a telerobot were identified, namely engineer, technology integrator, and health professional. Patients and natural caregivers were indirectly considered, this being a component of future field studies. Interviewing representatives of each class provided the functions, as well as the information content and flows for each function. Interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and home care needs from medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative process of building the integrated architecture promoted insight among participants. The use-case scenario evaluation showed the design's robustness. A complex innovation such as a telerobot must coherently mesh with health-care service delivery needs. The deployment of an integrated information architecture bridging development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of robot design and novel health information design in the same integrated architecture, while accounting for patient ecological needs.

  20. A Critical Review of the Integration of Geographic Information System and Building Information Modelling at the Data Level

    Directory of Open Access Journals (Sweden)

    Junxiang Zhu

    2018-02-01

    Full Text Available The benefits brought by the integration of Building Information Modelling (BIM) and Geographic Information Systems (GIS) are being proved by more and more research. The integration of the two systems is difficult for many reasons. Among them, data incompatibility is the most significant, as BIM and GIS data are created, managed, analyzed, stored, and visualized in different ways in terms of coordinate systems, scope of interest, and data structures. The objective of this paper is to review the relevant research papers to (1) identify the most relevant data models used in BIM/GIS integration and understand their advantages and disadvantages; (2) consider the possibility of other data models that are available for data level integration; and (3) provide direction on the future of BIM/GIS data integration.

  1. Transient well flow in vertically heterogeneous aquifers.

    NARCIS (Netherlands)

    Hemker, C.J.

    1999-01-01

    A solution for the general problem of computing well flow in vertically heterogeneous aquifers is found by an integration of both analytical and numerical techniques. The radial component of flow is treated analytically; the drawdown is a continuous function of the distance to the well. The

  2. Representation and Integration of Scientific Information

    Science.gov (United States)

    1998-01-01

    The objective of this Joint Research Interchange with NASA-Ames was to investigate how the Tsimmis technology could be used to represent and integrate scientific information. The main goal of the Tsimmis project is to allow a decision maker to find information of interest from such sources, fuse it, and process it (e.g., summarize it, visualize it, discover trends). Another important goal is the easy incorporation of new sources, as well as the ability to deal with sources whose structure or services evolve. During the Interchange we had research meetings approximately every month or two. The funds provided by NASA supported work that led to the following two papers: Fusion Queries over Internet Databases; Efficient Query Subscription Processing in a Multicast Environment.

  3. Intertumoral Heterogeneity within Medulloblastoma Subgroups.

    Science.gov (United States)

    Cavalli, Florence M G; Remke, Marc; Rampasek, Ladislav; Peacock, John; Shih, David J H; Luu, Betty; Garzia, Livia; Torchia, Jonathon; Nor, Carolina; Morrissy, A Sorana; Agnihotri, Sameer; Thompson, Yuan Yao; Kuzan-Fischer, Claudia M; Farooq, Hamza; Isaev, Keren; Daniels, Craig; Cho, Byung-Kyu; Kim, Seung-Ki; Wang, Kyu-Chang; Lee, Ji Yeoun; Grajkowska, Wieslawa A; Perek-Polnik, Marta; Vasiljevic, Alexandre; Faure-Conter, Cecile; Jouvet, Anne; Giannini, Caterina; Nageswara Rao, Amulya A; Li, Kay Ka Wai; Ng, Ho-Keung; Eberhart, Charles G; Pollack, Ian F; Hamilton, Ronald L; Gillespie, G Yancey; Olson, James M; Leary, Sarah; Weiss, William A; Lach, Boleslaw; Chambless, Lola B; Thompson, Reid C; Cooper, Michael K; Vibhakar, Rajeev; Hauser, Peter; van Veelen, Marie-Lise C; Kros, Johan M; French, Pim J; Ra, Young Shin; Kumabe, Toshihiro; López-Aguilar, Enrique; Zitterbart, Karel; Sterba, Jaroslav; Finocchiaro, Gaetano; Massimino, Maura; Van Meir, Erwin G; Osuka, Satoru; Shofuda, Tomoko; Klekner, Almos; Zollo, Massimo; Leonard, Jeffrey R; Rubin, Joshua B; Jabado, Nada; Albrecht, Steffen; Mora, Jaume; Van Meter, Timothy E; Jung, Shin; Moore, Andrew S; Hallahan, Andrew R; Chan, Jennifer A; Tirapelli, Daniela P C; Carlotti, Carlos G; Fouladi, Maryam; Pimentel, José; Faria, Claudia C; Saad, Ali G; Massimi, Luca; Liau, Linda M; Wheeler, Helen; Nakamura, Hideo; Elbabaa, Samer K; Perezpeña-Diazconti, Mario; Chico Ponce de León, Fernando; Robinson, Shenandoah; Zapotocky, Michal; Lassaletta, Alvaro; Huang, Annie; Hawkins, Cynthia E; Tabori, Uri; Bouffet, Eric; Bartels, Ute; Dirks, Peter B; Rutka, James T; Bader, Gary D; Reimand, Jüri; Goldenberg, Anna; Ramaswamy, Vijay; Taylor, Michael D

    2017-06-12

    While molecular subgrouping has revolutionized medulloblastoma classification, the extent of heterogeneity within subgroups is unknown. Similarity network fusion (SNF) applied to genome-wide DNA methylation and gene expression data across 763 primary samples identifies very homogeneous clusters of patients, supporting the presence of medulloblastoma subtypes. After integration of somatic copy-number alterations, and clinical features specific to each cluster, we identify 12 different subtypes of medulloblastoma. Integrative analysis using SNF further delineates group 3 from group 4 medulloblastoma, which is not as readily apparent through analyses of individual data types. Two clear subtypes of infants with Sonic Hedgehog medulloblastoma with disparate outcomes and biology are identified. Medulloblastoma subtypes identified through integrative clustering have important implications for stratification of future clinical trials. Copyright © 2017 Elsevier Inc. All rights reserved.
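
    The clustering step rests on similarity network fusion (SNF). The following is a deliberately simplified, illustrative sketch of SNF-style cross-diffusion between two patient similarity networks, here built with a plain RBF kernel and without the sparse kNN kernels of the full method; the parameters and toy data are assumptions, not the study's pipeline.

```python
import numpy as np

def affinity(X, sigma=1.0):
    """RBF affinity between samples (rows of X)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def normalize(W):
    return W / W.sum(axis=1, keepdims=True)

def fuse(W_meth, W_expr, n_iter=20):
    """Very simplified cross-diffusion of two patient similarity networks."""
    P1, P2 = normalize(W_meth), normalize(W_expr)
    S1, S2 = P1.copy(), P2.copy()          # full kernels stand in for sparse kNN kernels
    for _ in range(n_iter):
        # Each network's similarities are diffused through the other network.
        P1_new = S1 @ P2 @ S1.T
        P2_new = S2 @ P1 @ S2.T
        P1, P2 = normalize(P1_new), normalize(P2_new)
    return (P1 + P2) / 2                   # fused similarity, later fed to clustering

# Example: 5 patients with toy methylation and expression profiles.
rng = np.random.default_rng(0)
fused = fuse(affinity(rng.normal(size=(5, 10))), affinity(rng.normal(size=(5, 8))))
print(fused.round(3))
```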

  4. Semantic Metadata for Heterogeneous Spatial Planning Documents

    Science.gov (United States)

    Iwaniak, A.; Kaczmarek, I.; Łukowicz, J.; Strzelecki, M.; Coetzee, S.; Paluszyński, W.

    2016-09-01

    Spatial planning documents contain information about the principles and rights of land use in different zones of a local authority. They are the basis for administrative decision making in support of sustainable development. In Poland these documents are published on the Web according to a prescribed non-extendable XML schema, designed for optimum presentation to humans in HTML web pages. There is no document standard, and limited functionality exists for adding references to external resources. The text in these documents is discoverable and searchable by general-purpose web search engines, but the semantics of the content cannot be discovered or queried. The spatial information in these documents is geographically referenced but not machine-readable. Major manual efforts are required to integrate such heterogeneous spatial planning documents from various local authorities for analysis, scenario planning and decision support. This article presents results of an implementation using machine-readable semantic metadata to identify relationships among regulations in the text, spatial objects in the drawings and links to external resources. A spatial planning ontology was used to annotate different sections of spatial planning documents with semantic metadata in the Resource Description Framework in Attributes (RDFa). The semantic interpretation of the content, links between document elements and links to external resources were embedded in XHTML pages. An example and use case from the spatial planning domain in Poland is presented to evaluate its efficiency and applicability. The solution enables the automated integration of spatial planning documents from multiple local authorities to assist decision makers with understanding and interpreting spatial planning information. The approach is equally applicable to legal documents from other countries and domains, such as cultural heritage and environmental management.

  5. Integrating an Information Literacy Quiz into the Learning Management System

    Science.gov (United States)

    Lowe, M. Sara; Booth, Char; Tagge, Natalie; Stone, Sean

    2014-01-01

    The Claremont Colleges Library Instruction Services Department developed a quiz that could be integrated into the consortial learning management software to accompany a local online, open-source information literacy tutorial. The quiz is integrated into individual course pages, allowing students to receive a grade for completion and improving…

  6. Strengthening Rehabilitation in Health Systems Worldwide by Integrating Information on Functioning in National Health Information Systems.

    Science.gov (United States)

    Stucki, Gerold; Bickenbach, Jerome; Melvin, John

    2017-09-01

    A complete understanding of the experience of health requires information relevant not merely to the health indicators of mortality and morbidity but also to functioning-that is, information about what it means to live in a health state, "the lived experience of health." Not only is functioning information relevant to healthcare and the overall objectives of person-centered healthcare but to the successful operation of all components of health systems.In light of population aging and major epidemiological trends, the health strategy of rehabilitation, whose aim has always been to optimize functioning and minimize disability, will become a key health strategy. The increasing prominence of the rehabilitative strategy within the health system drives the argument for the integration of functioning information as an essential component in national health information systems.Rehabilitation professionals and researchers have long recognized in WHO's International Classification of Functioning, Disability and Health the best prospect for an internationally recognized, sufficiently complete and powerful information reference for the documentation of functioning information. This paper opens the discussion of the promise of integrating the ICF as an essential component in national health systems to secure access to functioning information for rehabilitation, across health systems and countries.

  7. MONIL Language, an Alternative for Data Integration

    OpenAIRE

    Larre, Mónica; Torres-Jiménez, José; Morales, Eduardo; Frausto-Solís, Juan; Torres, Sócrates

    2006-01-01

    Data integration is a process of retrieving, merging and storing of data originated in heterogeneous sources of data. The main problem facing the data integration is the structural and semantic heterogeneity of participating data. A concern of research communities in computer sciences is the development of semi-automatic tools to assist the user in an effective way in the data integration processes. This paper introduces a programming language called MONIL, as an alternative to integrate data...

  8. E-health and healthcare enterprise information system leveraging service-oriented architecture.

    Science.gov (United States)

    Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Cheng, Po-Hsun; Lai, Feipei

    2012-04-01

    To present the successful experiences of an integrated, collaborative, distributed, large-scale enterprise healthcare information system over a wired and wireless infrastructure in National Taiwan University Hospital (NTUH). In order to smoothly and sequentially transfer from the complex relations among the old (legacy) systems to the new-generation enterprise healthcare information system, we adopted the multitier framework based on service-oriented architecture to integrate the heterogeneous systems as well as to interoperate among many other components and multiple databases. We also present mechanisms of a logical layer reusability approach and data (message) exchange flow via Health Level 7 (HL7) middleware, DICOM standard, and the Integrating the Healthcare Enterprise workflow. The architecture and protocols of the NTUH enterprise healthcare information system, especially in the Inpatient Information System (IIS), are discussed in detail. The NTUH Inpatient Healthcare Information System is designed and deployed on service-oriented architecture middleware frameworks. The mechanisms of integration as well as interoperability among the components and the multiple databases apply the HL7 standards for data exchanges, which are embedded in XML formats, and Microsoft .NET Web services to integrate heterogeneous platforms. The preliminary performance of the current operation IIS is evaluated and analyzed to verify the efficiency and effectiveness of the designed architecture; it shows reliability and robustness in the highly demanding traffic environment of NTUH. The newly developed NTUH IIS provides an open and flexible environment not only to share medical information easily among other branch hospitals, but also to reduce the cost of maintenance. The HL7 message standard is widely adopted to cover all data exchanges in the system. All services are independent modules that enable the system to be deployed and configured to the highest degree of flexibility

  9. Broadband microwave photonic fully tunable filter using a single heterogeneously integrated III-V/SOI-microdisk-based phase shifter.

    Science.gov (United States)

    Lloret, Juan; Morthier, Geert; Ramos, Francisco; Sales, Salvador; Van Thourhout, Dries; Spuesens, Thijs; Olivier, Nicolas; Fédéli, Jean-Marc; Capmany, José

    2012-05-07

    A broadband microwave photonic phase shifter based on a single III-V microdisk resonator heterogeneously integrated on and coupled to a nanophotonic silicon-on-insulator waveguide is reported. The phase shift tunability is accomplished by modifying the effective index through carrier injection. A comprehensive semi-analytical model aiming at predicting its behavior is formulated and confirmed by measurements. Quasi-linear and continuously tunable 2π phase shifts at radiofrequencies greater than 18 GHz are experimentally demonstrated. The phase shifter performance is also evaluated when used as a key element in tunable filtering schemes. Distortion-free and wideband filtering responses with a tuning range of ~100% over the free spectral range are obtained.

  10. Heterogeneous information sharing of sensor information in contested environments

    Science.gov (United States)

    Wampler, Jason A.; Hsieh, Chien; Toth, Andrew; Sheatsley, Ryan

    2017-05-01

    The inherent nature of unattended sensors makes these devices most vulnerable to detection, exploitation, and denial in contested environments. Physical access is often cited as the easiest way to compromise any device or network. A new mechanism for mitigating these types of attacks developed under the Assistant Secretary of Defense for Research and Engineering, ASD(R&E), project "Smoke Screen in Cyberspace" was demonstrated in a live, over-the-air experiment. Smoke Screen encrypts, slices up, and disburses redundant fragments of files throughout the network. Recovery is only possible after recovering all fragments, and attacking/denying one or more nodes does not limit the availability of other fragment copies in the network. This experiment proved the feasibility of redundant file fragmentation, and is the foundation for developing sophisticated methods to blacklist compromised nodes, move data fragments away from risks of compromise, and forward stored data fragments closer to the anticipated retrieval point. This paper outlines initial results in scalability of node members, fragment size, file size, and performance in a heterogeneous network consisting of the Wireless Network after Next (WNaN) radio and Common Sensor Radio (CSR).
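
    A minimal sketch of the fragment-and-disperse idea is given below: a file is sliced into fragments, each fragment is replicated, and the copies are scattered across nodes so that losing a node does not prevent reassembly. Encryption, which Smoke Screen applies before fragmentation, is deliberately omitted here, and the node names, fragment counts and data are illustrative assumptions.

```python
import hashlib
import random

def fragment(data: bytes, n_fragments: int, copies: int):
    """Split data into n_fragments slices and make `copies` redundant copies of each.

    Encryption of `data` (e.g., with AES) would happen before slicing in a real
    system; it is omitted to keep the sketch dependency-free.
    """
    step = -(-len(data) // n_fragments)            # ceiling division
    slices = [data[i:i + step] for i in range(0, len(data), step)]
    fragments = []
    for idx, chunk in enumerate(slices):
        digest = hashlib.sha256(chunk).hexdigest()  # integrity check used on reassembly
        for copy in range(copies):
            fragments.append({"index": idx, "copy": copy, "sha256": digest, "payload": chunk})
    return fragments

def disperse(fragments, nodes):
    """Randomly scatter fragment copies across network nodes (illustrative placement)."""
    placement = {node: [] for node in nodes}
    for frag in fragments:
        placement[random.choice(nodes)].append(frag)
    return placement

def reassemble(placement, n_fragments):
    """Recover the file if at least one copy of every fragment index survives."""
    found = {}
    for frags in placement.values():
        for frag in frags:
            found.setdefault(frag["index"], frag["payload"])
    if len(found) < n_fragments:
        raise ValueError("missing fragments - file cannot be recovered")
    return b"".join(found[i] for i in range(n_fragments))

data = b"sensor log: 2017-05-01T12:00Z temp=21.4C"      # 40 bytes, splits evenly into 4
placement = disperse(fragment(data, n_fragments=4, copies=3), ["node-a", "node-b", "node-c"])
assert reassemble(placement, 4) == data
```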

  11. An integration strategy for large enterprises

    Directory of Open Access Journals (Sweden)

    Risimić Dejan

    2007-01-01

    Full Text Available Integration is the process of enabling communication between disparate software components. Integration has been the burning issue for large enterprises in the last twenty years, due to the fact that 70% of the development and deployment budget is spent on integrating complex and heterogeneous back-end and front-end IT systems. The need to integrate existing applications is to support newer, faster, more accurate business processes and to provide meaningful, consistent management information. Historically, integration started with the introduction of point-to-point approaches evolving into simpler hub-and-spoke topologies. These topologies were combined with custom remote procedure calls, distributed object technologies and message-oriented middleware (MOM), continued with enterprise application integration (EAI), and used an application server as a primary vehicle for integration. The current phase of the evolution is service-oriented architecture (SOA) combined with an enterprise service bus (ESB). Technical aspects of the comparison between the aforementioned technologies are analyzed and presented. The result of the study is the recommended integration strategy for large enterprises.

  12. Strategic Voting in Heterogeneous Electorates: An Experimental Study

    Directory of Open Access Journals (Sweden)

    Marcelo Tyszler

    2013-11-01

    Full Text Available We study strategic voting in a setting where voters choose from three options and Condorcet cycles may occur. We introduce heterogeneity in preference intensity into the electorate by allowing voters to differ in the extent to which they value the three options. Three information conditions are tested: uninformed, in which voters know only their own preference ordering and their own benefits from each option; aggregate information, in which in addition they know the aggregate realized distribution of the preference orderings; and full information, in which they also know how the relative importance attributed to the options is distributed within the electorate. As a general result, heterogeneity seems to decrease the level of strategic voting in our experiment compared to the homogeneous preference case that we study in a companion paper. Both theoretically and empirically (with data collected in a laboratory experiment), the main comparative static results obtained for the homogeneous case carry over to the present setting with preference heterogeneity. Moreover, information about the realized aggregate distribution of preferences seems to be the element that best explains observed differences in voting behavior. Additional information about the realized distribution of preference intensity does not yield significant further changes.

  13. A spring-mass-damper system dynamics-based driver-vehicle integrated model for representing heterogeneous traffic

    Science.gov (United States)

    Munigety, Caleb Ronald

    2018-04-01

    Traditional microscopic traffic simulation models consider the driver and vehicle as a single unit to represent the movements of drivers in a traffic stream. As a result, traditional car-following models include driver-behavior-related parameters but ignore vehicle-related aspects. This approach is appropriate for homogeneous traffic conditions where the car is the major vehicle type. However, in heterogeneous traffic conditions where multiple vehicle types are present, it becomes important to incorporate vehicle-related parameters explicitly to account for the varying dynamic and static characteristics. Thus, this paper presents a driver-vehicle integrated model based on the principles of a physics-based spring-mass-damper mechanical system. While the spring constant represents the driver's aggressiveness, the damping constant and the mass component take care of the stability and the size/weight-related aspects, respectively. The proposed model, when tested, behaved plausibly in representing the vehicle-type-dependent longitudinal movements of vehicles.
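
    A worked toy example of such a driver-vehicle unit is sketched below: the follower's acceleration is produced by a "spring" acting on the gap error and a "damper" acting on the speed difference, scaled by the vehicle mass. The constants, initial conditions and time stepping are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def follow(dt=0.1, t_end=60.0, k=250.0, c=1200.0, m=1200.0, s0=10.0):
    """Follower acceleration from a spring (gap error) and a damper (speed difference).

    k  -- 'spring' constant standing in for driver aggressiveness (N/m, illustrative)
    c  -- damping constant standing in for stability of the response (N s/m, illustrative)
    m  -- vehicle mass, so heavier vehicle types respond more sluggishly (kg)
    s0 -- desired standstill spacing (m)
    """
    n = int(t_end / dt)
    x_lead, v_lead = 25.0, 15.0            # leader starts 25 m ahead at constant 15 m/s
    x, v = 0.0, 20.0                       # follower starts faster and must adapt
    gaps = np.empty(n)
    for i in range(n):
        gap = x_lead - x
        a = (k * (gap - s0) + c * (v_lead - v)) / m   # m*a = spring force + damper force
        v += a * dt                         # explicit Euler integration of the dynamics
        x += v * dt
        x_lead += v_lead * dt
        gaps[i] = gap
    return gaps

print(f"final gap: {follow()[-1]:.1f} m")   # settles near the desired spacing s0
```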

  14. Information Integration in Risky Choice: Identification and Stability

    OpenAIRE

    Stewart, Neil

    2011-01-01

    How is information integrated across the\\ud attributes of an option when making risky\\ud choices? In most descriptive models of\\ud decision under risk, information about\\ud risk, and reward is combined multiplicatively\\ud (e.g., expected value; expected utility\\ud theory, Bernouli, 1738/1954; subjective\\ud expected utility theory, Savage, 1954;\\ud Edwards, 1955; prospect theory, Kahneman\\ud and Tversky, 1979; rank-dependent utility,\\ud Quiggin, 1993; decision field theory,\\ud Busemeyer and To...

  15. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  16. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Science.gov (United States)

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
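
    The sketch below illustrates the two ingredients named in the abstract: a plug-in estimate of transfer entropy between two discretized time series, and a weighted-sum aggregation of per-variable values. The binning scheme, the toy series and the weights are assumptions for the example; the authors' estimator and significance testing are not reproduced.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Plug-in transfer entropy TE(X -> Y) with one-step histories on discretized series."""
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))      # joint counts of (y_t+1, y_t, x_t)
    n = sum(triples.values())
    p3 = {k: v / n for k, v in triples.items()}
    p_yy, p_yx, p_y = Counter(), Counter(), Counter()
    for (y1, y0, x0), p in p3.items():
        p_yy[(y1, y0)] += p
        p_yx[(y0, x0)] += p
        p_y[y0] += p
    # TE = sum p(y1,y0,x0) * log[ p(y1|y0,x0) / p(y1|y0) ]
    return sum(p * np.log2(p * p_y[y0] / (p_yy[(y1, y0)] * p_yx[(y0, x0)]))
               for (y1, y0, x0), p in p3.items())

def aggregate(te_by_variable, weights):
    """Weighted sum of per-variable transfer entropies for one country pair."""
    return sum(weights[v] * te for v, te in te_by_variable.items())

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = np.roll(x, 1) + 0.5 * rng.normal(size=500)             # y follows x with a one-step lag
print("TE(x->y) =", transfer_entropy(x, y), "TE(y->x) =", transfer_entropy(y, x))
per_variable_te = {"gdp": transfer_entropy(x, y), "stock_index": transfer_entropy(y, x)}
print(aggregate(per_variable_te, {"gdp": 0.6, "stock_index": 0.4}))
```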

  17. Dynamic Systems for Individual Tracking via Heterogeneous Information Integration and Crowd Source Distributed Simulation

    Science.gov (United States)

    2015-12-04

    Hierarchical Alternating Least Squares (HALS) (Cichocki et al. 2009), (iii) Block Principal Pivoting (BPP) (Kim and Park 2011), and (iv) Stochastic... implementation uses a fast active-set based method called Block Principal Pivoting (BPP) (Kim and Park 2007), but the parallel algorithm proposed in this... communication. However, for low-rank approximation algorithms that depend on global information (like MU, HALS, and BPP), some communication is necessary. The
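
    For reference, a minimal serial version of one of the low-rank factorization algorithms named above, Hierarchical Alternating Least Squares as commonly used for nonnegative matrix factorization, is sketched below. It is a textbook-style illustration on assumed random data, not the report's parallel or distributed implementation.

```python
import numpy as np

def nmf_hals(A, rank, n_iter=200, eps=1e-10):
    """Hierarchical Alternating Least Squares for A ~= W @ H with W, H >= 0."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Update columns of W one at a time, holding H fixed.
        HHt, AHt = H @ H.T, A @ H.T
        for k in range(rank):
            W[:, k] = np.maximum(eps, W[:, k] + (AHt[:, k] - W @ HHt[:, k]) / HHt[k, k])
        # Update rows of H one at a time, holding W fixed.
        WtW, WtA = W.T @ W, W.T @ A
        for k in range(rank):
            H[k, :] = np.maximum(eps, H[k, :] + (WtA[k, :] - WtW[k, :] @ H) / WtW[k, k])
    return W, H

A = np.abs(np.random.default_rng(1).normal(size=(30, 20)))
W, H = nmf_hals(A, rank=5)
print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))   # relative reconstruction error
```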

  18. A State-of-the-Art Review on the Integration of Building Information Modeling (BIM and Geographic Information System (GIS

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-02-01

    Full Text Available The integration of Building Information Modeling (BIM) and Geographic Information System (GIS) has been identified as a promising but challenging topic to transform information towards the generation of knowledge and intelligence. Achievement of integrating these two concepts and enabling technologies will have a significant impact on solving problems in the civil, building and infrastructure sectors. However, since GIS and BIM were originally developed for different purposes, numerous challenges are being encountered for the integration. To better understand these two different domains, this paper reviews the development and dissimilarities of GIS and BIM, the existing integration methods, and investigates their potential in various applications. This study shows that the integration methods are developed for various reasons and aim to solve different problems. The parameters influencing the choice can be summarized and named as “EEEF” criteria: effectiveness, extensibility, effort, and flexibility. Compared with other methods, semantic web technologies provide a promising and generalized integration solution. However, the biggest challenges of this method are the large efforts required at an early stage and the isolated development of ontologies within one particular domain. The isolation problem also applies to other methods. Therefore, openness is the key to the success of BIM and GIS integration.

  19. Creativity, Complexity, and Precision: Information Visualization for (Landscape) Architecture

    DEFF Research Database (Denmark)

    Buscher, Monika; Christensen, Michael; Mogensen, Preben Holst

    2000-01-01

    Drawing on ethnographic studies of (landscape) architects at work, this paper presents a human-centered approach to information visualization. A 3D collaborative electronic workspace allows people to configure, save and browse arrangements of heterogeneous work materials. Spatial arrangements and links are created and maintained as an integral part of ongoing work with `live' documents and objects. The result is an extension of the physical information space of the architects' studio that utilizes the potential of electronic data storage, visualization and network technologies to support work with information in context.

  20. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  1. Loss Performance Modeling for Hierarchical Heterogeneous Wireless Networks With Speed-Sensitive Call Admission Control

    DEFF Research Database (Denmark)

    Huang, Qian; Huang, Yue-Cai; Ko, King-Tim

    2011-01-01

    A hierarchical overlay structure is an alternative solution that integrates existing and future heterogeneous wireless networks to provide subscribers with better mobile broadband services. Traffic loss performance in such integrated heterogeneous networks is necessary for an operator's network... This approach avoids unnecessary and frequent handoff between cells and reduces signaling overheads. An approximation model with guaranteed accuracy and low computational complexity is presented for the loss performance of multiservice traffic. The accuracy of numerical results is validated by comparing...

  2. Integrated environmental monitoring and information system

    International Nuclear Information System (INIS)

    Klinda, J.; Lieskovska, Z.

    1998-01-01

    The concept of environmental monitoring within the territory of the Slovak Republic and the concept of the integrated environmental information system of the Slovak Republic were accepted and confirmed by Government Order No. 449/1992. The state monitoring system covering the whole territory of Slovakia is the most important and consists of 13 Partial Monitoring Systems (PMSs); a list of the PMSs is included. The listed PMSs are managed according to the concept of the Sectoral Information System (SIS) of the Ministry of the Environment of the Slovak Republic (MESR), which was established by National Council Act No. 261/1995 Coll. on the SIS. The SIS consists of 18 subsystems, which are listed. Overviews of the PMS budgets, as well as of the environmental publications and periodicals of the MESR, are also included.

  3. Information integration for a sky survey by data warehousing

    Science.gov (United States)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Now relational database systems such as Oracle support the warehouse capability, including extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing aims to effectively provide data and knowledge on-line.

  4. Real-time simulation of contact and cutting of heterogeneous soft-tissues.

    Science.gov (United States)

    Courtecuisse, Hadrien; Allard, Jérémie; Kerfriden, Pierre; Bordas, Stéphane P A; Cotin, Stéphane; Duriez, Christian

    2014-02-01

    This paper presents a numerical method for interactive (real-time) simulations, which considerably improves the accuracy of the response of heterogeneous soft-tissue models undergoing contact, cutting and other topological changes. We provide an integrated methodology able to deal with the ill-conditioning issues associated with material heterogeneities, with contact boundary conditions, which are one of the main sources of inaccuracies, and with cutting, which is one of the most challenging issues in interactive simulations. Our approach is based on an implicit time integration of a non-linear finite element model. To enable real-time computations, we propose a new preconditioning technique based on an asynchronous update at low frequency. The preconditioner is not only used to improve the computation of the deformation of the tissues, but also to simulate the contact response of homogeneous and heterogeneous bodies with the same accuracy. We also address the problem of cutting the heterogeneous structures and propose a method to update the preconditioner according to the topological modifications. Finally, we apply our approach to three challenging demonstrators: (i) a simulation of cataract surgery; (ii) a simulation of laparoscopic hepatectomy; and (iii) a brain tumor surgery. Copyright © 2013 Elsevier B.V. All rights reserved.
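
    The low-frequency preconditioner update can be illustrated schematically: an expensive factorization of the system matrix is refreshed only every few time steps and reused, slightly stale, as a preconditioner for the iterative solves in between. The sketch below is a serial toy version with an assumed drifting matrix, not the authors' GPU implementation or their contact and cutting machinery.

```python
import numpy as np
from scipy.sparse import identity, random as sprandom
from scipy.sparse.linalg import cg, splu, LinearOperator

def make_system(n, t):
    """Toy SPD stiffness-like matrix whose diagonal drifts slowly with (pseudo) time t."""
    A = sprandom(n, n, density=0.01, random_state=0)
    return (A @ A.T + identity(n) * (10.0 + t)).tocsc()

n, refresh_every = 500, 10
lu = None
for step in range(50):
    A = make_system(n, 0.05 * step)
    b = np.ones(n)
    if step % refresh_every == 0:
        lu = splu(A)                                 # expensive factorization, done at low frequency
    M = LinearOperator((n, n), matvec=lu.solve)      # slightly stale factorization reused as preconditioner
    x, info = cg(A, b, M=M)                          # fast preconditioned iterative solve each step
    assert info == 0
print("all solves converged with the asynchronously refreshed preconditioner")
```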

  5. The integration of weighted gene association networks based on information entropy.

    Science.gov (United States)

    Yang, Fan; Wu, Duzhi; Lin, Limei; Yang, Jian; Yang, Tinghong; Zhao, Jing

    2017-01-01

    Constructing genome-scale weighted gene association networks (WGAN) from multiple data sources is one of the research hot spots in systems biology. In this paper, we employ information entropy to describe the degree of uncertainty of gene-gene links and propose a strategy for data integration of weighted networks. We use this method to integrate four existing human weighted gene association networks and construct a much larger WGAN, which includes richer biological information while still keeping high functional relevance between linked gene pairs. The new WGAN shows satisfactory performance in disease gene prediction, which suggests the reliability of our integration strategy. Compared with existing integration methods, our method takes advantage of the inherent characteristics of the component networks and pays less attention to the biological background of the data. It can make full use of existing biological networks with low computational effort.
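
    A small illustration of entropy-driven weighting is given below using the classic entropy-weight scheme: each component network receives a weight based on the normalized entropy of its edge-weight distribution, and the integrated link weight is the weighted combination. The gene pairs, weights and this particular weighting formula are assumptions for the example and may differ from the paper's exact formulation.

```python
import numpy as np

def entropy_weights(W):
    """Entropy-weight method: networks (columns) whose edge weights are more
    discriminative (lower normalized entropy) receive larger weights."""
    P = W / W.sum(axis=0, keepdims=True)                 # normalize each network's weights
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    H = -(P * logs).sum(axis=0) / np.log(W.shape[0])     # normalized entropy per network
    d = 1.0 - H
    return d / d.sum()

def integrate(networks):
    """Combine several weighted gene-association networks into one WGAN."""
    edges = sorted(set().union(*networks))
    W = np.array([[net.get(e, 0.0) for net in networks] for e in edges])  # edges x networks
    w = entropy_weights(W)
    return dict(zip(edges, W @ w))

# Toy component networks (gene pairs and weights are invented for the example).
net_a = {("TP53", "MDM2"): 0.9, ("BRCA1", "BARD1"): 0.8, ("EGFR", "GRB2"): 0.1}
net_b = {("TP53", "MDM2"): 0.7, ("BRCA1", "BARD1"): 0.6, ("KRAS", "RAF1"): 0.5}
print(integrate([net_a, net_b]))
```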

  6. Effects of integrated designs of alarm and process information on diagnosis performance in digital nuclear power plants.

    Science.gov (United States)

    Wu, Xiaojun; She, Manrong; Li, Zhizhong; Song, Fei; Sang, Wei

    2017-12-01

    In the main control rooms of nuclear power plants (NPPs), operators frequently switch between alarm displays and system-information displays to incorporate information from different screens. In this study, we investigated two integrated designs of alarm and process information - integrating alarm information into process displays (denoted as Alarm2Process integration) and integrating process information into alarm displays (denoted as Process2Alarm integration). To analyse the effects of the two integration approaches and time pressure on the diagnosis performance, a laboratory experiment was conducted with ninety-six students. The results show that compared with the non-integrated case, Process2Alarm integration yields better diagnosis performance in terms of diagnosis accuracy, time required to generate correct hypothesis and completion time. In contrast, the Alarm2Process integration leads to higher levels of workload, with no improvement in diagnosis performance. The diagnosis performance of Process2Alarm integration was consistently better than that of Alarm2Process integration, regardless of the levels of time pressure. Practitioner Summary: To facilitate operator's synthesis of NPP information when performing diagnosis tasks, we proposed to integrate process information into alarm displays. The laboratory validation shows that the integration approach significantly improves the diagnosis performance for both low and high time-pressure levels.

  7. Using remote sensing to inform integrated coastal zone management

    CSIR Research Space (South Africa)

    Roberts, W

    2010-06-01

    Full Text Available Presentation slides: "Using remote sensing to inform integrated coastal zone management", GISSA Western Cape Regional Meeting, Wesley Roberts & Melanie Luck-Vogel, 2 June 2010, CSIR NRE Ecosystems Earth Observation Group. The slides cover what integrated coastal zone management is; a quick theory of change vector analysis (CVA) on two image bands, with magnitude M = √((x_b − x_a)² + (y_b − y_a)²) and a direction that places each pixel in one of four quadrants (quadrant 1 (++) accretion, quadrant 2 (−+), quadrant 3 (−−) erosion, quadrant 4 (+−)); and CVA results and conclusions on change in an image time series...
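
    The change vector analysis referenced in the slides can be written down compactly: for each pixel, the magnitude of change between two dates is computed from the two band differences, and the signs of those differences assign the pixel to a quadrant (e.g., ++ accretion, −− erosion). The sketch below is an illustrative implementation on simulated rasters, not the presenters' processing chain.

```python
import numpy as np

def change_vector_analysis(band1_t1, band2_t1, band1_t2, band2_t2):
    """Per-pixel change vector analysis between two image dates.

    Magnitude M = sqrt((x_b - x_a)^2 + (y_b - y_a)^2); the sign pattern of the two
    band differences gives the change 'quadrant' (++ accretion, -- erosion, etc.).
    """
    dx = band1_t2 - band1_t1
    dy = band2_t2 - band2_t1
    magnitude = np.hypot(dx, dy)
    direction = np.degrees(np.arctan2(dy, dx)) % 360.0
    quadrant = np.where(dx >= 0, np.where(dy >= 0, 1, 4), np.where(dy >= 0, 2, 3))
    return magnitude, direction, quadrant

# Simulated 64x64 rasters for two bands at two dates (illustrative data only).
rng = np.random.default_rng(0)
b1_a, b2_a = rng.random((2, 64, 64))
b1_b, b2_b = b1_a + 0.05, b2_a - 0.1
M, theta, quad = change_vector_analysis(b1_a, b2_a, b1_b, b2_b)
print(M.mean(), np.bincount(quad.ravel())[1:])
```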

  8. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    Directory of Open Access Journals (Sweden)

    Seok-Hyoung Lee

    2012-06-01

    Full Text Available While science and technology information service portals and heterogeneous databases produced in Korea and other countries are being integrated, methods of connecting the unique classification systems applied to each database have been studied. Results of technologists' research, such as journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to achieve interoperability of the information, and thus it is not easy to implement meaningful science and technology information services through information convergence. This study aims to address the aforementioned issue by analyzing mapping systems between classification systems in order to design a structure to connect the variety of classification systems used in the academic information database of the Korea Institute of Science and Technology Information, which provides a science and technology information portal service. This study also aims to design a mapping system for the classification systems to be applied to actual science and technology information services and information management systems.

  9. Heterogeneous computing with OpenCL 2.0

    CERN Document Server

    Kaeli, David R; Schaa, Dana; Zhang, Dong Ping

    2015-01-01

    Heterogeneous Computing with OpenCL 2.0 teaches OpenCL and parallel programming for complex systems that may include a variety of device architectures: multi-core CPUs, GPUs, and fully-integrated Accelerated Processing Units (APUs). This fully-revised edition includes the latest enhancements in OpenCL 2.0, including: shared virtual memory, to increase programming flexibility and reduce data transfers that consume resources; dynamic parallelism, which reduces processor load and avoids bottlenecks; and improved imaging support and integration with OpenGL. It is designed to work on multiple platforms.

  10. Integrated Modeling Approach for the Development of Climate-Informed, Actionable Information

    Directory of Open Access Journals (Sweden)

    David R. Judi

    2018-06-01

    Full Text Available Flooding is a prevalent natural disaster with both short and long-term social, economic, and infrastructure impacts. Changes in intensity and frequency of precipitation (including rain, snow, and rain-on-snow events create challenges for the planning and management of resilient infrastructure and communities. While there is general acknowledgment that new infrastructure design should account for future climate change, no clear methods or actionable information are available to community planners and designers to ensure resilient designs considering an uncertain climate future. This research demonstrates an approach for an integrated, multi-model, and multi-scale simulation to evaluate future flood impacts. This research used regional climate projections to drive high-resolution hydrology and flood models to evaluate social, economic, and infrastructure resilience for the Snohomish Watershed, WA, USA. Using the proposed integrated modeling approach, the peaks of precipitation and streamflows were found to shift from spring and summer to the earlier winter season. Moreover, clear non-stationarities in future flood risk were discovered under various climate scenarios. This research provides a clear approach for the incorporation of climate science in flood resilience analysis and to also provides actionable information relative to the frequency and intensity of future precipitation events.

  11. Evaluating Information System Integration approaches for fixed asset management framework in Tanzania

    Directory of Open Access Journals (Sweden)

    Theophil Assey

    2017-10-01

    Full Text Available Information systems are developed based on different requirements and different technologies. Integration of these systems is of vital importance, as they cannot work in isolation; they need to share and exchange data with other information systems. The information systems handle data of different types and formats, and finding a way to make them communicate is important, as they need to exchange data during transactions, communication and other interactions. In Tanzanian Local Government Authorities (LGAs), fixed asset data are not centralized; each Local Government Authority stores its own data in isolation, yet accountability requires the provision of centralized storage for easy data access and easier data integration with other information systems in order to enhance fixed asset accountability. The study was carried out by reviewing the literature on existing information system integration approaches in order to identify and propose the best approach to be used in fixed asset management systems in LGAs in Tanzania. The different approaches used for systems integration, such as Service Oriented Architecture (SOA), Common Object Request Broker Architecture (CORBA), Component Object Model (COM) and eXtensible Markup Language (XML), were evaluated against the factors considered at the LGA. XML was preferred over SOA, CORBA and COM because of challenges in governance, data security, availability of expertise for support, maintenance, implementation cost, performance, compliance with changing government policies and service reliability. The proposed approach integrates data for all the Local Government Authorities at a centralized location, and middleware transforms the centralized data into XML so it can easily be used by other information systems.

  12. Biomedical data integration in computational drug design and bioinformatics.

    Science.gov (United States)

    Seoane, Jose A; Aguiar-Pulido, Vanessa; Munteanu, Cristian R; Rivero, Daniel; Rabunal, Juan R; Dorado, Julian; Pazos, Alejandro

    2013-03-01

    In recent years, in the post genomic era, more and more data is being generated by biological high throughput technologies, such as proteomics and transcriptomics. This omics data can be very useful, but the real challenge is to analyze all this data, as a whole, after integrating it. Biomedical data integration enables making queries to different, heterogeneous and distributed biomedical data sources. Data integration solutions can be very useful not only in the context of drug design, but also in biomedical information retrieval, clinical diagnosis, system biology, etc. In this review, we analyze the most common approaches to biomedical data integration, such as federated databases, data warehousing, multi-agent systems and semantic technology, as well as the solutions developed using these approaches in the past few years.

  13. Information and image integration: project spectrum

    Science.gov (United States)

    Blaine, G. James; Jost, R. Gilbert; Martin, Lori; Weiss, David A.; Lehmann, Ron; Fritz, Kevin

    1998-07-01

    The BJC Health System (BJC) and the Washington University School of Medicine (WUSM) formed a technology alliance with industry collaborators to develop and implement an integrated, advanced clinical information system. The industry collaborators include IBM, Kodak, SBC and Motorola. The activity, called Project Spectrum, provides an integrated clinical repository for the multiple hospital facilities of the BJC. The BJC System consists of 12 acute care hospitals serving over one million patients in Missouri and Illinois. An interface engine manages transactions from each of the hospital information systems, lab systems and radiology information systems. Data is normalized to provide a consistent view for the primary care physician. Access to the clinical repository is supported by web-based server/browser technology which delivers patient data to the physician's desktop. An HL7 based messaging system coordinates the acquisition and management of radiological image data and sends image keys to the clinical data repository. Access to the clinical chart browser currently provides radiology reports, laboratory data, vital signs and transcribed medical reports. A chart metaphor provides tabs for the selection of the clinical record for review. Activation of the radiology tab facilitates a standardized view of radiology reports and provides an icon used to initiate retrieval of available radiology images. The selection of the image icon spawns an image browser plug-in and utilizes the image key from the clinical repository to access the image server for the requested image data. The Spectrum system is collecting clinical data from five hospital systems and imaging data from two hospitals. Domain specific radiology imaging systems support the acquisition and primary interpretation of radiology exams. The spectrum clinical workstations are deployed to over 200 sites utilizing local area networks and ISDN connectivity.

  14. Integrating Records Management (RM) and Information Technology (IT)

    Energy Technology Data Exchange (ETDEWEB)

    NUSBAUM,ANNA W.; CUSIMANO,LINDA J.

    2000-03-02

    Records Managers are continually exploring ways to integrate their services with those offered by Information Technology-related professions to capitalize on the advantages of providing customers a total solution to managing their records and information. In this day and age, where technology abounds, there often exists a fear on the part of records management that this integration will result in a loss of identity and of the focus of one's own mission - a fear that records management may become subordinated to the fast-paced technology fields. They need to remember there is strength in numbers and that it benefits RM, IT, and the customer when they can bring together the unique offerings each possesses to reach synergy for the benefit of all the corporations. Records Managers need to continually strive to move "outside the records management box", network, expand their knowledge, and influence the IT disciplines to incorporate the concept of "management" into their customer solutions.

  15. Knowledge-Intensive Gathering and Integration of Statistical Information on European Fisheries

    NARCIS (Netherlands)

    Klinkert, M.; Treur, J.; Verwaart, T.; Loganantharaj, R.; Palm, G.; Ali, M.

    2000-01-01

    Gathering, maintenance, integration and presentation of statistics are major activities of the Dutch Agricultural Economics Research Institute LEI. In this paper we explore how knowledge and agent technology can be exploited to support the information gathering and integration process. In

  16. Integration of Multisensor Hybrid Reasoners to Support Personal Autonomy in the Smart Home

    Directory of Open Access Journals (Sweden)

    Miguel Ángel Valero

    2014-09-01

    Full Text Available The deployment of the Ambient Intelligence (AmI) paradigm requires designing and integrating user-centered smart environments to assist people in their daily life activities. This research paper details an integration and validation of multiple heterogeneous sensors with hybrid reasoners that support decision making in order to monitor personal and environmental data at a smart home in a private way. The results innovate on knowledge-based platforms, distributed sensors, connected objects, accessibility and authentication methods to promote independent living for elderly people. TALISMAN+, the AmI framework deployed, integrates four subsystems in the smart home: (i) a mobile biomedical telemonitoring platform to provide elderly patients with continuous disease management; (ii) an integration middleware that allows context capture from heterogeneous sensors to program environment's reaction; (iii) a vision system for intelligent monitoring of daily activities in the home; and (iv) an ontologies-based integrated reasoning platform to trigger local actions and manage private information in the smart home. The framework was integrated in two real running environments, the UPM Accessible Digital Home and MetalTIC house, and successfully validated by five experts in home care, elderly people and personal autonomy.

  17. Integration of multisensor hybrid reasoners to support personal autonomy in the smart home.

    Science.gov (United States)

    Valero, Miguel Ángel; Bravo, José; Chamizo, Juan Manuel García; López-de-Ipiña, Diego

    2014-09-17

    The deployment of the Ambient Intelligence (AmI) paradigm requires designing and integrating user-centered smart environments to assist people in their daily life activities. This research paper details an integration and validation of multiple heterogeneous sensors with hybrid reasoners that support decision making in order to monitor personal and environmental data at a smart home in a private way. The results innovate on knowledge-based platforms, distributed sensors, connected objects, accessibility and authentication methods to promote independent living for elderly people. TALISMAN+, the AmI framework deployed, integrates four subsystems in the smart home: (i) a mobile biomedical telemonitoring platform to provide elderly patients with continuous disease management; (ii) an integration middleware that allows context capture from heterogeneous sensors to program environment's reaction; (iii) a vision system for intelligent monitoring of daily activities in the home; and (iv) an ontologies-based integrated reasoning platform to trigger local actions and manage private information in the smart home. The framework was integrated in two real running environments, the UPM Accessible Digital Home and MetalTIC house, and successfully validated by five experts in home care, elderly people and personal autonomy.

  18. Integration of Multisensor Hybrid Reasoners to Support Personal Autonomy in the Smart Home

    Science.gov (United States)

    Valero, Miguel Ángel; Bravo, José; Chamizo, Juan Manuel García; López-de-Ipiña, Diego

    2014-01-01

    The deployment of the Ambient Intelligence (AmI) paradigm requires designing and integrating user-centered smart environments to assist people in their daily life activities. This research paper details an integration and validation of multiple heterogeneous sensors with hybrid reasoners that support decision making in order to monitor personal and environmental data at a smart home in a private way. The results innovate on knowledge-based platforms, distributed sensors, connected objects, accessibility and authentication methods to promote independent living for elderly people. TALISMAN+, the AmI framework deployed, integrates four subsystems in the smart home: (i) a mobile biomedical telemonitoring platform to provide elderly patients with continuous disease management; (ii) an integration middleware that allows context capture from heterogeneous sensors to program environment's reaction; (iii) a vision system for intelligent monitoring of daily activities in the home; and (iv) an ontologies-based integrated reasoning platform to trigger local actions and manage private information in the smart home. The framework was integrated in two real running environments, the UPM Accessible Digital Home and MetalTIC house, and successfully validated by five experts in home care, elderly people and personal autonomy. PMID:25232910

  19. Epidemic Spreading with Heterogeneous Awareness on Human Networks

    Directory of Open Access Journals (Sweden)

    Yanling Lu

    2017-01-01

    Full Text Available The spontaneous awareness behavioral responses of individuals have a significant impact on epidemic spreading. In this paper, a modified Susceptible-Alert-Infected-Susceptible (SAIS) epidemic model with heterogeneous awareness is presented to study epidemic spreading in human networks and the impact of heterogeneous awareness on epidemic dynamics. In this model, when susceptible individuals receive awareness information about the presence of the epidemic from their infected neighbor nodes, they become alert individuals at a heterogeneous awareness rate. Theoretical analysis and numerical simulations show that heterogeneous awareness can raise the epidemic threshold under certain conditions and reduce the scale of virus outbreaks compared with no awareness. Moreover, for the same awareness parameter, heterogeneous awareness effectively slows the spreading but does not delay the arrival time of the epidemic spreading peak compared with homogeneous awareness.

  20. 45 CFR 61.14 - Confidentiality of Healthcare Integrity and Protection Data Bank information.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Confidentiality of Healthcare Integrity and Protection Data Bank information. 61.14 Section 61.14 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION HEALTHCARE INTEGRITY AND PROTECTION DATA BANK FOR FINAL ADVERSE INFORMATION ON...

  1. Three-tiered integration of PACS and HIS toward next generation total hospital information system.

    Science.gov (United States)

    Kim, J H; Lee, D H; Choi, J W; Cho, H I; Kang, H S; Yeon, K M; Han, M C

    1998-01-01

    The Seoul National University Hospital (SNUH) started a project to innovate the hospital information facilities. This project includes installation of high speed hospital network, development of new HIS, OCS (order communication system), RIS and PACS. This project aims at the implementation of the first total hospital information system by seamlessly integrating these systems together. To achieve this goal, we took three-tiered systems integration approach: network level, database level, and workstation level integration. There are 3 loops of networks in SNUH: proprietary star network for host computer based HIS, Ethernet based hospital LAN for OCS and RIS, and ATM based network for PACS. They are linked together at the backbone level to allow high speed communication between these systems. We have developed special communication modules for each system that allow data interchange between different databases and computer platforms. We have also developed an integrated workstation in which both the OCS and PACS application programs run on a single computer in an integrated manner allowing the clinical users to access and display radiological images as well as textual clinical information within a single user environment. A study is in progress toward a total hospital information system in SNUH by seamlessly integrating the main hospital information resources such as HIS, OCS, and PACS. With the three-tiered systems integration approach, we could successfully integrate the systems from the network level to the user application level.

  2. HL7 and DICOM based integration of radiology departments with healthcare enterprise information systems.

    Science.gov (United States)

    Blazona, Bojan; Koncar, Miroslav

    2007-12-01

    Integration based on open standards, in order to achieve communication and information interoperability, is one of the key aspects of modern health care information systems. However, this requirement represents one of the major challenges for Information and Communication Technology (ICT) solutions, as systems today use diverse technologies, proprietary protocols and communication standards which are often not interoperable. One of the main producers of clinical information in healthcare settings is the Radiology Information System (RIS), which communicates using the widely adopted DICOM (Digital Imaging and COmmunications in Medicine) standard but in very few cases can efficiently integrate information of interest with other systems. In this context we identified the HL7 standard as the world's leading medical ICT standard, envisioned to provide the umbrella for medical data semantic interoperability, which amongst other things represents the cornerstone of Croatia's National Integrated Healthcare Information System (IHCIS). The aim was to explore the ability to integrate and exchange RIS-originated data with Hospital Information Systems based on HL7's CDA (Clinical Document Architecture) standard. We explored the ability of HL7 CDA specifications and methodology to address the need for RIS integration into HL7-based healthcare information systems. We introduced the use of WADO service interconnection to IHCIS and finally CDA rendering in widely used web browsers. The outcome of our pilot work supports our original assumption that the HL7 standard is able to bring radiology data into integrated healthcare systems. Uniform DICOM-to-CDA translation scripts and business processes within IHCIS are desirable and cost effective with regard to the use of supporting IHCIS services aligned to SOA.
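
    To make the DICOM-to-CDA mapping discussed above concrete, here is a hedged sketch that maps a few DICOM-derived attributes (represented as a plain dict rather than read with a DICOM toolkit) into a minimal CDA-like XML document. Element names are simplified and the WADO reference form is assumed, so this is an illustration rather than a schema-valid CDA instance.

        # Illustrative sketch only: build a minimal CDA-like XML document from a few
        # DICOM-derived attributes held in a plain dict. A real CDA document follows
        # the full HL7 CDA R2 schema and templates; element names here are simplified.
        import xml.etree.ElementTree as ET

        def radiology_report_to_cda(meta: dict, report_text: str) -> str:
            doc = ET.Element("ClinicalDocument")
            ET.SubElement(doc, "id", root=meta["study_uid"])
            patient = ET.SubElement(ET.SubElement(doc, "recordTarget"), "patientRole")
            ET.SubElement(patient, "id", extension=meta["patient_id"])
            body = ET.SubElement(ET.SubElement(doc, "component"), "structuredBody")
            section = ET.SubElement(body, "section")
            ET.SubElement(section, "title").text = meta["study_description"]
            ET.SubElement(section, "text").text = report_text
            # Reference back to the image data, e.g. via a WADO URL (assumed form).
            ET.SubElement(section, "observationMedia",
                          reference=f"wado?studyUID={meta['study_uid']}")
            return ET.tostring(doc, encoding="unicode")

        if __name__ == "__main__":
            demo = {"patient_id": "12345", "study_uid": "1.2.840.1.99",
                    "study_description": "Chest CT"}
            print(radiology_report_to_cda(demo, "No acute findings."))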

  3. The Effects of Inquiry-Based Integrated Information Literacy Instruction: Four-Year Trends

    Directory of Open Access Journals (Sweden)

    Lin Ching Chen

    2014-07-01

    Full Text Available The purpose of this study was to examine the effects of four-year integrated information literacy instruction via a framework of inquiry-based learning on elementary students' memory and comprehension. Moderating factors of students' academic achievement were another focus of this study. The subjects were 72 students who have participated in this study since they entered an elementary school in Chiayi district. This elementary school adopted the integrated information literacy instruction, designed by the researchers and elementary school teachers, and integrated it into various subject matters via a framework of inquiry-based learning, such as the Super 3 and Big6 models. A series of inquiry-based integrated information literacy instruction has been implemented since the second semester of the subjects' first grade. A total of seven inquiry learning projects have been implemented from grade one through grade four. Fourteen instruments were used as pretests and posttests to assess students' factual recall and conceptual understanding of subject contents in different projects. The results showed that inquiry-based integrated information literacy instruction could help students memorize facts and comprehend concepts of subject contents. Regardless of academic achievement, if students were willing to devote effort to inquiry processes, their memory and comprehension of subject contents improved effectively. However, students of low academic achievement might need more time to become familiar with the inquiry-based learning strategy.

  4. Emerging understanding of multiscale tumor heterogeneity

    Directory of Open Access Journals (Sweden)

    Michael J Gerdes

    2014-12-01

    Full Text Available Cancer is a multifaceted disease characterized by heterogeneous genetic alterations and cellular metabolism, at the organ, tissue, and cellular level. Key features of cancer heterogeneity are summarized by ten acquired capabilities, which govern malignant transformation and progression of invasive tumors. The relative contribution of these hallmark features to the disease process varies between cancers. At the DNA and cellular level, germ-line and somatic gene mutations are found across all cancer types, causing abnormal protein production, cell behavior, and growth. The tumor microenvironment and its individual components (immune cells, fibroblasts, collagen, and blood vessels can also facilitate or restrict tumor growth and metastasis. Oncology research is currently in the midst of a tremendous surge of comprehension of these disease mechanisms. This will lead not only to novel drug targets, but also to new challenges in drug discovery. Integrated, multi-omic, multiplexed technologies are essential tools in the quest to understand all of the various cellular changes involved in tumorigenesis. This review examines features of cancer heterogeneity and discusses how multiplexed technologies can facilitate a more comprehensive understanding of these features.

  5. Semi-supervised drug-protein interaction prediction from heterogeneous biological spaces.

    Science.gov (United States)

    Xia, Zheng; Wu, Ling-Yun; Zhou, Xiaobo; Wong, Stephen T C

    2010-09-13

    Predicting drug-protein interactions from heterogeneous biological data sources is a key step for in silico drug discovery. The difficulty of this prediction task lies in the rarity of known drug-protein interactions and the myriad unknown interactions to be predicted. To meet this challenge, a manifold regularization semi-supervised learning method is presented that uses both labeled and unlabeled information, which often generates better results than using the labeled data alone. Furthermore, our semi-supervised learning method integrates known drug-protein interaction network information as well as chemical structure and genomic sequence data. Using the proposed method, we predicted certain drug-protein interactions on the enzyme, ion channel, GPCR, and nuclear receptor data sets. Some of them are confirmed by the latest publicly available drug target databases such as KEGG. We report encouraging results of using our method for drug-protein interaction network reconstruction, which may shed light on molecular interaction inference and new uses of marketed drugs.
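
    The abstract does not spell out the optimization, so the following is a generic sketch of manifold (Laplacian) regularization for interaction prediction: predicted scores are smoothed over a drug similarity graph while fitting the few known labels. It is the standard closed-form Laplacian-regularized least squares on one side of the bipartite problem, not necessarily the authors' exact formulation.

        # Generic manifold (Laplacian) regularization sketch for interaction
        # prediction: smooth predicted interaction scores over a drug similarity
        # graph while fitting the sparse known labels.
        import numpy as np

        def laplacian_rls(similarity: np.ndarray, labels: np.ndarray,
                          lam: float = 1.0, beta: float = 0.5) -> np.ndarray:
            """similarity: (n_drugs, n_drugs); labels: (n_drugs, n_proteins) 0/1."""
            d = similarity.sum(axis=1)
            laplacian = np.diag(d) - similarity          # graph Laplacian
            n = similarity.shape[0]
            # Closed form of min_F ||F - Y||^2 + lam*tr(F^T L F) + beta*||F||^2
            A = np.eye(n) + lam * laplacian + beta * np.eye(n)
            return np.linalg.solve(A, labels)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            S = rng.random((5, 5)); S = (S + S.T) / 2    # toy symmetric similarities
            Y = np.zeros((5, 3)); Y[0, 1] = Y[3, 2] = 1  # sparse known interactions
            print(np.round(laplacian_rls(S, Y), 3))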

  6. Efficacy of integrating information literacy education into a women's health course on information literacy for RN-BSN students.

    Science.gov (United States)

    Ku, Ya-Lie; Sheu, Sheila; Kuo, Shih-Ming

    2007-03-01

    Information literacy, essential to evidence-based nursing, can promote nurses' capability for lifelong learning. Nursing education should strive to employ information literacy education in nursing curricula to improve information literacy abilities among nursing students. This study explored the effectiveness of information literacy education by comparing information literacy skills among a group of RN-BSN (Registered Nurse to Bachelor of Science in Nursing) students who received information literacy education with a group that did not. This quasi-experimental study was conducted during a women's health issues course taught between March and June 2004. Content was presented to the 32 RN-BSN students enrolled in this course, which also taught skills in searching and screening, integrating, analyzing, applying, and presenting information. At the beginning and end of the program, 75 RN-BSN students self-evaluated, on a 10-point Likert scale, their attained skills in searching and screening, integrating, analyzing, applying, and presenting information. Results identified no significant differences between the experimental (n = 32) and control groups (n = 43) in terms of age, marital status, job title, work unit, years of work experience, and information literacy skills as measured at the beginning of the semester. At the end of the semester during which the content was taught, the information literacy of the experimental group in all categories, with the exception of information presentation, was significantly improved compared to that of the control group. Results were especially significant for the integrating, analyzing, and applying skill categories. It is hoped that in the future nursing students will apply enhanced information literacy to address and resolve patients' health problems in clinical settings.

  7. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish/subscribe and give an example of its use.
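
    A toy sketch of the type-based publish/subscribe idea follows: peers subscribe to event types corresponding to concrete activities, and any peer can publish an event without knowing who handles it. The event names are invented for illustration and are not from the paper.

        # Toy type-based publish/subscribe bus: handlers are registered per event
        # type, and publishing dispatches by the runtime type of the event object.
        from collections import defaultdict
        from dataclasses import dataclass

        @dataclass
        class UnitTestFinished:
            suite: str
            failures: int

        @dataclass
        class WorkshopNoteAdded:
            author: str
            text: str

        class EventBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, event_type, handler):
                self._subscribers[event_type].append(handler)

            def publish(self, event):
                for handler in self._subscribers[type(event)]:
                    handler(event)

        bus = EventBus()
        bus.subscribe(UnitTestFinished, lambda e: print(f"{e.suite}: {e.failures} failures"))
        bus.subscribe(WorkshopNoteAdded, lambda e: print(f"note by {e.author}: {e.text}"))
        bus.publish(UnitTestFinished("core", 0))
        bus.publish(WorkshopNoteAdded("klaus", "revisit design of component X"))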

  8. Integrating information for better environmental decisions.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Morgan, K.; Newland, L.; Environmental Assessment; Texas Christian Univ.

    2002-01-01

    As more is learned about the complex nature and extent of environmental impacts from progressive human disturbance, scientists, policy analysts, decision makers, educators, and communicators are increasingly joining forces to develop strategies for preserving and protecting the environment. The Eco-Informa Foundation is an educational scientific organization dedicated to promoting the collaborative development and sharing of scientific information. The Foundation participated in a recent international conference on environmental informatics through a special symposium on integrating information for better environmental decisions. Presentations focused on four general themes: (1) remote sensing and data interpretation, including through new knowledge management tools; (2) risk assessment and communication, including for radioactively contaminated facilities, introduced biological hazards, and food safety; (3) community involvement in cleanup projects; and (4) environmental education. The general context for related issues, methods and applications, and results and recommendations from those discussions are highlighted here.

  9. Integration of Information Literacy into the Curriculum: Constructive Alignment from Theory into Practice

    Directory of Open Access Journals (Sweden)

    Claes Dahlqvist

    2016-12-01

    Full Text Available Librarian-teacher cooperation is essential for the integration of information literacy into course syllabi. Therefore, a common theoretical and methodological platform is needed. As librarians at Kristianstad University we have had the opportunity to develop such a platform when teaching information literacy in a basic course for teachers in higher education pedagogy. Information literacy is taught in context with academic writing, distance learning and teaching, and development of course syllabi. Constructive Alignment in Theory: We used constructive alignment in designing our part of the course. John Biggs’ ideas tell us that assessment tasks (ATs) should be aligned to what is intended to be learned. Intended learning outcomes (ILOs) specify teaching/learning activities (TLAs) based on the content of learning. TLAs should be designed in ways that enable students to construct knowledge from their own experience. The ILOs for the course are to have arguments for the role of information literacy in higher education and ideas of implementing them in TLAs. The content of learning is for example the concept of information literacy, theoretical perspectives and constructive alignment for integration in course syllabi. TLAs are written pre-lecture reflections on the concept of information literacy, used as a starting point for the three-hour seminar. Learning reflections are written afterwards. The AT is to revise a syllabus (preferably using constructive alignment) for a course the teacher is responsible for, where information literacy must be integrated with the other parts and topics of the course. Constructive Alignment in Practice: Using constructive alignment has taught us that this model serves well as the foundation of the theoretical and methodological platform for librarian-teacher cooperation when integrating information literacy in course syllabi. It contains all important aspects of the integration of information literacy in course

  10. A Multianalyzer Machine Learning Model for Marine Heterogeneous Data Schema Mapping

    Science.gov (United States)

    Yan, Wang; Jiajin, Le; Yun, Zhang

    2014-01-01

    The main challenge that marine heterogeneous data integration faces is the problem of accurate schema mapping between heterogeneous data sources. In order to improve schema mapping efficiency and obtain more accurate learning results, this paper proposes a heterogeneous data schema mapping method based on a multianalyzer machine learning model. The multianalyzer analyzes the learning results comprehensively, and a fuzzy comprehensive evaluation system is introduced for evaluating the output results and for multi-factor quantitative judgment. Finally, a data mapping comparison experiment on East China Sea observing data confirms the effectiveness of the model and shows the multianalyzer's clear improvement in mapping error rate. PMID:25250372
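
    A small sketch of a fuzzy comprehensive evaluation step of the kind described above: each candidate schema mapping receives membership degrees for a few grades from several evaluation factors, and a weighted aggregation yields a single graded judgment. The weights and membership values are illustrative assumptions, not values from the paper.

        # Fuzzy comprehensive evaluation sketch: aggregate per-factor membership
        # degrees into one normalized grade distribution for a candidate mapping.
        import numpy as np

        def fuzzy_evaluate(membership: np.ndarray, weights: np.ndarray) -> np.ndarray:
            """membership: (n_factors, n_grades); weights: (n_factors,).
            Returns a normalized grade vector (weighted-average fuzzy operator)."""
            combined = weights @ membership
            return combined / combined.sum()

        # Rows: factors (e.g. name similarity, type match, instance overlap);
        # columns: grades (good / acceptable / poor). All numbers are toy values.
        membership = np.array([[0.7, 0.2, 0.1],
                               [0.5, 0.4, 0.1],
                               [0.2, 0.5, 0.3]])
        weights = np.array([0.5, 0.3, 0.2])
        print(np.round(fuzzy_evaluate(membership, weights), 3))  # grade distribution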

  11. Social Information Is Integrated into Value and Confidence Judgments According to Its Reliability.

    Science.gov (United States)

    De Martino, Benedetto; Bobadilla-Suarez, Sebastian; Nouguchi, Takao; Sharot, Tali; Love, Bradley C

    2017-06-21

    How much we like something, whether it be a bottle of wine or a new film, is affected by the opinions of others. However, the social information that we receive can be contradictory and vary in its reliability. Here, we tested whether the brain incorporates these statistics when judging value and confidence. Participants provided value judgments about consumer goods in the presence of online reviews. We found that participants updated their initial value and confidence judgments in a Bayesian fashion, taking into account both the uncertainty of their initial beliefs and the reliability of the social information. Activity in dorsomedial prefrontal cortex tracked the degree of belief update. Analogous to how lower-level perceptual information is integrated, we found that the human brain integrates social information according to its reliability when judging value and confidence. SIGNIFICANCE STATEMENT The field of perceptual decision making has shown that the sensory system integrates different sources of information according to their respective reliability, as predicted by a Bayesian inference scheme. In this work, we hypothesized that a similar coding scheme is implemented by the human brain to process social signals and guide complex, value-based decisions. We provide experimental evidence that the human prefrontal cortex's activity is consistent with a Bayesian computation that integrates social information that differs in reliability and that this integration affects the neural representation of value and confidence. Copyright © 2017 De Martino et al.
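
    The reliability-weighted updating described above has a simple closed form for Gaussian beliefs, sketched below: the prior is the initial value estimate, the likelihood is the social information, and each is weighted by its precision (1/variance). The numbers are illustrative, not from the study.

        # Precision-weighted (Bayesian) combination of a prior value estimate with
        # social information; more reliable reviews (lower variance) pull harder.
        def bayesian_update(prior_mean, prior_var, social_mean, social_var):
            prior_precision = 1.0 / prior_var
            social_precision = 1.0 / social_var
            post_precision = prior_precision + social_precision
            post_mean = (prior_mean * prior_precision +
                         social_mean * social_precision) / post_precision
            return post_mean, 1.0 / post_precision      # posterior mean, variance

        print(bayesian_update(5.0, 1.0, 8.0, 0.5))   # reliable reviews -> (7.0, 0.333...)
        print(bayesian_update(5.0, 1.0, 8.0, 4.0))   # noisy reviews    -> (5.6, 0.8)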

  12. Mining and Integration of Environmental Data

    Science.gov (United States)

    Tran, V.; Hluchy, L.; Habala, O.; Ciglan, M.

    2009-04-01

    The project ADMIRE (Advanced Data Mining and Integration Research for Europe) is a 7th FP EU ICT project that aims to deliver a consistent and easy-to-use technology for extracting information and knowledge. The project is motivated by the difficulty of extracting meaningful information by data mining combinations of data from multiple heterogeneous and distributed resources. It will also provide an abstract view of data mining and integration, which will give users and developers the power to cope with the complexity and heterogeneity of services, data and processes. The data sets describing phenomena from domains like business, society, and environment often contain spatial and temporal dimensions. Integration of spatio-temporal data from different sources is a challenging task due to those dimensions. Different spatio-temporal data sets contain data at different resolutions (e.g. size of the spatial grid) and frequencies. This heterogeneity is the principal challenge of geo-spatial and temporal data set integration - the integrated data set should hold homogeneous data of the same resolution and frequency. Thus, to integrate heterogeneous spatio-temporal data from distinct sources, transformation of one or more data sets is necessary. The following transformation operations are required: • transformation to a common spatial and temporal representation (e.g. transformation to a common coordinate system), • spatial and/or temporal aggregation - data from the more detailed data source are aggregated to match the resolution of other resources involved in the integration process, • spatial and/or temporal record decomposition - records from the source with lower resolution are decomposed to match the granularity of the other data source. This operation decreases data quality (e.g. transformation of data from a 50 km grid to a 10 km grid) - data from the lower resolution data set in the integrated schema are imprecise, but it allows us to preserve higher resolution data. We can decompose the
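
    The temporal aggregation and decomposition operations listed above map naturally onto time-series resampling. Below is a small, hedged sketch using pandas; the variable names, values and frequencies are illustrative and not part of the ADMIRE platform.

        # Sketch of temporal aggregation (downsampling) and decomposition
        # (upsampling with forward-fill) of a toy hourly series using pandas.
        import pandas as pd

        hourly = pd.Series(
            range(48),
            index=pd.date_range("2009-01-01", periods=48, freq="h"),
            name="temperature",
        )

        # Temporal aggregation: hourly observations -> daily means.
        daily = hourly.resample("D").mean()

        # Temporal decomposition: daily values spread back onto an hourly grid
        # (forward-fill); the result is imprecise but preserves the finer grid.
        hourly_from_daily = daily.resample("h").ffill()

        print(daily.head())
        print(hourly_from_daily.head())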

  13. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  14. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  15. A Geospatial Information Grid Framework for Geological Survey.

    Science.gov (United States)

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.

  16. Orchestrating a unified approach to information management.

    Science.gov (United States)

    Friedman, B A

    1997-01-01

    to achieve high level integration across previously heterogeneous and non-integrated department-based clinical information systems.

  17. A review on machine learning principles for multi-view biological data integration.

    Science.gov (United States)

    Li, Yifeng; Wu, Fang-Xiang; Ngom, Alioune

    2018-03-01

    Driven by high-throughput sequencing techniques, modern genomic and clinical studies are in strong need of integrative machine learning models for better use of vast volumes of heterogeneous information in the deep understanding of biological systems and the development of predictive models. How data from multiple sources (called multi-view data) are incorporated in a learning system is a key step for successful analysis. In this article, we provide a comprehensive review of omics and clinical data integration techniques, from a machine learning perspective, for various analyses such as prediction, clustering, dimension reduction and association. We shall show that Bayesian models are able to use prior information and model measurements with various distributions; tree-based methods can either build a tree with all features or collectively make a final decision based on trees learned from each view; kernel methods fuse the similarity matrices learned from individual views together for a final similarity matrix or learning model; network-based fusion methods are capable of inferring direct and indirect associations in a heterogeneous network; matrix factorization models have the potential to learn interactions among features from different views; and a range of deep neural networks can be integrated in multi-modal learning for capturing the complex mechanisms of biological systems.
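
    As a concrete illustration of the kernel-fusion idea mentioned in the review summary above, the following sketch normalizes per-view similarity (kernel) matrices and combines them with view weights before a kernel method would use the fused matrix. It is a generic illustration on assumed toy data, not a specific method from the review.

        # Kernel fusion sketch: normalize each view's kernel so its diagonal is 1,
        # then take a weighted sum to obtain a single fused similarity matrix.
        import numpy as np

        def normalize_kernel(K: np.ndarray) -> np.ndarray:
            """Cosine-style normalization so diagonal entries become 1."""
            d = np.sqrt(np.diag(K))
            return K / np.outer(d, d)

        def fuse_kernels(kernels, weights=None) -> np.ndarray:
            weights = weights or [1.0 / len(kernels)] * len(kernels)
            return sum(w * normalize_kernel(K) for w, K in zip(weights, kernels))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            X1, X2 = rng.random((6, 10)), rng.random((6, 4))   # two views, 6 samples
            K1, K2 = X1 @ X1.T, X2 @ X2.T                      # linear kernels
            K = fuse_kernels([K1, K2], weights=[0.7, 0.3])
            print(np.round(K, 2))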

  18. SEMANTIC METADATA FOR HETEROGENEOUS SPATIAL PLANNING DOCUMENTS

    Directory of Open Access Journals (Sweden)

    A. Iwaniak

    2016-09-01

    Full Text Available Spatial planning documents contain information about the principles and rights of land use in different zones of a local authority. They are the basis for administrative decision making in support of sustainable development. In Poland these documents are published on the Web according to a prescribed non-extendable XML schema, designed for optimum presentation to humans in HTML web pages. There is no document standard, and limited functionality exists for adding references to external resources. The text in these documents is discoverable and searchable by general-purpose web search engines, but the semantics of the content cannot be discovered or queried. The spatial information in these documents is geographically referenced but not machine-readable. Major manual efforts are required to integrate such heterogeneous spatial planning documents from various local authorities for analysis, scenario planning and decision support. This article presents results of an implementation using machine-readable semantic metadata to identify relationships among regulations in the text, spatial objects in the drawings and links to external resources. A spatial planning ontology was used to annotate different sections of spatial planning documents with semantic metadata in the Resource Description Framework in Attributes (RDFa). The semantic interpretation of the content, links between document elements and links to external resources were embedded in XHTML pages. An example and use case from the spatial planning domain in Poland is presented to evaluate its efficiency and applicability. The solution enables the automated integration of spatial planning documents from multiple local authorities to assist decision makers with understanding and interpreting spatial planning information. The approach is equally applicable to legal documents from other countries and domains, such as cultural heritage and environmental management.

  19. Integrated Engineering Information Technology, FY93 accomplishments

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.; Orona, J.R.; Partridge, R.A.; Herman, J.D.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  20. Heterogeneous recurrence monitoring and control of nonlinear stochastic processes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hui, E-mail: huiyang@usf.edu; Chen, Yun [Complex Systems Monitoring, Modeling and Analysis Laboratory, University of South Florida, Tampa, Florida 33620 (United States)

    2014-03-15

    Recurrence is one of the most common phenomena in natural and engineering systems. Process monitoring of dynamic transitions in nonlinear and nonstationary systems is more concerned with aperiodic recurrences and recurrence variations. However, little has been done to investigate the heterogeneous recurrence variations and link with the objectives of process monitoring and anomaly detection. Notably, nonlinear recurrence methodologies are based on homogeneous recurrences, which treat all recurrence states in the same way as black dots, and non-recurrence is white in recurrence plots. Heterogeneous recurrences are more concerned about the variations of recurrence states in terms of state properties (e.g., values and relative locations) and the evolving dynamics (e.g., sequential state transitions). This paper presents a novel approach of heterogeneous recurrence analysis that utilizes a new fractal representation to delineate heterogeneous recurrence states in multiple scales, including the recurrences of both single states and multi-state sequences. Further, we developed a new set of heterogeneous recurrence quantifiers that are extracted from fractal representation in the transformed space. To that end, we integrated multivariate statistical control charts with heterogeneous recurrence analysis to simultaneously monitor two or more related quantifiers. Experimental results on nonlinear stochastic processes show that the proposed approach not only captures heterogeneous recurrence patterns in the fractal representation but also effectively monitors the changes in the dynamics of a complex system.
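
    For readers unfamiliar with recurrence analysis, the following is a minimal sketch of the objects involved: a homogeneous recurrence matrix (recurrent or not) and a crude "heterogeneous" labeling of recurrences by the region of state space they revisit. The quantile-based labeling is only an illustration, not the paper's fractal representation.

        # Recurrence sketch: binary recurrence matrix plus a coarse labeling of
        # which state-space region each recurrence falls in.
        import numpy as np

        def recurrence_matrix(x: np.ndarray, eps: float) -> np.ndarray:
            dist = np.abs(x[:, None] - x[None, :])
            return (dist <= eps).astype(int)              # homogeneous recurrences

        def heterogeneous_labels(x: np.ndarray, n_bins: int = 3) -> np.ndarray:
            """Assign each state to a coarse region (1..n_bins) of the state space."""
            edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
            return np.digitize(x, edges) + 1

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)
            R = recurrence_matrix(x, eps=0.1)
            labels = heterogeneous_labels(x)
            # Label each recurrence by the region of the state it revisits.
            hetero_R = R * labels[None, :]
            print(R.sum(), np.unique(hetero_R))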

  1. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record and F2 is the specific software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations inputted 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems - when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record which contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic integrity checks and for searching and indexing of the files is expected to produce a more user-friendly environment.
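
    A minimal sketch of how such a file-name convention can be parsed and merged into a chronological per-patient record follows. The underscore-separated layout and field formats are assumptions for illustration, since the abstract does not fix field widths or separators.

        # Parse file names of an assumed "patient_date_author.ext" layout and sort
        # the merged list by patient and date to get a cumulative chronological record.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class RecordFile:
            patient_id: str
            date: str          # e.g. "2010-03-15"
            author: str
            extension: str

        def parse_name(filename: str) -> RecordFile:
            stem, ext = filename.rsplit(".", 1)
            patient_id, date, author = stem.split("_")
            return RecordFile(patient_id, date, author, ext)

        files = ["0042_2010-03-15_NS.doc", "0042_2009-11-02_NS.jpg",
                 "0007_2010-01-20_PB.pdf"]
        records = sorted(map(parse_name, files),
                         key=lambda r: (r.patient_id, r.date))
        for r in records:
            print(r)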

  2. Development of an Information Database for the Integrated Airline Management System (IAMS)

    Directory of Open Access Journals (Sweden)

    Bogdane Ruta

    2017-08-01

    Full Text Available In present conditions the activity of any enterprise is represented as a combination of operational processes. Each of them corresponds to relevant airline management systems. Combining two or more management systems, it is possible to obtain an integrated management system. For the effective functioning of the integrated management system, an appropriate information system should be developed. This article proposes a model of such an information system.

  3. Surface current double-heterogeneous multilayer multicell methodology

    International Nuclear Information System (INIS)

    Stepanek, J.; Segev, M.

    1991-01-01

    A surface current methodology is developed to respond to the need for treating the various levels of material heterogeneity in a double-heterogeneous multilayer multicell in processing neutron multigroup cross sections in the resonance as well as thermal energy range. First, the basic surface cosine current transport equations to calculate the energy-dependent neutron flux spatial distribution in the multilayered multicell are formulated. Slab, spherical and cylindrical geometries, as well as square and hexagonal lattices and pebble-bed configurations with white or reflective cell boundary conditions, are considered. Second, starting from the surface cosine-current formulation, a two-zone three-layer multicell formalism for reduction of heterogeneous flux expressions to equivalent homogeneous flux expression for table method was developed. This formalism allows an infinite, as well as a limited, number of second-heterogeneity cells within a partial first-heterogeneity cell layer to be considered. Also, the number of the first-and second-heterogeneity cell types is quite general. The 'outer' (right side) as well as 'inner' (left side) Dancoff probabilities can be calculated for any particular layer. An accurate, efficient, and compact interpolation procedure is developed to calculate the basic collision probabilities. These are transmission and escape probabilities for shells in slab, cylindrical, and spherical geometries, as well as Dancoff probabilities for cylinders in square and hexagonal lattices. The use of the interpolation procedure is exemplified in a multilayer multicell approximation for the Dancoff probability, enabling a routine evaluation of the equivalence-based shielded resonance integral in highly complex lattices of slab, cylindrical, or spherical cells. (author) 1 fig., 2 tabs., 10 refs

  4. ACCOUNTING INFORMATION INTEGRATION THROUGH AN ENTERPRISE PORTAL

    Directory of Open Access Journals (Sweden)

    Gianina RIZESCU

    2014-06-01

    Full Text Available If companies lack integrated enterprise software applications, or simply do not use them on a large scale, accounting departments face many difficulties, concerning both the inflexibility in achieving good results and the limited possibility of communicating these results. Thus, most of the time, accounting departments are limited to generating the predefined reports provided by a software application, and the most they can do is export these reports into Microsoft Excel. Another cause which leads to late obtaining and publishing of accounting information is the lack of data from other departments and their corresponding software applications. That is why, in many enterprises, accounting data becomes irrelevant for its users. The main goal of this article is to show how accounting can benefit from an integrated software solution, namely an enterprise portal.

  5. Detection of structural heterogeneity of glass melts

    DEFF Research Database (Denmark)

    Yue, Yuanzheng

    2004-01-01

    The structural heterogeneity of both the supercooled liquid and molten states of silicates has been studied using a calorimetric method. The objects of this study are basaltic glasses and liquids. Two experimental approaches are taken to detect the structural heterogeneity of the liquids. One is the hyperquench-anneal-calorimetric scan approach, by which the structural information of a basaltic supercooled liquid and three binary silicate liquids is acquired. Another is the calorimetrically repeated up- and downscanning approach, by which the structural heterogeneity, the intermediate range order ... is discussed. The ordered structure of glass melts above the liquidus temperature is indirectly characterized by use of the X-ray diffraction method. The new approaches are of importance for monitoring the glass melting and forming process and for improving the physical properties of glasses and glass fibers.

  6. Behavior Selection of Mobile Robot Based on Integration of Multimodal Information

    Science.gov (United States)

    Chen, Bin; Kaneko, Masahide

    Recently, biologically inspired robots have been developed to acquire the capacity for directing visual attention to salient stimuli generated from the audiovisual environment. To realize this behavior, a general method is to calculate saliency maps that represent how much the external information attracts the robot's visual attention, where the audiovisual information and the robot's motion status should be involved. In this paper, we present a visual attention model in which three modalities, that is, audio information, visual information and the robot's motor status, are considered, whereas previous research has not considered all of them. Firstly, we introduce a 2-D density map, on which the value denotes how much attention the robot pays to each spatial location. We then model the attention density using a Bayesian network in which the robot's motion statuses are involved. Secondly, the information from both the audio and visual modalities is integrated with the attention density map in integrate-fire neurons. The robot can direct its attention to the locations where the integrate-fire neurons are fired. Finally, the visual attention model is applied to make the robot select visual information from the environment and react to the selected content. Experimental results show that it is possible for robots to acquire the visual information related to their behaviors by using the attention model considering motion statuses. The robot can select its behaviors to adapt to the dynamic environment as well as switch to another task according to the recognition results of visual attention.
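
    A very small sketch of the integrate-and-fire combination described above: audio and visual saliency inputs for one spatial location feed a leaky integrator that "fires" (attracts attention) once a threshold is crossed. All parameter values are arbitrary illustrations, not the paper's settings.

        # Leaky integrate-and-fire neuron combining two modality inputs for one
        # spatial location; a spike means attention is directed to that location.
        def integrate_and_fire(audio, visual, leak=0.9, threshold=1.5):
            """audio, visual: per-time-step saliency inputs for one location."""
            potential, spikes = 0.0, []
            for a, v in zip(audio, visual):
                potential = leak * potential + a + v      # integrate both modalities
                if potential >= threshold:
                    spikes.append(True)                   # attention directed here
                    potential = 0.0                       # reset after firing
                else:
                    spikes.append(False)
            return spikes

        audio = [0.1, 0.2, 0.9, 0.1, 0.0, 0.8]
        visual = [0.2, 0.1, 0.8, 0.1, 0.1, 0.9]
        print(integrate_and_fire(audio, visual))   # fires where combined input peaks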

  7. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Miles Miller

    Full Text Available Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in

  8. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Science.gov (United States)

    Miller, Miles; Hafner, Marc; Sontag, Eduardo; Davidsohn, Noah; Subramanian, Sairam; Purnick, Priscilla E M; Lauffenburger, Douglas; Weiss, Ron

    2012-01-01

    Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in isolation, and

  9. IHE, Solution for integration of information systems and PACS

    Directory of Open Access Journals (Sweden)

    Milad Janghorban Lariche

    2014-10-01

    Full Text Available PACS is used as a way to store images, matches well with the workflow in the radiology department, and can spread to other parts of the hospital. Integration with other PACS and other hospital systems like the radiology information system (RIS), hospital information system (HIS), and electronic patient records has largely been completed, but there are still problems. PACS also provides good conditions for setting up teleradiology. The next step for PACS is for hospitals and health care organizations to share images in an integrated electronic patient record. Among the different ways of sharing images between hospitals, the IHE (Integrating the Healthcare Enterprise) standard defines the cross-enterprise document sharing profile (XDS), which allows sharing images from various hospitals even if their PACS have different brands and different vendors. Application of XDS is useful for sharing images between health care organizations without duplicating them in a central archive. Images need to be indexed in a central registry. In the XDS profile, IHE defines a mechanism for publishing and indexing images in the central document registry. IHE also defines the mechanisms to be used by each hospital to retrieve images, regardless of which hospital PACS stores them.

  10. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    Full Text Available The paper deals with the impact of informal institutions on the definition of the vector of integration processes and on the development of integration processes in the countries of the Customs Union and Ukraine. The degree of scientific development of the phenomenon in different economic schools is determined. Economic mentality is a basic informal institution that determines the degree of effectiveness of integration processes. The paper examines the nature, characteristics and effects of economic mentality on the economic activities of people. The ethnometric method makes it possible to quantify economic mentality, which enables a deeper understanding and analysis of the formation and functioning of the political and economic system, especially business and management, and of establishing contacts with other cultures. The modern Belarusian economic mentality was measured using Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentality of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  11. Integrating Web 2.0-Based Informal Learning with Workplace Training

    Science.gov (United States)

    Zhao, Fang; Kemp, Linzi J.

    2012-01-01

    Informal learning takes place in the workplace through connection and collaboration mediated by Web 2.0 applications. However, little research has yet been published that explores informal learning and how to integrate it with workplace training. We aim to address this research gap by developing a conceptual Web 2.0-based workplace learning and…

  12. Optimization of the German integrated information and measurement system (IMIS)

    International Nuclear Information System (INIS)

    Wirth, E.; Weiss, W.

    2002-01-01

    The Chernobyl accident led to widespread contamination of the environment in most European countries. In Germany, as in all other countries, it took some time to evaluate the radiological situation, time which is extremely valuable in the early phases of an accident when decisions on countermeasures like sheltering, iodine prophylaxis or evacuation have to be taken. For better emergency preparedness, the Integrated Information and Measurement System (IMIS) has been developed and established in Germany. In case of a widespread contamination of the environment, the system will provide decision makers with all the information necessary to evaluate the radiological situation and to decide on countermeasures. Presently the system is being upgraded through the adoption of the European decision-support system RODOS and the improvement of the national information exchange. For this purpose the web-based information system ELAN has been developed. The national systems have to be integrated into the European and international communication systems. In this presentation the IMIS system is briefly described and the new features and modules of the system are discussed in greater detail.

  13. Seamless Data Services for Real Time Communication in Heterogeneous Networks using Network Tracking and Management

    OpenAIRE

    T, Adiline Macriga.; Kumar, Dr. P. Anandha

    2010-01-01

    A heterogeneous network is the integration of all existing networks under a single environment, with an understanding between their functional operations, and it also includes the ability to make use of multiple broadband transport technologies and to support generalized mobility. It is a challenge for heterogeneous networks to integrate several IP-based access technologies in a seamless way. The focus of this paper is on the requirements of a mobility management scheme for multimedia real-...

  14. Heterogeneous MEMS device assembly and integration

    Science.gov (United States)

    Topart, Patrice; Picard, Francis; Ilias, Samir; Alain, Christine; Chevalier, Claude; Fisette, Bruno; Paultre, Jacques E.; Généreux, Francis; Legros, Mathieu; Lepage, Jean-François; Laverdière, Christian; Ngo Phong, Linh; Caron, Jean-Sol; Desroches, Yan

    2014-03-01

    In recent years, smart phone applications have both raised the pressure for cost and time to market reduction, and the need for high performance MEMS devices. This trend has led the MEMS community to develop multi-die packaging of different functionalities or multi-technology (i.e. wafer) approaches to fabricate and assemble devices respectively. This paper reports on the fabrication, assembly and packaging at INO of various MEMS devices using heterogeneous assembly at chip and package-level. First, the performance of a giant (e.g. about 3 mm in diameter), electrostatically actuated beam steering mirror is presented. It can be rotated about two perpendicular axes to steer an optical beam within an angular cone of up to 60° in vector scan mode with an angular resolution of 1 mrad and a response time of 300 ms. To achieve such angular performance relative to mirror size, the microassembly was performed from sub-components fabricated from 4 different wafers. To combine infrared detection with inertial sensing, an electroplated proof mass was flip-chipped onto a 256×1 pixel uncooled bolometric FPA and released using laser ablation. In addition to the microassembly technology, performance results of packaged devices are presented. Finally, to simulate a 3072×3 pixel uncooled detector for cloud and fire imaging in mid and long-wave IR, the staggered assembly of six 512×3 pixel FPAs with a less than 50 micron pixel co-registration is reported.

  15. Ontology based integration of heterogeneous standards in the energy industry; Ontologiebasierte Integration heterogener Standards in der Energiewirtschaft

    Energy Technology Data Exchange (ETDEWEB)

    Uslar, Mathias

    2010-07-01

    Today, utilities face constant change to their business, which is driven mainly by two factors. On the one hand, resources like oil and coal, which deliver most of the energy for producing electricity, become more and more scarce and, therefore, more expensive. This forces utilities to look for alternatives to those resources in order to avoid the price pressure. New renewable energy resources like wind turbines, photovoltaics, biomass or geothermal energy become more and more popular. On the other hand, regulation by the European Commission has a strong impact on the utilities because of the liberalization of the energy markets. The market was opened by so-called unbundling, which is, in fact, the separation of the distribution grid from the capability of producing energy. Before unbundling, the producers of energy were also the only ones able to sell and distribute it, which led to monopolistic structures on the market. Nowadays, we have a market where the customers can choose between the offers from different utilities. Those changes to the utility domain have a direct impact on the IT landscape of the utility, which has to deal with new processes that must be supported by changes like new systems or services and new interfaces between the existing systems. In this setting, the utility generally has to deal with standards and norms for the domain in order to exchange data with other market participants and to integrate its own systems in an appropriate manner. In the electric utility domain, the Common Information Model (CIM) has spread for the scope of SCADA (Supervisory Control and Data Acquisition) and market communications. It is standardized by the IEC (International Electrotechnical Commission) as the IEC 61970 family of standards. The second important family is the IEC 61850 family, which deals with communication networks and systems in

  16. Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling

    Science.gov (United States)

    Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.

    This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe the different materials inside a deformable object and interact with it by touching it using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.
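
    The constrained local static integration method itself is not reproduced here; the following sketch only illustrates the general notion it builds on, namely that feedback forces depend on locally varying material stiffness in a heterogeneous model. Node positions, stiffness values and the contact point are invented for illustration.

```python
import numpy as np

# Toy heterogeneous model: each node carries its own stiffness, standing in
# for the nonuniform material properties of a bio-tissue model.
rest_positions = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
stiffness = np.array([50.0, 200.0, 80.0])  # N/m, one per node (heterogeneous)

def haptic_force(tool_pos, contact_node, positions, k):
    """Penalty-style feedback force when the haptic tool displaces one node."""
    displacement = tool_pos - positions[contact_node]
    # Force opposing the displacement, scaled by that node's local stiffness.
    return -k[contact_node] * displacement

# Tool pushes node 1 slightly to the right and up; stiffer nodes push back harder.
force = haptic_force(np.array([1.02, 0.01]), 1, rest_positions, stiffness)
print("feedback force [N]:", force)
```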

  17. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integration method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering and waveform-correction properties of ensemble empirical mode decomposition when dealing with nonlinear and nonstationary signals. It combines the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction: the values of these four indexes are combined into a feature vector, the characteristic components contained in the vibration signal are accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal content such as trend items and noise, which plagues traditional methods, is largely removed; the large cumulative error of traditional time-domain integration is effectively overcome, and the large low-frequency error of traditional frequency-domain integration is avoided. Compared with traditional integral methods, this method is better at removing noise while retaining useful feature information, and shows higher accuracy.
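
    A minimal sketch of the feature-vector selection step is given below, assuming the signal has already been decomposed into intrinsic mode functions (e.g. by an EEMD implementation). The reference vector, the normalisation, the "keep the closest half" rule and the sampling rate are illustrative assumptions rather than the authors' exact settings.

```python
import numpy as np

def feature_vector(x):
    """Kurtosis, mean square error, energy, and the largest singular value of
    a sliding-window embedding -- four indexes characterising one component."""
    x = np.asarray(x, dtype=float)
    kurt = np.mean((x - x.mean()) ** 4) / (np.var(x) ** 2 + 1e-12)
    mse = np.mean((x - x.mean()) ** 2)
    energy = np.sum(x ** 2)
    windows = np.lib.stride_tricks.sliding_window_view(x, 32)
    sv = np.linalg.svd(windows, compute_uv=False)[0]
    return np.array([kurt, mse, energy, sv])

def select_components(imfs, reference):
    """Keep the IMFs whose feature vectors lie closest (Euclidean distance)
    to a reference vector describing the useful signal content."""
    feats = np.array([feature_vector(c) for c in imfs])
    # Normalise each index so no single one dominates the distance.
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-12)
    dists = np.linalg.norm(feats - reference, axis=1)
    order = np.argsort(dists)
    keep = order[: max(1, len(imfs) // 2)]   # keep the closest half (illustrative)
    return [imfs[i] for i in keep]

# Usage: sum the selected components, then integrate once in time
# (cumulative sum / rectangle rule) to go from acceleration to velocity.
fs = 1000.0                                   # assumed sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
imfs = [np.sin(2 * np.pi * 50 * t),           # useful oscillatory component
        0.01 * t,                             # trend item
        0.1 * np.random.default_rng(0).normal(size=t.size)]  # noise
clean = np.sum(select_components(imfs, reference=np.zeros(4)), axis=0)
velocity = np.cumsum(clean) / fs
```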

  18. Programming signal processing applications on heterogeneous wireless sensor platforms

    NARCIS (Netherlands)

    Buondonno, L.; Fortino, G.; Galzarano, S.; Giannantonio, R.; Giordano, A.; Gravina, R.; Guerrieri, A.

    2009-01-01

    This paper proposes the SPINE frameworks (SPINE1.x and SPINE2) for the programming of signal processing applications on heterogeneous wireless sensor platforms. In particular, two integrable approaches based on the proposed frameworks are described that allow the development of applications for wireless

  19. Integration of radiology and hospital information systems (RIS, HIS) with PACS

    International Nuclear Information System (INIS)

    Mosser, H.; Urban, M.; Hruby, W.; Duerr, M.; Rueger, W.

    1992-01-01

    PACS development has now reached a stage where it can clearly be stated that the technology for storage, networking and display in a fully digital environment is available. This is reflected by an already large and rapidly increasing number of PACS installations in the USA, Western Europe and Japan. Such installations consist of a great variety of information systems, more or less interconnected, like PACS, HIS, RIS and other departmental systems, differing in both hardware and software. Various data - even if they concern only one person - are stored in different systems distributed throughout the hospital. The integration of all digital systems into a functional unit is determined by the radiologist's need for quick access to all relevant information regardless of where it is stored. The interconnection and functional integration of all digital systems in the hospital determine the clinical benefits of PACS. This paper describes the radiologist's requirements concerning this integration, and presents some realistic solutions such as the Siemens ISI (Information System Interface) and a mobile viewing station for the wards (visitBox). (author). 9 refs., 4 figs

  20. Entropy in Postmerger and Acquisition Integration from an Information Technology Perspective

    Science.gov (United States)

    Williams, Gloria S.

    2012-01-01

    Mergers and acquisitions have historically experienced failure rates from 50% to more than 80%. Successful integration of information technology (IT) systems can be the difference between postmerger success or failure. The purpose of this phenomenological study was to explore the entropy phenomenon during postmerger IT integration. To that end, a…

  1. Use of Persistent Identifiers to link Heterogeneous Data Systems in the Integrated Earth Data Applications (IEDA) Facility

    Science.gov (United States)

    Hsu, L.; Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V.; O'hara, S. H.; Walker, J. D.

    2012-12-01

    The Integrated Earth Data Applications (IEDA) facility maintains multiple data systems with a wide range of solid earth data types from the marine, terrestrial, and polar environments. Examples of the different data types include syntheses of ultra-high resolution seafloor bathymetry collected on large collaborative cruises and analytical geochemistry measurements collected by single investigators in small, unique projects. These different data types have historically been channeled into separate, discipline-specific databases with search and retrieval tailored for the specific data type. However, a current major goal is to integrate data from different systems to allow interdisciplinary data discovery and scientific analysis. To increase discovery and access across these heterogeneous systems, IEDA employs several unique IDs, including sample IDs (International Geo Sample Number, IGSN), person IDs (GeoPass ID), funding award IDs (NSF Award Number), cruise IDs (from the Marine Geoscience Data System Expedition Metadata Catalog), dataset IDs (DOIs), and publication IDs (DOIs). These IDs allow linking of a sample registry (System for Earth SAmple Registration), data libraries and repositories (e.g. Geochemical Research Library, Marine Geoscience Data System), integrated synthesis databases (e.g. EarthChem Portal, PetDB), and investigator services (IEDA Data Compliance Tool). The linked systems allow efficient discovery of related data across different levels of granularity. In addition, IEDA data systems maintain links with several external data systems, including digital journal publishers. Links have been established between the EarthChem Portal and ScienceDirect through publication DOIs, returning sample-level objects and geochemical analyses for a particular publication. Linking IEDA-hosted data to digital publications with IGSNs at the sample level and with IEDA-allocated dataset DOIs is under development. As an example, an individual investigator could sign up

  2. Short-term synaptic plasticity and heterogeneity in neural systems

    Science.gov (United States)

    Mejias, J. F.; Kappen, H. J.; Longtin, A.; Torres, J. J.

    2013-01-01

    We review some recent results on neural dynamics and information processing which arise when considering several biophysical factors of interest, in particular, short-term synaptic plasticity and neural heterogeneity. The inclusion of short-term synaptic plasticity leads to enhanced long-term memory capacities, a higher robustness of memory to noise, and irregularity in the duration of the so-called up cortical states. On the other hand, considering some level of neural heterogeneity in neuron models allows neural systems to optimize information transmission in rate coding and temporal coding, two strategies commonly used by neurons to codify information in many brain areas. In all these studies, analytical approximations can be made to explain the underlying dynamics of these neural systems.

  3. TechIP: A Methodology for Emerging Information Technology Insertion & Integration

    National Research Council Canada - National Science Library

    Patel, Has

    2004-01-01

    ...) processing and software agents. To implement these requirements, the system designers are required to insert, integrate and manage proven advances in Emerging Information Technology (EIT) into the...

  4. Waste Information Management System with Integrated Transportation Forecast Data

    International Nuclear Information System (INIS)

    Upadhyay, H.; Quintero, W.; Shoffner, P.; Lagos, L.

    2009-01-01

    The Waste Information Management System with Integrated Transportation Forecast Data was developed to support the Department of Energy (DOE) mandated accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to site waste treatment and disposal were potential critical path issues under the accelerated schedules. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of waste that would be generated by the DOE sites over the next 40 years. Each local DOE site has historically collected, organized, and displayed site waste forecast information in separate and unique systems. However, waste and shipment information from all sites needed a common application to allow interested parties to understand and view the complete complex-wide picture. The Waste Information Management System with Integrated Transportation Forecast Data allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, has deployed the web-based forecast and transportation system and is responsible for updating the waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. (authors)

  5. A Multianalyzer Machine Learning Model for Marine Heterogeneous Data Schema Mapping

    Directory of Open Access Journals (Sweden)

    Wang Yan

    2014-01-01

    Full Text Available The main challenge that marine heterogeneous data integration faces is accurate schema mapping between heterogeneous data sources. In order to improve schema-mapping efficiency and obtain more accurate learning results, this paper proposes a heterogeneous data schema-mapping method based on a multianalyzer machine learning model. The multianalyzer analyzes the learning results comprehensively, and a fuzzy comprehensive evaluation system is introduced for evaluating the output results and for multi-factor quantitative judgment. Finally, a data-mapping comparison experiment on East China Sea observation data confirms the effectiveness of the model and shows the multianalyzer's clear improvement in mapping error rate.
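
    The following sketch illustrates the flavour of a multianalyzer mapping with a fuzzy (weighted) combination of scores; the three simple string matchers, the weights and the threshold stand in for the paper's trained analyzers and are purely illustrative.

```python
from difflib import SequenceMatcher

# Three independent "analyzers" that each score how well a source field name
# matches a target field name.  Real analyzers would be trained components.
def name_similarity(src, tgt):
    return SequenceMatcher(None, src.lower(), tgt.lower()).ratio()

def token_overlap(src, tgt):
    a, b = set(src.lower().split("_")), set(tgt.lower().split("_"))
    return len(a & b) / max(len(a | b), 1)

def length_similarity(src, tgt):
    return 1.0 - abs(len(src) - len(tgt)) / max(len(src), len(tgt))

ANALYZERS = [name_similarity, token_overlap, length_similarity]
WEIGHTS = [0.5, 0.3, 0.2]          # illustrative fuzzy-evaluation weights

def fuzzy_score(src, tgt):
    """Weighted (fuzzy comprehensive) combination of the analyzer outputs."""
    return sum(w * f(src, tgt) for w, f in zip(WEIGHTS, ANALYZERS))

def map_schema(source_fields, target_fields, threshold=0.6):
    """Propose a mapping for each source field to its best-scoring target."""
    mapping = {}
    for s in source_fields:
        best = max(target_fields, key=lambda t: fuzzy_score(s, t))
        if fuzzy_score(s, best) >= threshold:
            mapping[s] = best
    return mapping

print(map_schema(["sea_temp", "salinity_psu"], ["temperature", "salinity"]))
```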

  6. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.
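
    The entropy and mutual-information measures used in the study can be estimated from samples by simple histogram binning, as in the sketch below; the synthetic wind and power samples, the bin count and the dependence structure are assumptions made only for the example.

```python
import numpy as np

def entropy(samples, bins=20):
    """Shannon entropy (in bits) estimated from a 1-D sample by binning."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=20):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log2(pxy[pxy > 0]))
    return hx + hy - hxy

# Synthetic example: wind speed and the output of a wind generator that
# depends on it (plus noise), standing in for dependent IES variables.
rng = np.random.default_rng(0)
wind = rng.weibull(2.0, 5000) * 8.0
power = np.clip(wind, 0, 12) ** 3 * 0.05 + rng.normal(0, 2, wind.size)
print("H(wind)        =", round(entropy(wind), 3), "bits")
print("I(wind; power) =", round(mutual_information(wind, power), 3), "bits")
```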

  7. Physical heterogeneity control on effective mineral dissolution rates

    Science.gov (United States)

    Jung, Heewon; Navarre-Sitchler, Alexis

    2018-04-01

    Hydrologic heterogeneity may be an important factor contributing to the discrepancy in laboratory and field measured dissolution rates, but the governing factors influencing mineral dissolution rates among various representations of physical heterogeneity remain poorly understood. Here, we present multiple reactive transport simulations of anorthite dissolution in 2D latticed random permeability fields and link the information from local grid scale (1 cm or 4 m) dissolution rates to domain-scale (1m or 400 m) effective dissolution rates measured by the flux-weighted average of an ensemble of flow paths. We compare results of homogeneous models to heterogeneous models with different structure and layered permeability distributions within the model domain. Chemistry is simplified to a single dissolving primary mineral (anorthite) distributed homogeneously throughout the domain and a single secondary mineral (kaolinite) that is allowed to dissolve or precipitate. Results show that increasing size in correlation structure (i.e. long integral scales) and high variance in permeability distribution are two important factors inducing a reduction in effective mineral dissolution rates compared to homogeneous permeability domains. Larger correlation structures produce larger zones of low permeability where diffusion is an important transport mechanism. Due to the increased residence time under slow diffusive transport, the saturation state of a solute with respect to a reacting mineral approaches equilibrium and reduces the reaction rate. High variance in permeability distribution favorably develops large low permeability zones that intensifies the reduction in mixing and effective dissolution rate. However, the degree of reduction in effective dissolution rate observed in 1 m × 1 m domains is too small (equilibrium conditions reduce the effective dissolution rate by increasing the saturation state. However, in large domains where less- or non-reactive zones develop, higher

  8. Heterogeneity and contaminant transport modeling for the Savannah River integrated demonstration site

    International Nuclear Information System (INIS)

    Chesnut, D.A.

    1992-11-01

    The effectiveness of remediating aquifers and vadose zone sediments is frequently controlled by spatial heterogeneities. A continuing and long-recognized problem in selecting, planning, implementing, and operating remediation projects is the development of methods for quantitatively describing heterogeneity and predicting its effects on process performance. The similarity to and differences from modeling oil recovery processes in the petroleum industry are illustrated by the extension to contaminant extraction processes of an analytic model originally developed for waterflooding petroleum reservoirs. The resulting equations incorporate the effects of heterogeneity through a single parameter, σ. Fitting this model to the Savannah River in situ Air Stripping test data suggests that the injection of air into a horizontal well below the water table may have improved performance by changing the flow pattern in the vadose zone. This change increased the capture volume, and consequently the contaminant mass inventory, of the horizontal injection well completed in the vadose zone. The apparent increases (compared to extraction only from the horizontal well) are from 10,200 to 21,000 pounds for TCE and from 3,600 pounds to 59,800 pounds for PCE. The predominance of PCE in this calculated increase suggests that redistribution of flow paths in the vadose zone, rather than in-situ stripping, may provide most of the improvement. Although this preliminary conclusion remains to be reinforced by more sophisticated modeling currently in progress, there appears to be a definite improvement, which is attributable to air injection, over conventional remediation methods

  9. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Science.gov (United States)

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... through efforts like USCIS' Business Transformation initiative. The IOE will be implemented by USCIS and... information collection. (2) Title of the Form/Collection: Business Transformation-- Automated Integrated...

  10. An integration of Emergency Department Information and Ambulance Systems.

    Science.gov (United States)

    Al-Harbi, Nada; El-Masri, Samir; Saddik, Basema

    2012-01-01

    In this paper we propose an Emergency Department Information System that is integrated with the ambulance system to improve communication, enhance the quality of the emergency services provided and facilitate information sharing. The proposed system utilizes advanced technologies such as mobile web services, which overcome the problems of interoperability between different systems, together with HL7 and GPS. The system is unique in that it allows ambulance officers to locate the nearest specialized hospital, gives them access to the patient's electronic health record, and provides the hospital with the information required to prepare for the incoming patient.
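
    A small illustration of the routing step described above, locating the nearest hospital that offers a required specialty from GPS coordinates, is sketched below; the hospital names, coordinates and specialties are invented for the example.

```python
import math

# Toy registry of hospitals; all names, coordinates and specialties are made up.
HOSPITALS = [
    {"name": "Central Hospital",  "lat": 24.713, "lon": 46.675, "specialties": {"trauma", "cardiology"}},
    {"name": "North Clinic",      "lat": 24.820, "lon": 46.640, "specialties": {"pediatrics"}},
    {"name": "East Medical City", "lat": 24.700, "lon": 46.780, "specialties": {"trauma", "burns"}},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_specialized(lat, lon, specialty):
    """Nearest hospital (by great-circle distance) offering the specialty."""
    candidates = [h for h in HOSPITALS if specialty in h["specialties"]]
    return min(candidates, key=lambda h: haversine_km(lat, lon, h["lat"], h["lon"]))

print(nearest_specialized(24.70, 46.68, "trauma")["name"])
```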

  11. Project Integration Architecture: A Practical Demonstration of Information Propagation

    Science.gov (United States)

    Jones, William Henry

    2005-01-01

    One of the goals of the Project Integration Architecture (PIA) effort is to provide the ability to propagate information between disparate applications. With this ability, applications may then be formed into an application graph constituting a super-application. Such a super-application would then provide all of the analysis appropriate to a given technical system. This paper reports on a small demonstration of this concept in which a Computer Aided Design (CAD) application was connected to an inlet analysis code and geometry information automatically propagated from one to the other. The majority of the work reported involved not the technology of information propagation, but rather the conversion of propagated information into a form usable by the receiving application.

  12. Semantic integration of information about orthologs and diseases: the OGO system.

    Science.gov (United States)

    Miñarro-Gimenez, Jose Antonio; Egaña Aranguren, Mikel; Martínez Béjar, Rodrigo; Fernández-Breis, Jesualdo Tomás; Madrid, Marisa

    2011-12-01

    Semantic Web technologies like RDF and OWL are currently applied in the life sciences to improve knowledge management by integrating disparate information. Many of the systems that perform this task, however, only offer a SPARQL query interface, which is difficult for life scientists to use. We present the OGO system, which consists of a knowledge base that integrates information on orthologous sequences and genetic diseases, providing an easy-to-use, ontology-constrained query interface. This interface allows users to define SPARQL queries through a graphical process, and therefore does not require SPARQL expertise. Copyright © 2011 Elsevier Inc. All rights reserved.
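
    For readers unfamiliar with SPARQL, the sketch below shows the kind of query such a graphical interface would generate behind the scenes; the endpoint URL, prefixes and property names are placeholders, not the OGO knowledge base's actual vocabulary.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Illustrative only: endpoint URL and property/class names are placeholders.
endpoint = SPARQLWrapper("http://example.org/ogo/sparql")
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
PREFIX ex: <http://example.org/ogo#>
SELECT ?gene ?disease WHERE {
  ?gene     ex:hasOrtholog    ?ortholog .
  ?ortholog ex:associatedWith ?disease .
}
LIMIT 10
""")

# Each binding row maps variable names to values returned by the endpoint.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["gene"]["value"], "->", row["disease"]["value"])
```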

  13. Unidirectional Replication in Heterogeneous Databases; Replikasi Unidirectional pada Heterogen Database

    Directory of Open Access Journals (Sweden)

    Hendro Nindito

    2013-12-01

    Full Text Available The use of diverse database technologies in enterprises today cannot be avoided; thus, technology is needed to deliver information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we replicate from a Windows-based MS SQL Server database to a Linux-based Oracle database as the target. The research method used is prototyping, in which development can be done quickly and the testing of working models of the interaction process is repeated iteratively. The research shows that database replication technology using Oracle GoldenGate can be applied in heterogeneous environments in real time.

  14. Deconstructing stem cell population heterogeneity: Single-cell analysis and modeling approaches

    Science.gov (United States)

    Wu, Jincheng; Tzanakakis, Emmanuel S.

    2014-01-01

    Isogenic stem cell populations display cell-to-cell variations in a multitude of attributes including gene or protein expression, epigenetic state, morphology, proliferation and proclivity for differentiation. The origins of the observed heterogeneity and its roles in the maintenance of pluripotency and the lineage specification of stem cells remain unclear. Addressing pertinent questions will require the employment of single-cell analysis methods as traditional cell biochemical and biomolecular assays yield mostly population-average data. In addition to time-lapse microscopy and flow cytometry, recent advances in single-cell genomic, transcriptomic and proteomic profiling are reviewed. The application of multiple displacement amplification, next generation sequencing, mass cytometry and spectrometry to stem cell systems is expected to provide a wealth of information affording unprecedented levels of multiparametric characterization of cell ensembles under defined conditions promoting pluripotency or commitment. Establishing connections between single-cell analysis information and the observed phenotypes will also require suitable mathematical models. Stem cell self-renewal and differentiation are orchestrated by the coordinated regulation of subcellular, intercellular and niche-wide processes spanning multiple time scales. Here, we discuss different modeling approaches and challenges arising from their application to stem cell populations. Integrating single-cell analysis with computational methods will fill gaps in our knowledge about the functions of heterogeneity in stem cell physiology. This combination will also aid the rational design of efficient differentiation and reprogramming strategies as well as bioprocesses for the production of clinically valuable stem cell derivatives. PMID:24035899

  15. 45 CFR 61.12 - Requesting information from the Healthcare Integrity and Protection Data Bank.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Requesting information from the Healthcare Integrity and Protection Data Bank. 61.12 Section 61.12 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION HEALTHCARE INTEGRITY AND PROTECTION DATA BANK FOR FINAL ADVERSE INFORMATION...

  16. Research on distributed heterogeneous data PCA algorithm based on cloud platform

    Science.gov (United States)

    Zhang, Jin; Huang, Gang

    2018-05-01

    Principal component analysis (PCA) of heterogeneous data sets can overcome the limited scalability of centralized data. In order to reduce the generation of intermediate data and the error components of distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs the eigenvalue computation using Householder tridiagonalization and QR factorization, calculating the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the method is feasible and reliable in terms of execution time and accuracy.
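
    The paper's exact distributed algorithm is not reproduced here; the sketch below only shows the standard building blocks it refers to, computing a PCA over horizontally partitioned data by aggregating per-partition second-moment statistics and then solving the symmetric eigenproblem (numpy's eigh performs the tridiagonalization and QR-style iteration internally). Partition sizes and dimensions are arbitrary.

```python
import numpy as np

def partial_stats(block):
    """Per-partition sufficient statistics: row count, column sums, and X^T X."""
    return block.shape[0], block.sum(axis=0), block.T @ block

def distributed_pca(blocks, n_components=2):
    """PCA over horizontally partitioned data without pooling the raw rows."""
    n, s, xtx = 0, 0.0, 0.0
    for block in blocks:
        cnt, bsum, bxtx = partial_stats(block)
        n += cnt
        s = s + bsum
        xtx = xtx + bxtx
    mean = s / n
    # Global covariance from aggregated moments: (X^T X - n * mu mu^T) / (n - 1)
    cov = (xtx - n * np.outer(mean, mean)) / (n - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)          # symmetric eigensolver
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvals[order], eigvecs[:, order], mean

# Three partitions of the same 5-dimensional dataset, e.g. held on different nodes.
rng = np.random.default_rng(1)
blocks = [rng.normal(size=(400, 5)) @ np.diag([3, 2, 1, 0.5, 0.1]) for _ in range(3)]
variances, components, mean = distributed_pca(blocks)
print("leading variances:", np.round(variances, 2))
```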

  17. Doing One Thing Well: Leveraging Microservices for NASA Earth Science Discovery and Access Across Heterogenous Data Sources

    Science.gov (United States)

    Baynes, K.; Gilman, J.; Pilone, D.; Mitchell, A. E.

    2015-12-01

    The NASA EOSDIS (Earth Observing System Data and Information System) Common Metadata Repository (CMR) is a continuously evolving metadata system that merges all existing capabilities and metadata from the EOS ClearingHOuse (ECHO) and Global Change Master Directory (GCMD) systems. This flagship catalog has been developed against several key requirements: fast search and ingest performance; the ability to integrate heterogeneous external inputs and outputs; high availability and resiliency; scalability; and evolvability and expandability. This talk will focus on the advantages and potential challenges of tackling these requirements using a microservices architecture, which decomposes system functionality into smaller, loosely-coupled, individually-scalable elements that communicate via well-defined APIs. In addition, time will be spent examining specific elements of the CMR architecture and identifying opportunities for future integrations.

  18. Immediate integration of prosodic information from speech and visual information from pictures in the absence of focused attention: a mismatch negativity study.

    Science.gov (United States)

    Li, X; Yang, Y; Ren, G

    2009-06-16

    Language is often perceived together with visual information. Recent experimental evidence indicated that, during spoken language comprehension, the brain can immediately integrate visual information with semantic or syntactic information from speech. Here we used the mismatch negativity to investigate further whether prosodic information from speech can be immediately integrated into a visual scene context, and in particular the time course and automaticity of this integration process. Sixteen Chinese native speakers participated in the study. The materials included Chinese spoken sentences and picture pairs. In the audiovisual situation, relative to the concomitant pictures, the spoken sentence was appropriately accented in the standard stimuli, but inappropriately accented in the two kinds of deviant stimuli. In the purely auditory situation, the speech sentences were presented without pictures. It was found that the deviants evoked mismatch responses in both the audiovisual and the purely auditory situations; the mismatch negativity in the purely auditory situation peaked at the same time as, but was weaker than, that evoked by the same deviant speech sounds in the audiovisual situation. This pattern of results suggests immediate integration of prosodic information from speech and visual information from pictures in the absence of focused attention.

  19. Integrated Information Centers within Academic Environments: Introduction and Overview.

    Science.gov (United States)

    Lunin, Luis F., Ed.; D'Elia, George, Ed.

    1991-01-01

    Introduces eight articles on the Integrated Information Center (IIC) Project, which investigated significant behavioral, technological, organizational, financial, and legal factors involved in the management of IICs. Four articles address design and management issues of general interest, and four focus on specific design considerations and a…

  20. Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data

    Science.gov (United States)

    Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.

    2014-12-01

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and constructed with open source software based on industry standards, protocols, and state-of-the-art technology.

  1. Interoperability of Geographic Information: A Communication Process –Based Prototype

    Directory of Open Access Journals (Sweden)

    Jean Brodeur

    2005-04-01

    Full Text Available Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations' specific needs. As a result, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access. Early in the nineties, work on the interoperability of geographic information was undertaken to resolve syntactic, structural, and semantic heterogeneities as well as spatial and temporal heterogeneities, in order to facilitate the sharing and integration of such data. Recently, we proposed a new conceptual framework for the interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology to dynamically qualify the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for the interoperability of geographic information as well as a prototype.

  2. SCSODC: Integrating Ocean Data for Visualization Sharing and Application

    Science.gov (United States)

    Xu, C.; Li, S.; Wang, D.; Xie, Q.

    2014-02-01

    The South China Sea Ocean Data Center (SCSODC) was founded in 2010 in order to improve collecting and managing of ocean data of the South China Sea Institute of Oceanology (SCSIO). The mission of SCSODC is to ensure the long term scientific stewardship of ocean data, information and products - collected through research groups, monitoring stations and observation cruises - and to facilitate the efficient use and distribution to possible users. However, data sharing and applications were limited due to the characteristics of distribution and heterogeneity that made it difficult to integrate the data. To surmount those difficulties, the Data Sharing System has been developed by the SCSODC using the most appropriate information management and information technology. The Data Sharing System uses open standards and tools to promote the capability to integrate ocean data and to interact with other data portals or users and includes a full range of processes such as data discovery, evaluation and access combining C/S and B/S mode. It provides a visualized management interface for the data managers and a transparent and seamless data access and application environment for users. Users are allowed to access data using the client software and to access interactive visualization application interface via a web browser. The architecture, key technologies and functionality of the system are discussed briefly in this paper. It is shown that the system of SCSODC is able to implement web visualization sharing and seamless access to ocean data in a distributed and heterogeneous environment.

  3. DESIGN OF INFORMATION MANAGEMENT SYSTEM OF VERTICALLY INTEGRATED AGRICULTURAL HOLDINGS

    Directory of Open Access Journals (Sweden)

    Александр Витальевич ШМАТКО

    2015-05-01

    Full Text Available The paper deals with an approach to the design and development of information systems for managing and optimizing the organizational structure of vertically integrated agricultural holdings. A review of the problems of building and improving the organizational structure of a vertically integrated agricultural holding is given. A method for constructing a discrete model of the holding's management structure, which minimizes the costs associated with attracting applicants to work, is proposed.

  4. Large epidemic thresholds emerge in heterogeneous networks of heterogeneous nodes

    Science.gov (United States)

    Yang, Hui; Tang, Ming; Gross, Thilo

    2015-08-01

    One of the famous results of network science states that networks with heterogeneous connectivity are more susceptible to epidemic spreading than their more homogeneous counterparts. In particular, in networks of identical nodes it has been shown that network heterogeneity, i.e. a broad degree distribution, can lower the epidemic threshold at which epidemics can invade the system. Network heterogeneity can thus allow diseases with lower transmission probabilities to persist and spread. However, it has been pointed out that networks in which the properties of nodes are intrinsically heterogeneous can be very resilient to disease spreading. Heterogeneity in structure can enhance or diminish the resilience of networks with heterogeneous nodes, depending on the correlations between the topological and intrinsic properties. Here, we consider a plausible scenario where people have intrinsic differences in susceptibility and adapt their social network structure to the presence of the disease. We show that the resilience of networks with heterogeneous connectivity can surpass those of networks with homogeneous connectivity. For epidemiology, this implies that network heterogeneity should not be studied in isolation, it is instead the heterogeneity of infection risk that determines the likelihood of outbreaks.

  6. Integrated care: an Information Model for Patient Safety and Vigilance Reporting Systems.

    Science.gov (United States)

    Rodrigues, Jean-Marie; Schulz, Stefan; Souvignet, Julien

    2015-01-01

    Quality management information systems for safety as a whole or for specific vigilances share the same information types but are not interoperable. An international initiative is developing an integrated information model for patient safety and vigilance reporting to support a global approach to health care quality.

  7. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  8. Flow shop scheduling with heterogeneous workers

    OpenAIRE

    Benavides, Alexander J.; Ritt, Marcus; Miralles Insa, Cristóbal Javier

    2014-01-01

    We propose an extension to the flow shop scheduling problem named Heterogeneous Flow Shop Scheduling Problem (Het-FSSP), where two simultaneous issues have to be resolved: finding the best worker assignment to the workstations, and solving the corresponding scheduling problem. This problem is motivated by Sheltered Work centers for Disabled, whose main objective is the labor integration of persons with disabilities, an important aim not only for these centers but for any company d...

  9. Heterogeneous IP Ecosystem enabling Reuse (HIER)

    Science.gov (United States)

    2017-03-22

    In the HIER project, DARPA also established additional concepts in the formation of the Common Heterogeneous Integration and IP Reuse Strategies (CHIPS) program. Firms would need a major change to their business model to offer hard or soft IP, so the CHIPS program can be a better fit for these firms; DoD-contractor IP pricing remains a related concern.

  10. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    In recent decades, heterogeneous materials are extensively used in various industries such as aerospace, defense, automotive and others due to their desirable specific properties and excellent capability of accumulating damage. Despite their wide use, there are numerous challenges associated with the application of these materials. One of the main challenges is lack of accurate tools to predict the initiation, progression and final failure of these materials under various thermomechanical loading conditions. Although failure is usually treated at the macro and meso-scale level, the initiation and growth of failure is a complex phenomena across multiple scales. The objective of this work is to enable the mechanics of structure genome (MSG) and its companion code SwiftComp to analyze the initial failure (also called static failure), progressive failure, and fatigue failure of heterogeneous materials using micromechanics approach. The initial failure is evaluated at each numerical integration point using pointwise and nonlocal approach for each constituent of the heterogeneous materials. The effects of imperfect interfaces among constituents of heterogeneous materials are also investigated using a linear traction-displacement model. Moreover, the progressive and fatigue damage analyses are conducted using continuum damage mechanics (CDM) approach. The various failure criteria are also applied at a material point to analyze progressive damage in each constituent. The constitutive equation of a damaged material is formulated based on a consistent irreversible thermodynamics approach. The overall tangent modulus of uncoupled elastoplastic damage for negligible back stress effect is derived. The initiation of plasticity and damage in each constituent is evaluated at each numerical integration point using a nonlocal approach. The accumulated plastic strain and anisotropic damage evolution variables are iteratively solved using an incremental algorithm. The damage analyses

  11. Benefits and problems in implementation for integrated medical information system

    International Nuclear Information System (INIS)

    Park, Chang Seo; Kim, Kee Deog; Park, Hyok; Jeong, Ho Gul

    2005-01-01

    Once the decision has been made to adopt an integrated medical information system (IMIS), there are a number of issues to overcome. Users need to be aware of the impact the change will make on end users and be prepared to address issues that arise before they become problems. The purpose of this study is to investigate the benefits and unexpected problems encountered in the implementation of an IMIS and to determine a useful framework for IMIS. The Yonsei University Dental Hospital is steadily constructing an IMIS. The vendor's PACS software, Piview STAR, supports transactions between workstations that are approved by Integrating the Healthcare Enterprise (IHE), with security functions. It is necessary to develop an excellent framework that serves the patient, the health care provider and the information system vendors in an expert, efficient, and cost-effective manner. The problems encountered with IMIS implementation were high initial investments, delay of EMR enforcement, underdevelopment of digital radiographic appliances and software, and insufficient educational training for users. The clinical environment of dental IMIS is somewhat different from the medical situation. The best way to overcome these differences is to establish a gold standard for dental IMIS integration, which estimates the cost payback. The IHE and its technical framework are good for the patient, the health care provider and all information system vendors.

  12. Aquifer heterogeneity characterization with oscillatory pumping: Sensitivity analysis and imaging potential

    Science.gov (United States)

    Cardiff, M.; Bakhos, T.; Kitanidis, P. K.; Barrash, W.

    2013-09-01

    Periodic pumping tests, in which a fluid is extracted during half a period, then reinjected, have been used historically to estimate effective aquifer properties. In this work, we suggest a modified approach to periodic pumping test analysis in which one uses several periodic pumping signals of different frequencies as stimulation, and responses are analyzed through inverse modeling using a "steady-periodic" model formulation. We refer to this strategy as multifrequency oscillatory hydraulic imaging. Oscillating pumping tests have several advantages that have been noted, including no net water extraction during testing and robust signal measurement through signal processing. Through numerical experiments, we demonstrate additional distinct advantages that multifrequency stimulations have, including: (1) drastically reduced computational cost through use of a steady-periodic numerical model and (2) full utilization of the aquifer heterogeneity information provided by responses at different frequencies. We first perform fully transient numerical modeling for heterogeneous aquifers and show that equivalent results are obtained using a faster steady-periodic heterogeneous numerical model of the wave phasor. The sensitivities of observed signal response to aquifer heterogeneities are derived using an adjoint state-based approach, which shows that different frequency stimulations provide complementary information. Finally, we present an example 2-D application in which sinusoidal signals at multiple frequencies are used as a data source and are inverted to obtain estimates of aquifer heterogeneity. These analyses show the different heterogeneity information that can be obtained from different stimulation frequencies, and that data from several sinusoidal pumping tests can be rapidly inverted using the steady-periodic framework.
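
    The core signal-processing step, extracting the steady-periodic amplitude and phase of the head response at each stimulation frequency, can be sketched as below; the sampling rate, record length, frequencies and response values are invented for illustration and are not taken from the study.

```python
import numpy as np

def phasor_at(signal, fs, freq):
    """Amplitude and phase of `signal` at one stimulation frequency, obtained
    by projecting onto a complex exponential at that frequency."""
    t = np.arange(signal.size) / fs
    z = 2.0 * np.mean(signal * np.exp(-2j * np.pi * freq * t))
    return np.abs(z), np.angle(z)

# Synthetic drawdown record: responses to two oscillating pumping signals at
# different frequencies plus noise (all numbers are illustrative).
fs = 10.0                      # samples per second
t = np.arange(0, 600, 1 / fs)  # ten-minute record
freqs = [0.02, 0.10]           # stimulation frequencies, Hz
true_amp = [0.30, 0.08]        # m, more attenuation at the higher frequency
true_phase = [-0.6, -1.4]      # rad, larger lag at the higher frequency
head = sum(a * np.cos(2 * np.pi * f * t + p)
           for a, f, p in zip(true_amp, freqs, true_phase))
head += 0.02 * np.random.default_rng(0).normal(size=t.size)

for f in freqs:
    amp, ph = phasor_at(head, fs, f)
    print(f"f = {f:5.2f} Hz: amplitude ~ {amp:.3f} m, phase ~ {ph:+.2f} rad")
```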

  13. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity.

    Science.gov (United States)

    Zhong, Qing; Rüschoff, Jan H; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J; Rupp, Niels J; Fankhauser, Christian; Buhmann, Joachim M; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C; Jochum, Wolfram; Wild, Peter J

    2016-04-07

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility.

  14. Integrated management of information inside maintenance processes. From the building registry to BIM systems

    Directory of Open Access Journals (Sweden)

    Cinzia Talamo

    2014-10-01

    Full Text Available The paper presents objectives, methods and results of two research projects dealing with the improvement of integrated information management within maintenance processes. Focusing on the information needs of the last phases of the building process, the two projects outline approaches that characterize a path of progressive improvement of integration strategies: from a building registry, unique for the whole construction process, to integrated management of the building process with the support of BIM systems.

  15. Medical Device Integration Model Based on the Internet of Things

    Science.gov (United States)

    Hao, Aiyu; Wang, Ling

    2015-01-01

    At present, hospitals in our country have basically established HIS systems, which manage registration, treatment, and charging, among many other functions, for patients. During treatment, patients repeatedly use medical devices that acquire all sorts of inspection data. Currently, the output data of these devices are often entered manually into the information system, which is error-prone and can cause mismatches between inspection reports and patients. For some small hospitals whose information infrastructure is still relatively weak, the information generated by the devices is still presented in the form of paper reports; when doctors or patients want to access the data again at a later time, they can only consult the paper files. Data integration between medical devices has long been a difficult problem for medical information systems, because the data from medical devices lack mandatory, unified global standards and the devices are markedly heterogeneous. In order to protect their own interests, manufacturers use proprietary protocols, which leaves medical devices as the "lonely islands" of the hospital information system. Moreover, unfocused use of the data leads to failure to achieve a reasonable distribution of medical resources. With the deepening of IT construction in hospitals, medical information systems are bound to develop towards mobile applications, intelligent analysis, and interconnection and interworking, on the premise that an effective medical device integration (MDI) technology exists. To this end, this paper presents an MDI model based on the Internet of Things (IoT). Through abstract classification, this model is able to extract the common characteristics of the devices, resolve the heterogeneous differences between them, and employ a unified protocol to integrate data between devices. Using IoT technology, it realizes an interconnection network of devices and conducts associate matching
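
    One common way to realize the "abstract classification" idea described above is an adapter layer that converts each device's native output into a unified record, as sketched below; the device payload formats, field names and classes are purely illustrative and are not the paper's protocol.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Measurement:
    """Unified record that every device adapter must produce."""
    patient_id: str
    device_type: str
    quantity: str
    value: float
    unit: str
    timestamp: datetime

class DeviceAdapter(ABC):
    """Common interface hiding each device's proprietary output format."""
    @abstractmethod
    def to_measurement(self, raw: dict) -> Measurement: ...

class BloodPressureAdapter(DeviceAdapter):
    # Example native payload: {"pid": "P001", "sys": 128, "dia": 84}
    # (only the systolic reading is mapped here, for brevity)
    def to_measurement(self, raw):
        return Measurement(raw["pid"], "sphygmomanometer", "systolic_bp",
                           float(raw["sys"]), "mmHg", datetime.now(timezone.utc))

class GlucometerAdapter(DeviceAdapter):
    # Example native payload: {"patient": "P001", "glu_mmol": 5.4}
    def to_measurement(self, raw):
        return Measurement(raw["patient"], "glucometer", "blood_glucose",
                           float(raw["glu_mmol"]), "mmol/L", datetime.now(timezone.utc))

# The integration layer only sees Measurement objects, never device formats.
adapters = {"bp": BloodPressureAdapter(), "glu": GlucometerAdapter()}
payloads = [("bp", {"pid": "P001", "sys": 128, "dia": 84}),
            ("glu", {"patient": "P001", "glu_mmol": 5.4})]
for kind, raw in payloads:
    print(adapters[kind].to_measurement(raw))
```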

  16. Gulf of Mexico Integrated Science - Tampa Bay Study - Data Information Management System (DIMS)

    Science.gov (United States)

    Johnston, James

    2004-01-01

    The Tampa Bay Integrated Science Study is an effort by the U.S. Geological Survey (USGS) that combines the expertise of federal, state and local partners to address some of the most pressing ecological problems of the Tampa Bay estuary. This project serves as a template for the application of integrated research projects in other estuaries in the Gulf of Mexico. Efficient information and data distribution for the Tampa Bay Study has required the development of a Data Information Management System (DIMS). This information system is being used as an outreach management tool, providing information to scientists, decision makers and the public on the coastal resources of the Gulf of Mexico.

  17. Heterogeneous propellant internal ballistics: criticism and regeneration

    Science.gov (United States)

    Glick, R. L.

    2011-10-01

    Although heterogeneous propellant and its innately nondeterministic, chemically discrete morphology dominate applications, ballistic-characterization measures (deterministic time-mean burning rate and acoustic admittance) contain no explicit nondeterministic information and therefore presuppose homogeneous propellant with a smooth, uniformly regressing burning surface: inadequate boundary conditions for heterogeneous-propellant grained applications. The past age overcame this dichotomy with one-dimensional (1D) models and empirical knowledge from numerous, adequately supported motor developments and supplementary experiments. However, current cost and risk constraints inhibit this approach. Moreover, its fundamental-science approach is more sensitive to incomplete boundary condition information (garbage in still equals garbage out) and more is expected of it. This work critiques this situation and sketches a path forward based on enhanced ballistic and motor characterizations in the workplace and approximate model and apparatus developments mentored by CSAR DNS capabilities (or equivalent).

  18. Heterogeneity of obsessive-compulsive disorder: a literature review.

    Science.gov (United States)

    Lochner, Christine; Stein, Dan J

    2003-01-01

    Significant advances have been made in characterizing the phenomenology and psychobiology of obsessive-compulsive disorder (OCD) in recent years. In many ways, such advances suggest a conceptualization of OCD as a relatively homogeneous neuropsychiatric entity, underpinned by particular mechanisms that manifest in universal symptoms. Nevertheless, some data have pointed to the heterogeneity of this disorder. A computerized literature search (MEDLINE: 1964-2001) was used to collect studies addressing the heterogeneity of OCD. In addition, reviews of the phenomenology, psychobiology, family studies, and treatment of OCD were examined in an attempt to collate data addressing this issue. There is a growing consensus that some subtypes of OCD are valid and provide a useful means of integrating data on its symptomatology, neurobiology, and treatment response; for example, OCD with comorbid tics is characterized by earlier onset, a particular range of OCD symptoms, and worse response to selective serotonin reuptake inhibitors. The heterogeneity of OCD has important clinical and research implications.

  19. Integration of genomic information with biological networks using Cytoscape.

    Science.gov (United States)

    Bauer-Mehren, Anna

    2013-01-01

    Cytoscape is an open-source software tool for visualizing, analyzing, and modeling biological networks. This chapter explains how to use Cytoscape to analyze the functional effect of sequence variations in the context of biological networks such as protein-protein interaction networks and signaling pathways. The chapter is divided into five parts: (1) obtaining information about the functional effect of sequence variation in a Cytoscape-readable format, (2) loading and displaying different types of biological networks in Cytoscape, (3) integrating the genomic information (SNPs and mutations) with the biological networks, (4) analyzing the effect of the genomic perturbation on the network structure using Cytoscape's built-in functions, and (5) briefly outlining how the integrated data can help in building mathematical network models for analyzing the effect of the sequence variation on the dynamics of the biological system. Each part is illustrated by step-by-step instructions on an example use case and visualized by many screenshots and figures.

  20. MaNIDA: Integration of marine expedition information, data and publications: Data Portal of German Marine Research

    Science.gov (United States)

    Koppe, Roland; Scientific MaNIDA-Team

    2013-04-01

    The Marine Network for Integrated Data Access (MaNIDA) aims to build a sustainable e-infrastructure to support discovery and re-use of marine data from distinct data providers in Germany (see related abstracts in session ESSI 1.2). In order to provide users integrated access and retrieval of expedition or cruise metadata, data, services and publications as well as relationships among the various objects, we are developing (web) applications based on state of the art technologies: the Data Portal of German Marine Research. Since the German network of distributed content providers have distinct objectives and mandates for storing digital objects (e.g. long-term data preservation, near real time data, publication repositories), we have to cope with heterogeneous metadata in terms of syntax and semantic, data types and formats as well as access solutions. We have defined a set of core metadata elements which are common to our content providers and therefore useful for discovery and building relationships among objects. Existing catalogues for various types of vocabularies are being used to assure the mapping to community-wide used terms. We distinguish between expedition metadata and continuously harvestable metadata objects from distinct data providers. • Existing expedition metadata from distinct sources is integrated and validated in order to create an expedition metadata catalogue which is used as authoritative source for expedition-related content. The web application allows browsing by e.g. research vessel and date, exploring expeditions and research gaps by tracklines and viewing expedition details (begin/end, ports, platforms, chief scientists, events, etc.). Also expedition-related objects from harvesting are dynamically associated with expedition information and presented to the user. Hence we will provide web services to detailed expedition information. • Other harvestable content is separated into four categories: archived data and data products, near

  1. Stakeholder engagement: a key component of integrating genomic information into electronic health records.

    Science.gov (United States)

    Hartzler, Andrea; McCarty, Catherine A; Rasmussen, Luke V; Williams, Marc S; Brilliant, Murray; Bowton, Erica A; Clayton, Ellen Wright; Faucett, William A; Ferryman, Kadija; Field, Julie R; Fullerton, Stephanie M; Horowitz, Carol R; Koenig, Barbara A; McCormick, Jennifer B; Ralston, James D; Sanderson, Saskia C; Smith, Maureen E; Trinidad, Susan Brown

    2013-10-01

    Integrating genomic information into clinical care and the electronic health record can facilitate personalized medicine through genetically guided clinical decision support. Stakeholder involvement is critical to the success of these implementation efforts. Prior work on implementation of clinical information systems provides broad guidance to inform effective engagement strategies. We add to this evidence-based recommendations that are specific to issues at the intersection of genomics and the electronic health record. We describe stakeholder engagement strategies employed by the Electronic Medical Records and Genomics Network, a national consortium of US research institutions funded by the National Human Genome Research Institute to develop, disseminate, and apply approaches that combine genomic and electronic health record data. Through select examples drawn from sites of the Electronic Medical Records and Genomics Network, we illustrate a continuum of engagement strategies to inform genomic integration into commercial and homegrown electronic health records across a range of health-care settings. We frame engagement as activities to consult, involve, and partner with key stakeholder groups throughout specific phases of health information technology implementation. Our aim is to provide insights into engagement strategies to guide genomic integration based on our unique network experiences and lessons learned within the broader context of implementation research in biomedical informatics. On the basis of our collective experience, we describe key stakeholder practices, challenges, and considerations for successful genomic integration to support personalized medicine.

  2. SoFIA: a data integration framework for annotating high-throughput datasets.

    Science.gov (United States)

    Childs, Liam Harold; Mamlouk, Soulafa; Brandt, Jörgen; Sers, Christine; Leser, Ulf

    2016-09-01

    Integrating heterogeneous datasets from several sources is a common bioinformatics task that often requires implementing a complex workflow intermixing database access, data filtering, format conversion, identifier mapping and other diverse operations. Data integration is especially important when annotating next generation sequencing data, where a multitude of diverse tools and heterogeneous databases can be used to provide a large variety of annotation for genomic locations, such as single nucleotide variants or genes. Each tool and data source is potentially useful for a given project and often more than one is used in parallel for the same purpose. However, software that always produces all available data is difficult to maintain and quickly leads to an excess of data, creating an information overload rather than the desired goal-oriented and integrated result. We present SoFIA, a framework for workflow-driven data integration with a focus on genomic annotation. SoFIA conceptualizes workflow templates as comprehensive workflows that cover as many data integration operations as possible in a given domain. However, these templates are not intended to be executed as a whole; instead, when given an integration task consisting of a set of input data and a set of desired output data, SoFIA derives a minimal workflow that completes the task. These workflows are typically fast and create exactly the information a user wants without requiring them to do any implementation work. Using a comprehensive genome annotation template, we highlight the flexibility, extensibility and power of the framework using real-life case studies. https://github.com/childsish/sofia/releases/latest under the GNU General Public License liam.childs@hu-berlin.de Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
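
    The template-to-minimal-workflow idea can be sketched as a backward pass over annotation steps; the step names, inputs and outputs below are invented and do not reflect SoFIA's actual API:

      # Sketch: derive a minimal workflow from a template, given available inputs
      # and requested outputs. Steps are (name, inputs, outputs); all invented.
      TEMPLATE = [
          ("map_ids",         {"variant"},             {"gene_id"}),
          ("fetch_gene_name", {"gene_id"},             {"gene_name"}),
          ("predict_effect",  {"variant", "gene_id"},  {"effect"}),
      ]

      def minimal_workflow(available, requested):
          """Select only the template steps needed to produce the requested outputs."""
          needed, selected = set(requested), []
          for name, inputs, outputs in reversed(TEMPLATE):
              if outputs & needed:
                  selected.append(name)
                  needed |= inputs          # these inputs must now be produced or given
                  needed -= outputs
          missing = needed - set(available)
          if missing:
              raise ValueError(f"cannot satisfy request, missing: {missing}")
          return list(reversed(selected))

      print(minimal_workflow(available={"variant"}, requested={"gene_name"}))
      # -> ['map_ids', 'fetch_gene_name']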

  3. Integrate offsites management with information systems

    Energy Technology Data Exchange (ETDEWEB)

    Valleur, M. (TECHNIP, Paris (France))

    1993-11-01

    Computerized offsites management systems in oil refineries offer a unique opportunity to integrate advanced technology into a coherent refinery information system that contributes to benefits-driven optimal operations: from long-term, multirefinery linear programming (LP) models to sequential control of transfer lineups in the tank farm. There are strong incentives to automate and optimize the offsites operations, and benefits can be quantified to justify properly sized projects. The paper discusses the following: business opportunities, oil movement and advanced technology, project scoping and sizing, review of functional requirements, transfer automation, blending optimal control, on-line analyzers, oil movement and scheduling, organizational issues, and investment and benefits analysis.

  4. Dutch virtual integration of healthcare information.

    Science.gov (United States)

    de Graaf, J C; Vlug, A E; van Boven, G J

    2007-01-01

    As information technology creates opportunities for cooperation that crosses the boundaries between healthcare institutions, it will become an integral part of the Dutch healthcare system. Along with many involved organizations in healthcare, the National IT Institute for Healthcare in the Netherlands (NICTIZ) is working on the realization of a national IT infrastructure for healthcare and a national electronic patient record (EPR). An underlying national architecture is designed to realize the Dutch EPR virtually, rather than in a national database or on a patient's smartcard. The required secure infrastructure provides generic functions for healthcare applications: patient identification, and authentication and authorization of healthcare professionals. The first national applications in the EPR program using a national index of where patient data is stored are the electronic medication record and the electronic record for after-hours GP services. The rollout of the electronic medication record and the electronic record for after-hours GP services started in 2007. To guarantee progress of electronic data exchange in healthcare in the Netherlands we have primarily opted for two healthcare applications: the electronic medication record and the electronic record for after-hours GP services. The use of a national switch-point containing the registry of where to find what information guarantees that the professional receives the most recent information and avoids large databases holding downloaded data. Proper authorization and authentication, as well as tracing by the national switch-point, also ensure a secure environment for the communication of delicate information.
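
    A minimal sketch of the switch-point idea, an index that records where patient data is stored rather than the data itself; the identifiers, record types and endpoints are invented:

      # Sketch: a national switch-point holding only a reference index
      # ("which provider holds what record type for which patient"), no clinical data.
      # Patient IDs and provider endpoints below are invented for illustration.
      registry = {
          ("patient-123", "medication_record"):   "https://pharmacy-a.example/api",
          ("patient-123", "after_hours_gp_note"): "https://gp-post-b.example/api",
      }

      def locate(patient_id: str, record_type: str, requester_authorized: bool):
          """Return the source system for a record, enforcing authorization first."""
          if not requester_authorized:
              raise PermissionError("requester not authorized for this record type")
          return registry.get((patient_id, record_type))

      print(locate("patient-123", "medication_record", requester_authorized=True))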

  5. Mapping soil heterogeneity using RapidEye satellite images

    Science.gov (United States)

    Piccard, Isabelle; Eerens, Herman; Dong, Qinghan; Gobin, Anne; Goffart, Jean-Pierre; Curnel, Yannick; Planchon, Viviane

    2016-04-01

    Within the framework of BELCAM, a project funded by the Belgian Science Policy Office (BELSPO), researchers from UCL, ULg, CRA-W and VITO aim to set up a collaborative system to develop and deliver relevant information for agricultural monitoring in Belgium. The main objective is to develop remote sensing methods and processing chains able to ingest crowd-sourced data, provided by farmers or associated partners, and to deliver in return relevant and up-to-date information for crop monitoring at the field and district level based on Sentinel-1 and -2 satellite imagery. One of the developments within BELCAM concerns an automatic procedure to detect soil heterogeneity within a parcel using high-resolution optical images. Such heterogeneity maps can be used to adjust farming practices according to the detected heterogeneity. This heterogeneity may for instance be caused by differences in mineral composition of the soil, organic matter content, soil moisture or soil texture. Local differences in plant growth may be indicative of differences in soil characteristics. As such, vegetation indices derived from remote sensing may be used to reveal soil heterogeneity. VITO started to delineate homogeneous zones within parcels by analyzing a series of RapidEye images acquired in 2015 (as a precursor for Sentinel-2). Both unsupervised classification (ISODATA, K-means) and segmentation techniques were tested. Heterogeneity maps were generated from images acquired at different moments during the season (13 May, 30 June, 17 July, 31 August, 11 September and 1 November 2015). Tests were performed using blue, green, red, red edge and NIR reflectances separately and using derived indices such as NDVI, fAPAR, CIrededge and NDRE2. The results for selected winter wheat, maize and potato fields were evaluated together with experts from the collaborating agricultural research centers. For a few fields, UAV images and/or yield measurements were available for comparison.
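
    A minimal sketch of the zone-delineation idea: unsupervised clustering (K-means) of per-pixel NDVI values within one parcel; the NDVI array is synthetic and scikit-learn is assumed to be available:

      # Sketch: cluster per-pixel NDVI values of one parcel into homogeneous zones.
      # NDVI values here are synthetic; in practice they would come from red and
      # NIR reflectances: ndvi = (nir - red) / (nir + red).
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      ndvi = np.clip(rng.normal(loc=0.6, scale=0.1, size=(50, 50)), 0.0, 1.0)

      kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
      zones = kmeans.fit_predict(ndvi.reshape(-1, 1)).reshape(ndvi.shape)

      # 'zones' is now a per-pixel map of heterogeneity classes within the parcel
      for label in range(3):
          print(f"zone {label}: mean NDVI {ndvi[zones == label].mean():.2f}")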

  6. Managing heterogeneous knowledge: A Theory of External Knowledge Integration

    NARCIS (Netherlands)

    Kraaijenbrink, Jeroen; Wijnhoven, Alphonsus B.J.M.

    2008-01-01

    Knowledge integration has been theorised at the levels of organisations and inter-organisational dyads. However, no theory exists yet of the integration of knowledge from an organisation's environment. This paper addresses this void in the literature by presenting a theory of external knowledge

  7. CellBase, a comprehensive collection of RESTful web services for retrieving relevant biological information from heterogeneous sources.

    Science.gov (United States)

    Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio

    2012-07-01

    During the past years, advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today, there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out over many small databases; frequently there are different standards among repositories, and some databases are no longer supported or contain overly specific and unconnected information. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments or to access and query this information in a programmatic way. CellBase provides a solution to the growing necessity of integration by easing the access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
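
    A hedged sketch of how such a RESTful service is typically queried from a script; the host and endpoint path below are placeholders illustrating the access pattern, not verified CellBase endpoints:

      # Sketch: programmatic access to a RESTful annotation service.
      # The host and path are placeholders; consult the CellBase documentation
      # for the actual endpoints before use.
      import requests

      BASE_URL = "https://cellbase.example.org/webservices/rest"  # placeholder host

      def fetch_gene_info(gene_symbol: str) -> dict:
          """Query a gene resource and return the decoded JSON response."""
          url = f"{BASE_URL}/v4/hsapiens/feature/gene/{gene_symbol}/info"
          response = requests.get(url, timeout=30)
          response.raise_for_status()
          return response.json()

      # Example (requires a reachable server):
      # info = fetch_gene_info("BRCA2")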

  8. Experimental methods and modeling techniques for description of cell population heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita; Nierychlo, M.; Lundin, L.

    2011-01-01

    With the continuous development, in the last decades, of analytical techniques providing complex information at single cell level, the study of cell heterogeneity has been the focus of several research projects within analytical biotechnology. Nonetheless, the complex interplay between environmen...

  9. Isotopic Recorders of Pollution in Heterogeneous Urban Areas

    Science.gov (United States)

    Pataki, D. E.; Cobley, L.; Smith, R. M.; Ehleringer, J. R.; Chritz, K.

    2017-12-01

    A significant difficulty in quantifying urban pollution lies in the extreme spatial and temporal heterogeneity of cities. Dense sources of both point and non-point source pollution as well as the dynamic role of human activities, which vary over very short time scales and small spatial scales, complicate efforts to establish long-term urban monitoring networks that are relevant at neighborhood, municipal, and regional scales. Fortunately, the natural abundance of isotopes of carbon, nitrogen, and other elements provides a wealth of information about the sources and fate of urban atmospheric pollution. In particular, soils and plant material integrate pollution sources and cycling over space and time, and have the potential to provide long-term records of pollution dynamics that extend back before atmospheric monitoring data are available. Similarly, sampling organic material at high spatial resolution can provide "isoscapes" that shed light on the spatial heterogeneity of pollutants in different urban parcels and neighborhoods, along roads of varying traffic density, and across neighborhoods of varying affluence and sociodemographic composition. We have compiled numerous datasets of the isotopic composition of urban organic matter that illustrate the potential for isotopic monitoring of urban areas as a means of understanding hot spots and hot moments in urban atmospheric biogeochemistry. Findings to date already reveal the critical role of affluence, economic activity, demographic change, and land management practices in influencing urban pollution sources and sinks, and suggest an important role of stable isotope and radioisotope measurements in urban atmospheric and biogeochemical monitoring.

  10. Integration of Hospital Information and Clinical Decision Support Systems to Enable the Reuse of Electronic Health Record Data.

    Science.gov (United States)

    Kopanitsa, Georgy

    2017-05-18

    The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. High heterogeneity of the existing legacy data formats has become the main barrier to the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying of medical data to allow its reuse. In this paper, we focus on the interoperability issues of hospital information system (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al., where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We built an integration tool to enable CDSSs to collect data from various institutions without a need for modifications in the implementation. The approach implies development of a conceptual level as a set of archetypes representing concepts required by a CDSS. Treatment case data from the Regional Clinical Hospital in Tomsk, Russia was extracted, transformed and loaded into the archetype database of a clinical decision support system. Normalization of test records was performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR-compliant data. After the transformation, archetype data instances were loaded into the CDSS archetype-based data storage. Measured times showed acceptable performance for the extraction stage, with a mean of 17.428 s per year (3436 case records). The transformation times were also acceptable, with 136.954 s per year (0.039 s per instance). The accuracy evaluation showed the correctness and applicability of the method for a wide range of HISes. These operations were performed without interrupting the HIS workflow, so that service provision to users was not disturbed. The project results have proven that
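
    A minimal sketch of the normalization step, applying declarative mapping rules to a legacy record to build a nested, archetype-like instance; the field names and rules are invented and are not the openEHR archetypes used in the study:

      # Sketch: transform a legacy EHR record into a standardized (archetype-like)
      # structure using declarative mapping rules. All field names are invented.
      LEGACY_RECORD = {"sys_bp": "142", "dia_bp": "88", "bp_unit": "mmHg"}

      MAPPING_RULES = [
          # (target path, source field, conversion)
          ("blood_pressure.systolic.magnitude",  "sys_bp",  float),
          ("blood_pressure.diastolic.magnitude", "dia_bp",  float),
          ("blood_pressure.systolic.unit",       "bp_unit", str),
          ("blood_pressure.diastolic.unit",      "bp_unit", str),
      ]

      def transform(record: dict) -> dict:
          """Apply mapping rules, building a nested standardized instance."""
          instance: dict = {}
          for target, source, convert in MAPPING_RULES:
              node = instance
              *parents, leaf = target.split(".")
              for key in parents:
                  node = node.setdefault(key, {})
              node[leaf] = convert(record[source])
          return instance

      print(transform(LEGACY_RECORD))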

  11. Multi-fields' coordination information integrated platform for nuclear power plant operation preparation

    International Nuclear Information System (INIS)

    Yuan Chang; Li Yong; Ye Zhiqiang

    2011-01-01

    To realize coordination of work and information sharing across multiple fields, the business architecture, functional flow and application architecture of the nuclear power plant operation preparation information integrated platform are designed by applying the Enterprise Architecture (EA) method, enabling information sharing and coordination among multiple fields. (authors)

  12. Very large scale heterogeneous integration (VLSHI) and wafer-level vacuum packaging for infrared bolometer focal plane arrays

    Science.gov (United States)

    Forsberg, Fredrik; Roxhed, Niclas; Fischer, Andreas C.; Samel, Björn; Ericsson, Per; Hoivik, Nils; Lapadatu, Adriana; Bring, Martin; Kittilsland, Gjermund; Stemme, Göran; Niklaus, Frank

    2013-09-01

    Imaging in the long wavelength infrared (LWIR) range from 8 to 14 μm is an extremely useful tool for non-contact measurement and imaging of temperature in many industrial, automotive and security applications. However, the cost of the infrared (IR) imaging components has to be significantly reduced to make IR imaging a viable technology for many cost-sensitive applications. This paper demonstrates new and improved fabrication and packaging technologies for next-generation IR imaging detectors based on uncooled IR bolometer focal plane arrays. The proposed technologies include very large scale heterogeneous integration for combining high-performance SiGe quantum-well bolometers with electronic integrated read-out circuits, and CMOS compatible wafer-level vacuum packaging. The fabrication and characterization of bolometers with a pitch of 25 μm × 25 μm that are arranged on read-out wafers in arrays with 320 × 240 pixels are presented. The bolometers contain a multi-layer quantum well SiGe thermistor with a temperature coefficient of resistance of -3.0%/K. The proposed CMOS compatible wafer-level vacuum packaging technology uses Cu-Sn solid-liquid interdiffusion (SLID) bonding. The presented technologies are suitable for implementation in cost-efficient fabless business models with the potential to bring about the cost reduction needed to enable low-cost IR imaging products for industrial, security and automotive applications.

  13. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    Science.gov (United States)

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.
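
    The cellularity-based adjustment can be illustrated with the generic two-component purity correction for copy number; this is a standard relation offered for intuition only, not necessarily the exact algorithm introduced in the paper:

      # Sketch: adjust an observed (bulk) copy number for tumor cellularity,
      # assuming a two-component mixture of tumor cells and diploid normal cells.
      # This is a generic purity correction shown only to illustrate the idea.
      def corrected_copy_number(observed_cn: float, cellularity: float) -> float:
          """Infer the tumor-cell copy number from a bulk measurement.

          observed_cn = cellularity * tumor_cn + (1 - cellularity) * 2
          """
          if not 0.0 < cellularity <= 1.0:
              raise ValueError("cellularity must be in (0, 1]")
          return (observed_cn - 2.0 * (1.0 - cellularity)) / cellularity

      # A bulk value of 2.6 in a sample that is 60% tumor implies ~3 copies in tumor cells.
      print(round(corrected_copy_number(2.6, 0.6), 2))   # 3.0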

  14. Consistent data models and security standards for power system control through their standard compliant integration via ontologies; Einheitliche Datenmodelle und Sicherheitsstandards in der Netzleittechnik durch ihre standardkonforme Integration mittels Ontologien

    Energy Technology Data Exchange (ETDEWEB)

    Uslar, Mathias; Beenken, Petra; Beer, Sebastian [OFFIS, Oldenburg (Germany)

    2009-07-01

    The ongoing integration of distributed energy resources into the existing power grid has led to both growing communication costs and an increased need for interoperability between the involved actors. In this context, standardized and ontology-based data models help to reduce integration costs in heterogeneous system landscapes. Using ontology-based security profiles, such models can be extended with meta-data containing information about security measures for energy-related data in need of protection. By this approach, we achieve both a unified data model and a unified security level. (orig.)

  15. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  16. Information Fusion Issues in the UK Environmental Science Community

    Science.gov (United States)

    Giles, J. R.

    2010-12-01

    The Earth is a complex, interacting system which cannot be neatly divided by discipline boundaries. To gain a holistic understanding of even a component of an Earth system requires researchers to draw information from multiple disciplines and integrate it to develop a broader understanding. But the barriers to achieving this are formidable. Research funders attempting to encourage the integration of information across disciplines need to take into account cultural issues, the impact of intrusion of projects on existing information systems, ontologies and semantics, scale issues, heterogeneity, and the uncertainties associated with combining information from diverse sources. Culture - There is a cultural dualism in the environmental sciences where information sharing is both rewarded and discouraged. Researchers who share information both gain new opportunities and risk reducing their chances of being first author in a high-impact journal. The culture of the environmental science community has to be managed to ensure that information fusion activities are encouraged. Intrusion - Existing information systems have an inertia of their own because of the intellectual and financial capital invested within them. Information fusion activities must recognise and seek to minimise the potential impact of their projects on existing systems. Low-intrusion information fusion systems such as OGC web services and the OpenMI Standard are to be preferred to wholesale replacement of existing systems. Ontology and Semantics - Linking information across disciplines requires a clear understanding of the concepts deployed in the vocabulary used to describe them. Such work is a critical first step to creating routine information fusion. It is essential that national bodies, such as geological survey organisations, document and publish their ontologies, semantics, etc. Scale - Environmental processes operate at scales ranging from microns to the scale of the Solar System and

  17. Information security architecture an integrated approach to security in the organization

    CERN Document Server

    Killmeyer, Jan

    2000-01-01

    An information security architecture is made up of several components. Each component in the architecture focuses on establishing acceptable levels of control. These controls are then applied to the operating environment of an organization. Functionally, information security architecture combines technical, practical, and cost-effective solutions to provide an adequate and appropriate level of security.Information Security Architecture: An Integrated Approach to Security in the Organization details the five key components of an information security architecture. It provides C-level executives

  18. Ensuring the integrity of information resources based on methods of two-symbol structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

    Full Text Available Methods are developed for estimating the noise immunity of structural code constructions and for correcting their distortions during data communication in information and communication systems and networks, with the aim of ensuring the integrity of the information resource.

  19. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    Science.gov (United States)

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.

  20. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types and required data manipulation capabilities, as well as operational requirements, resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially providing the necessary flexibility for different data types and extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  1. Towards integrating control and information theories from information-theoretic measures to control performance limitations

    CERN Document Server

    Fang, Song; Ishii, Hideaki

    2017-01-01

    This book investigates the performance limitation issues in networked feedback systems. The fact that networked feedback systems consist of control and communication devices and systems calls for the integration of control theory and information theory. The primary contributions of this book lie in two aspects: the newly-proposed information-theoretic measures and the newly-discovered control performance limitations. We first propose a number of information notions to facilitate the analysis. Using those notions, classes of performance limitations of networked feedback systems, as well as state estimation systems, are then investigated. In general, the book presents a unique, cohesive treatment of performance limitation issues of networked feedback systems via an information-theoretic approach. This book is believed to be the first to treat the aforementioned subjects systematically and in a unified manner, offering a unique perspective differing from existing books.

  2. AERIS: An Integrated Domain Information System for Aerospace Science and Technology

    Science.gov (United States)

    Hatua, Sudip Ranjan; Madalli, Devika P.

    2011-01-01

    Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…

  3. An Integrated Information System for Supporting Quality Management Tasks

    Science.gov (United States)

    Beyer, N.; Helmreich, W.

    2004-08-01

    In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database

  4. Integrated data analysis of fusion diagnostics by means of the Bayesian probability theory

    International Nuclear Information System (INIS)

    Fischer, R.; Dinklage, A.

    2004-01-01

    Integrated data analysis (IDA) of fusion diagnostics is the combination of heterogeneous diagnostics to obtain validated physical results. Benefits from the integrated approach result from a systematic use of interdependencies; in that sense IDA optimizes the extraction of information from sets of different data. For that purpose IDA requires a systematic and formalized error analysis of all (statistical and systematic) uncertainties involved in each diagnostic. Bayesian probability theory allows for a systematic combination of all information entering the diagnostic model by considering all uncertainties of the measured data, the calibration measurements, and the physical model. Prior physics knowledge on model parameters can be included. Handling of systematic errors is provided. A central goal of the integration of redundant or complementary diagnostics is to provide information to resolve inconsistencies by exploiting interdependencies. A comparable analysis of sets of diagnostics (meta-diagnostics) is performed by combining statistical and systematical uncertainties with model parameters and model uncertainties. Diagnostics improvement and experimental optimization and design of meta-diagnostics will be discussed
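
    As a toy illustration of the integrated approach, consider two diagnostics measuring the same plasma quantity with independent Gaussian uncertainties; with a flat prior the Bayesian posterior reduces to inverse-variance weighting (the numbers below are invented):

      # Sketch: Bayesian combination of two independent measurements of the same
      # quantity with Gaussian errors; values are invented for illustration.
      def combine(m1, s1, m2, s2):
          """Posterior mean and standard deviation for a flat prior."""
          w1, w2 = 1.0 / s1**2, 1.0 / s2**2
          mean = (w1 * m1 + w2 * m2) / (w1 + w2)
          sigma = (w1 + w2) ** -0.5
          return mean, sigma

      # e.g. electron temperature from two diagnostics: 2.0 +/- 0.3 and 2.4 +/- 0.2 keV
      mean, sigma = combine(2.0, 0.3, 2.4, 0.2)
      print(f"combined: {mean:.2f} +/- {sigma:.2f} keV")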

  5. An information integration system for structured documents, Web, and databases

    OpenAIRE

    Morishima, Atsuyuki

    1998-01-01

    Rapid advance in computer network technology has changed the style of computer utilization. Distributed computing resources over world-wide computer networks are available from our local computers. They include powerful computers and a variety of information sources. This change is raising more advanced requirements. Integration of distributed information sources is one of such requirements. In addition to conventional databases, structured documents have been widely used, and have increasing...

  6. MiSTIC, an integrated platform for the analysis of heterogeneity in large tumour transcriptome datasets.

    Science.gov (United States)

    Lemieux, Sebastien; Sargeant, Tobias; Laperrière, David; Ismail, Houssam; Boucher, Geneviève; Rozendaal, Marieke; Lavallée, Vincent-Philippe; Ashton-Beaucage, Dariel; Wilhelm, Brian; Hébert, Josée; Hilton, Douglas J; Mader, Sylvie; Sauvageau, Guy

    2017-07-27

    Genome-wide transcriptome profiling has enabled non-supervised classification of tumours, revealing different sub-groups characterized by specific gene expression features. However, the biological significance of these subtypes remains for the most part unclear. We describe herein an interactive platform, Minimum Spanning Trees Inferred Clustering (MiSTIC), that integrates the direct visualization and comparison of the gene correlation structure between datasets, the analysis of the molecular causes underlying co-variations in gene expression in cancer samples, and the clinical annotation of tumour sets defined by the combined expression of selected biomarkers. We have used MiSTIC to highlight the roles of specific transcription factors in breast cancer subtype specification, to compare the aspects of tumour heterogeneity targeted by different prognostic signatures, and to highlight biomarker interactions in AML. A version of MiSTIC preloaded with datasets described herein can be accessed through a public web server (http://mistic.iric.ca); in addition, the MiSTIC software package can be obtained (github.com/iric-soft/MiSTIC) for local use with personalized datasets. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
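
    A minimal sketch of the minimum-spanning-tree step over a gene-gene correlation structure, using scipy on a synthetic expression matrix; this illustrates the general idea only, not MiSTIC's actual implementation:

      # Sketch: build a minimum spanning tree over gene-gene correlation distances,
      # the kind of structure MiSTIC visualizes. Expression values are synthetic.
      import numpy as np
      from scipy.sparse.csgraph import minimum_spanning_tree

      rng = np.random.default_rng(1)
      expression = rng.normal(size=(8, 100))          # 8 genes x 100 samples

      corr = np.corrcoef(expression)                  # gene-gene correlation matrix
      distance = 1.0 - np.abs(corr)                   # turn correlation into a distance
      np.fill_diagonal(distance, 0.0)

      mst = minimum_spanning_tree(distance)           # sparse matrix of tree edges
      rows, cols = mst.nonzero()
      for i, j in zip(rows, cols):
          print(f"gene {i} -- gene {j}: distance {mst[i, j]:.3f}")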

  7. Applying Nightingale charts to evaluate the heterogeneity of biomedical waste in a Hospital

    Directory of Open Access Journals (Sweden)

    Janini Cristina Paiz

    2014-12-01

    Full Text Available OBJECTIVES: to evaluate the heterogeneity of biomedical waste (BW) using Nightingale charts. METHOD: cross-sectional study consisting of data collection on wastes (direct observation of receptacles, physical characterisation, and gravimetric composition), development of a Management Information System, and creation of statistical charts. RESULTS: the wastes with the greatest degree of heterogeneity are, in order, recyclable, infectious, and organic wastes; chemical waste had the most efficient segregation; Nightingale charts are useful for quick visualisation and systematisation of information on heterogeneity. CONCLUSION: the development of a management information system and the use of Nightingale charts allow for the identification and correction of errors in waste segregation, which increase health risks and contamination by infectious and chemical wastes and reduce the sale of and profit from recyclables.
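
    A minimal matplotlib sketch of a Nightingale (polar area) chart of waste composition, the chart type used in the study; the categories match those named above, but the amounts are invented:

      # Sketch: a Nightingale (coxcomb / polar area) chart of waste composition.
      # Category values are invented for illustration.
      import numpy as np
      import matplotlib.pyplot as plt

      categories = ["recyclable", "infectious", "organic", "chemical"]
      amounts = np.array([40.0, 25.0, 20.0, 5.0])     # e.g. kg per day, illustrative

      theta = np.linspace(0.0, 2 * np.pi, len(amounts), endpoint=False)
      width = 2 * np.pi / len(amounts)

      ax = plt.subplot(projection="polar")
      # For strictly area-proportional wedges, plot np.sqrt(amounts) as the radius.
      ax.bar(theta, amounts, width=width, bottom=0.0, alpha=0.6)
      ax.set_xticks(theta)
      ax.set_xticklabels(categories)
      plt.savefig("nightingale_waste.png")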

  8. On transport in formations of large heterogeneity scales

    International Nuclear Information System (INIS)

    Dagan, Gedeon

    1990-01-01

    It has been suggested that in transport through heterogeneous aquifers, the effective dispersivity increases with the travel distance, since plumes encounter heterogeneity of increasing scales. This conclusion is underlain, however, by the assumption of ergodicity. If the plume is viewed as made up of different particles, this means that these particles move independently from a statistical point of view. To satisfy ergodicity the solute body has to be of a much larger extent than heterogeneity scales. Thus, if the latter are increasing for ever and the solute body is finite, ergodicity cannot be obeyed. To demonstrate this thesis we relate to the two-dimensional heterogeneity associated with transmissivity variations in the horizontal plane. First, the effective dispersion coefficient is defined as half the rate of change of the expected value of the solute body second spatial moment relative to its centroid. Subsequently the asymptotic large time limit of dispersivity is evaluated in terms of the log transmissivity integral scale and of the dimensions of the initial solute body in the direction of mean flow and normal to it. It is shown that for a thin plume aligned with the mean flow the effective dispersivity is zero and the effect of heterogeneity is a slight and finite expansion determined solely by the solute body size. In the case of a solute body transverse to the mean flow the effective dispersivity is different from zero, but has a maximal value which is again dependent on the solute body size and not on the heterogeneity scale. It is concluded that from a theoretical standpoint and for the definition of dispersivity adopted here for non-ergodic conditions, the claim of ever-increasing dispersivity with travel distance is not valid for the scale of heterogeneity analyzed here. (Author) (21 refs., 6 figs.)
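
    In symbols, the definition paraphrased above can be written as follows (our notation, a standard formulation rather than a quotation from the paper):

    $$ D_{ii}(t) = \frac{1}{2}\,\frac{d}{dt}\,\big\langle S_{ii}(t)\big\rangle, \qquad S_{ii}(t) = \frac{1}{M}\int \rho(\mathbf{x},t)\,\big(x_i - R_i(t)\big)^2\, d\mathbf{x}, $$

    where $\rho$ is the solute mass density, $M$ the total solute mass and $\mathbf{R}(t)$ the plume centroid; the longitudinal dispersivity then follows as $\alpha_L = D_{11}/U$ for mean velocity $U$.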

  9. Statistical characterization of Earth’s heterogeneities from seismic scattering

    Science.gov (United States)

    Zheng, Y.; Wu, R.

    2009-12-01

    The distortion of a teleseismic wavefront carries information about the heterogeneities through which the wave propagates, and it is manifested as logarithmic amplitude (logA) and phase fluctuations of the direct P wave recorded by a seismic network. By cross correlating the fluctuations (e.g., logA-logA or phase-phase), we obtain coherence functions, which depend on spatial lags between stations and incident angles between the incident waves. We have mathematically related the depth-dependent heterogeneity spectrum to the observable coherence functions using seismic scattering theory. We will show that our method has sharp depth resolution. Using the HiNet seismic network data in Japan, we have inverted power spectra for two depth ranges, ~0-120 km and below ~120 km depth. The coherence functions formed by different groups of stations or by different groups of earthquakes at different back azimuths are similar. This demonstrates that the method is statistically stable and the inhomogeneities are statistically stationary. In both depth intervals, the trend of the spectral amplitude decays from large scale to small scale in a power-law fashion, with exceptions at ~50 km for the logA data. Due to the spatial spacing of the seismometers, only information from length scales of 15 km to 200 km is inverted. However, our scattering method provides new information on small to intermediate scales that are comparable to the scales of recycled materials and is thus complementary to global seismic tomography, which reveals mainly large-scale heterogeneities on the order of ~1000 km. The small-scale heterogeneities revealed here are not likely of pure thermal origin. Therefore, the length scale and strength of heterogeneities as a function of depth may provide important constraints on mechanical mixing of various components in mantle convection.

  10. Global heterogeneous resource harvesting: the next-generation PanDA Pilot for ATLAS

    CERN Document Server

    Nilsson, Paul; The ATLAS collaboration

    2017-01-01

    The Production and Distributed Analysis system (PanDA), used for workload management in the ATLAS Experiment for over a decade, has in recent years expanded its reach to diverse new resource types such as HPCs, and innovative new workflows such as the Event Service. PanDA meets the heterogeneous resources it harvests in the PanDA pilot, which has embarked on a next-generation reengineering to efficiently integrate and exploit the new platforms and workflows. The new modular architecture is the product of a year of design and prototyping in conjunction with the design of a completely new component, Harvester, that will mediate a richer flow of control and information between pilot and PanDA. Harvester will enable more intelligent and dynamic matching between processing tasks and resources, with an initial focus on HPCs, simplifying the operator and user view of a PanDA site but internally leveraging deep information gathering on the resource to accrue detailed knowledge of a site's capabilities and dynamic sta...

  11. Global heterogeneous resource harvesting: the next-generation PanDA pilot for ATLAS

    CERN Document Server

    Nilsson, Paul; The ATLAS collaboration

    2017-01-01

    The Production and Distributed Analysis system (PanDA), used for workload management in the ATLAS Experiment for over a decade, has in recent years expanded its reach to diverse new resource types such as HPCs, and innovative new workflows such as the event service. PanDA meets the heterogeneous resources it harvests in the PanDA pilot, which has embarked on a next-generation reengineering to efficiently integrate and exploit the new platforms and workflows. The new modular architecture is the product of a year of design and prototyping in conjunction with the design of a completely new component, Harvester, that will mediate a richer flow of control and information between pilot and PanDA. Harvester will enable more intelligent and dynamic matching between processing tasks and resources, with an initial focus on HPCs, simplifying the operator and user view of a PanDA site but internally leveraging deep information gathering on the resource to accrue detailed knowledge of a site's capabilities and dynamic sta...

  12. International seminar on integrated information systems. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant.

  13. International seminar on integrated information systems. Book of extended synopses

    International Nuclear Information System (INIS)

    2000-04-01

    The information available to the IAEA under comprehensive safeguards agreement with an Additional protocol is intended to provide for as complete a picture as practicable of a State's current or planned nuclear programme. The central components of the strengthened safeguards system are: increased IAEA access to and evaluation of information about States' nuclear and nuclear-related activities and increased physical access to relevant locations for verification of the exclusively peaceful content of a States' nuclear programme. Strengthening measures implemented under the existing legal authority of the Agency have contributed to increased information and physical access. Thus the role of integrated information systems for safeguards relevant data acquisition became more significant

  14. Identifying influential factors on integrated marketing planning using information technology

    Directory of Open Access Journals (Sweden)

    Karim Hamdi

    2014-07-01

    Full Text Available This paper presents an empirical investigation to identify important factors influencing integrated marketing planning using information technology. The proposed study designs a questionnaire for measuring integrated marketing planning, which consists of three categories: structural factors, behavioral factors and background factors. There are 40 questions associated with the proposed study, rated on a Likert scale. Cronbach alphas were calculated for structural factors, behavioral factors and background factors as 0.89, 0.86 and 0.83, respectively. Using statistical tests, the study confirmed the effects of the three factors on integrated marketing. In addition, the Friedman test revealed that structural factors were the most important, followed by background factors and behavioral factors.
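
    As an aside on the reported reliability figures, a minimal sketch of how Cronbach's alpha is computed for a block of Likert items; the response matrix below is synthetic:

      # Sketch: Cronbach's alpha for a block of Likert-scale items.
      # Rows are respondents, columns are items; the data below are invented.
      import numpy as np

      def cronbach_alpha(items: np.ndarray) -> float:
          """items: (n_respondents, n_items) matrix of scores."""
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1)
          total_variance = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

      rng = np.random.default_rng(42)
      base = rng.integers(1, 6, size=(30, 1))                     # shared tendency
      responses = np.clip(base + rng.integers(-1, 2, size=(30, 8)), 1, 5)

      print(round(cronbach_alpha(responses.astype(float)), 2))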

  15. Integrating Environmental and Information Systems Management: An Enterprise Architecture Approach

    Science.gov (United States)

    Noran, Ovidiu

    Environmental responsibility is fast becoming an important aspect of strategic management as the reality of climate change settles in and relevant regulations are expected to tighten significantly in the near future. Many businesses react to this challenge by implementing environmental reporting and management systems. However, the environmental initiative is often not properly integrated in the overall business strategy and its information system (IS) and as a result the management does not have timely access to (appropriately aggregated) environmental information. This chapter argues for the benefit of integrating the environmental management (EM) project into the ongoing enterprise architecture (EA) initiative present in all successful companies. This is done by demonstrating how a reference architecture framework and a meta-methodology using EA artefacts can be used to co-design the EM system, the organisation and its IS in order to achieve a much needed synergy.

  16. Heterogeneous agents and decison making within firms

    NARCIS (Netherlands)

    Hung, Chung-yu

    2015-01-01

    This dissertation explores the implications of agents’ heterogeneity in decision making within situations where information is not completely contractible. Specifically, the study applies empirical methods across three chapters to examine the role of employees’ traits and their mutual relationships

  17. Integrating Information Services in an Academic Setting: The Organizational and Technical Challenge.

    Science.gov (United States)

    Branin, Joseph J.; And Others

    1993-01-01

    Describes a project to integrate the support and delivery of information services to faculty and staff at the University of Minnesota from the planning phase to implementation of a new organizational entity. Topics addressed include technical and organizational integration, control and delivery of services, and networking and organizational fit.…

  18. Unified Information Access in Product Creation with an Integrated Control Desk

    Science.gov (United States)

    Wrasse, Kevin; Diener, Holger; Hayka, Haygazun; Stark, Rainer

    2017-06-01

    Customers' demand for individualized products leads to a large variety of different products in small series and single-unit production. High pressure for flexibility in product creation is one result of this trend. In order to counteract this pressure, the steadily increasing volume of information produced by Industry 4.0 must be made available at the workplace. Additionally, a better exchange of information between product development, production planning and production is necessary. The improvement of individual systems, like CAD, PDM, ERP and MES, can only achieve this to a limited extent. Since they mostly use systems from different manufacturers, the necessary deeper integration of information is only feasible for SMEs to a limited extent. The presented control desk helps to ensure more flexible product creation as well as information exchange. It captures information from different IT systems in the production process and presents it in an integrated, task-oriented way, oriented to the user's mental model, e.g. information from production combined with the 3D model of product parts, or information about product development on the 3D model of the production. The solution is a digital 3D model of the manufacturing environment, which is enriched by billboards for a quick information overview and web service windows to access detailed MES and PDM information. In this way, the level of abstraction is reduced, changed requirements can be reacted to in the short term, and informed decisions can be made. Interaction with the control desk utilizes the touch capabilities of mobile and fixed systems such as smartphones, tablets and multitouch tables.

  19. Spatially heterogeneous dynamics investigated via a time-dependent four-point density correlation function

    DEFF Research Database (Denmark)

    Lacevic, N.; Starr, F. W.; Schrøder, Thomas

    2003-01-01

    Relaxation in supercooled liquids above their glass transition and below the onset temperature of "slow" dynamics involves the correlated motion of neighboring particles. This correlated motion results in the appearance of spatially heterogeneous dynamics or "dynamical heterogeneity." Traditional two-point time-dependent density correlation functions, while providing information about the transient "caging" of particles on cooling, are unable to provide sufficiently detailed information about correlated motion and dynamical heterogeneity. Here, we study a four-point, time-dependent density correlation function g4(r,t) and corresponding "structure factor" S4(q,t) which measure the spatial correlations between the local liquid density at two points in space, each at two different times, and so are sensitive to dynamical heterogeneity. We study g4(r,t) and S4(q,t) via molecular dynamics...
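
    For orientation, a schematic form of such a four-point structure factor that is common in the literature is shown below; normalizations and the choice of overlap function vary between studies, so this should not be read as the paper's exact definition:

    $$ S_4(q,t) \propto \big\langle W(\mathbf{q},t)\,W(-\mathbf{q},t)\big\rangle - \big|\big\langle W(\mathbf{q},t)\big\rangle\big|^2, \qquad W(\mathbf{q},t) = \sum_{j=1}^{N} e^{\,i\mathbf{q}\cdot\mathbf{r}_j(0)}\, w\!\left(\left|\mathbf{r}_j(t)-\mathbf{r}_j(0)\right|\right), $$

    where $w(r)$ is typically an overlap (step) function equal to 1 when a particle has moved less than a small cutoff distance and 0 otherwise.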

  20. Information heterogeneity and intended college enrollment

    OpenAIRE

    Bleemer, Zachary; Zafar, Basit

    2014-01-01

    Despite a robust college premium, college attendance rates in the United States have remained stagnant and exhibit a substantial socioeconomic gradient. We focus on information gaps - specifically, incomplete information about college benefits and costs - as a potential explanation for these patterns. In a nationally representative survey of U.S. household heads, we show that perceptions of college costs and benefits are severely and systematically biased: 74 percent of our respondents undere...

  1. Integrated project management information systems: the French nuclear industry experience

    International Nuclear Information System (INIS)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-01-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK)

  2. Integrated project management information systems: the French nuclear industry experience

    Energy Technology Data Exchange (ETDEWEB)

    Jacquin, J.-C.; Caupin, G.-M.

    1990-03-01

    The article discusses the desirability of integrated project management systems within the French nuclear power industry. Change in demand for nuclear generation facilities over the last two decades has necessitated a change of policy concerning organization, cost and planning within the industry. Large corporate systems can benefit from integrating equipment and bulk materials tracking. Project management for the nuclear industry will, in future, need to incorporate computer aided design tools and project management information systems data bases as well as equipment and planning data. (UK).

  3. The Integration of the Information and Communication Functions, and the Marketing of the Resulting Products.

    Science.gov (United States)

    Harris, Susan C.

    1985-01-01

    Discusses the theoretical basis for integration of information functions and communication functions, the relevance of this integration in the scientific information cycle, and its positive effect on commodity research networks. The application of this theory is described using three commodity programs of the Centro Internacional de Agricultura…

  4. Travelling wave phenomena in non-heterogeneous tissues

    DEFF Research Database (Denmark)

    Pedersen, Michael

    2006-01-01

    Disturbances (or information) propagating in heterogeneous biological tissues (or other media) are often modeled by a partial differential equation of the form $$ u''(t,x) +D(x)u'(t,x) +A(x)u(t,x)=f(t,x), $$ for $ 0...

  5. Fission gas release of MOX with heterogeneous structure

    International Nuclear Information System (INIS)

    Nakae, N.; Akiyama, H.; Kamimura, K; Delville, R.; Jutier, F.; Verwerft, M.; Miura, H.; Baba, T.

    2015-01-01

    It is very useful for fuel integrity evaluation to accumulate a knowledge base on the fuel behavior of uranium and plutonium mixed oxide (MOX) fuel used in light water reactors (LWRs). Fission gas release is one of the fuel behaviors which have an impact on fuel integrity evaluation. The fission gas release behavior of MOX fuels having a heterogeneous structure is the focus of this study. MOX fuel rods with a heterogeneous fuel microstructure were irradiated in the Halden reactor (IFA-702) and the BR-3/BR-2 CALLISTO Loop (CHIPS program). The 85Kr gamma spectrometry measurements were carried out in specific cycles in order to examine the concerned LHR (Linear Heat Rate) for fission gas release in the CHIPS program. The concerned LHR is defined in this paper as the LHR at which a certain additional fission gas release thermally occurs. Post-irradiation examination was performed to understand the fission gas release behavior in connection with the pellet microstructure. The following conclusions can be made from this study. First, the concerned LHR for fission gas release is estimated to be in the range of 20-23 kW/m for burnup over 37 GWd/tM. It is moreover estimated that the concerned LHR for fission gas release tends to decrease with increasing burnup. Secondly, it is observed that FGR (fission gas release rate) is positively correlated with LHR when the LHR exceeds the concerned value. Thirdly, when the burnup dependence of fission gas release is discussed, effective burnup should be taken into account. The effective burnup is defined as the burnup at which the LHR last exceeded the concerned value during the irradiation period. Fourthly, it appears that FGR inside Pu spots is higher than outside and that retained (not released) fission gases mainly exist in the fission gas bubbles. Since fission gases in bubbles are considered to be easily released during fuel temperature increase, this information is very important for estimating fission gas release behavior

  6. The integration of information and communication technology into community pharmacists practice in Barcelona.

    Science.gov (United States)

    Lupiáñez-Villanueva, Francisco; Hardey, Michael; Lluch, Maria

    2014-03-01

    The study aims to identify community pharmacists' (CPs) utilization of information and communication technology (ICT); to develop and characterize a typology of CPs' utilization of ICT and to identify factors that can enhance or inhibit the use of these technologies. An online survey of the 7649 members of the Pharmacist Association of Barcelona who had a registered email account in 2006 was carried out. Factor analysis, cluster analysis and binomial logit modelling were undertaken. Multivariate analysis of the CPs' responses to the survey (648 responses) revealed two profiles of adoption of ICT. The first profile (40.75%) represents those CPs who place high emphasis on ICT within their practice. This group is therefore referred to as 'integrated CPs'. The second profile (59.25%) represents those CPs who make less use of ICT and are consequently labelled 'non-integrated CPs'. Statistical modelling was used to identify variables that were important in predisposing CPs to integrate ICT with their work. From the analysis it is evident that responses to questions relating to 'recommend patients going on line for health information'; 'patients discuss or share their Internet health information findings'; 'emphasis on the Internet for communication and dissemination' and 'Pharmacists Professional Association information' play a positive and significant role in the probability of being an 'integrated CP'. The integration of ICT within CPs' practices cannot be adequately understood and appreciated without examining how CPs are making use of ICT within their own practice, their organizational context and the nature of the pharmacist-client relationship.
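
    The binomial logit step reported in the abstract can be sketched as follows. The predictor names and data are synthetic stand-ins for the survey items the study names; this is not the study's actual model or dataset.

```python
# Hedged sketch of a binomial logit model of "integrated" vs "non-integrated"
# community pharmacists. The predictors and data are synthetic stand-ins for
# the survey items named in the abstract, not the study's actual variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 648
X = rng.integers(0, 2, size=(n, 3))          # 0/1 answers to three survey items
cols = ["recommends_online_info",            # recommend patients go online
        "patients_share_findings",           # patients discuss Internet findings
        "uses_internet_for_dissemination"]   # emphasis on Internet communication

# Synthetic outcome: more "yes" answers -> higher chance of being "integrated"
p = 1 / (1 + np.exp(-(X.sum(axis=1) - 1.5)))
y = rng.random(n) < p

model = LogisticRegression().fit(X, y)
for name, coef in zip(cols, model.coef_[0]):
    print(f"{name:35s} {coef:+.2f}")
print("intercept", f"{model.intercept_[0]:+.2f}")
```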

  7. Information Literacy for Multiple Disciplines: Toward a Campus-Wide Integration Model at Indiana University, Bloomington

    Directory of Open Access Journals (Sweden)

    Brian Winterman

    2011-11-01

    Full Text Available Within disciplines are a set of shared values and thought processes that students must master in order to become participants of that discipline. Information literacy as defined by the ACRL is a set of standards and principles that can apply to all disciplines. In order to produce information literate undergraduates in a given discipline, information literacy standards must be integrated with the values and processes of the discipline. In this study, librarians partnered with faculty in gender studies and molecular biology to integrate information literacy with courses in those areas. Student performance and attitudes improved as a result of the collaboration. This article discusses the collaboration process, the assessment methods and results, and the long-term importance of developing best practices for information literacy integration at the campus level through a disciplinary approach.

  8. Stochastic description of heterogeneities of permeability within groundwater flow models

    International Nuclear Information System (INIS)

    Cacas, M.C.; Lachassagne, P.; Ledoux, E.; Marsily, G. de

    1991-01-01

    In order to model radionuclide migration in the geosphere realistically at the field scale, the hydrogeologist needs to be able to simulate groundwater flow in heterogeneous media. The heterogeneity of the medium can be described using a stochastic approach, which affects the way in which a flow model is formulated. In this paper, we discuss the problems that we have encountered in modelling both continuous and fractured media. The stochastic approach leads to a methodology that enables local measurements of permeability to be integrated into a model which gives a good prediction of groundwater flow on a regional scale. 5 Figs.; 8 Refs
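
    As a toy illustration of the stochastic description of heterogeneity (the abstract does not give the authors' actual formulation), a spatially correlated log-normal permeability field can be generated by smoothing white noise; the grid size, correlation length and ln-K statistics below are hypothetical.

```python
# Toy sketch (not the authors' method): a spatially correlated log-normal
# permeability field, a common stochastic description of heterogeneity.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)
n = 128                                      # grid cells per side
corr_len = 8                                 # correlation length in cells
sigma_lnK, mean_lnK = 1.0, np.log(1e-12)     # ln-permeability statistics (m^2)

noise = rng.standard_normal((n, n))
field = gaussian_filter(noise, corr_len)     # impose spatial correlation
field *= sigma_lnK / field.std()             # rescale to the target variance
K = np.exp(mean_lnK + field)                 # log-normal permeability field [m^2]

print("geometric mean K:", np.exp(np.log(K).mean()))
print("K range:", K.min(), "-", K.max())
```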

  9. A middleware-based platform for the integration of bioinformatic services

    Directory of Open Access Journals (Sweden)

    Guzmán Llambías

    2015-08-01

    Full Text Available Performing bioinformatics experiments involves intensive access to distributed services and information resources through the Internet. Although existing tools facilitate the implementation of workflow-oriented applications, they lack the capability to integrate services beyond small-scale applications, particularly services with heterogeneous interaction patterns and at larger scale. Such integration is particularly required to enable large-scale distributed processing of the biological data generated by massive sequencing technologies. On the other hand, such integration mechanisms are provided by middleware products like Enterprise Service Buses (ESBs), which make it possible to integrate distributed systems following a Service Oriented Architecture. This paper proposes an integration platform, based on enterprise middleware, to integrate bioinformatics services. It presents a multi-level reference architecture and focuses on ESB-based mechanisms for providing asynchronous communications, event-based interactions and data transformation capabilities. The paper also presents a formal specification of the platform using the Event-B model.
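
    The ESB capabilities named in the abstract, asynchronous messaging, event-based interaction and data transformation, can be pictured with a minimal in-process sketch. This is a generic illustration, not the paper's platform or its Event-B specification; the service and event names are invented.

```python
# Minimal, generic sketch of the three ESB capabilities named in the abstract:
# asynchronous messaging, event-based interaction and data transformation.
# Service names and message formats are invented for illustration.
from collections import defaultdict, deque

class MiniBus:
    def __init__(self):
        self.handlers = defaultdict(list)   # event type -> subscribed callbacks
        self.queue = deque()                # pending messages (asynchronous)

    def subscribe(self, event_type, handler, transform=lambda m: m):
        self.handlers[event_type].append((handler, transform))

    def publish(self, event_type, message):
        self.queue.append((event_type, message))

    def dispatch(self):                     # drain the queue, event by event
        while self.queue:
            event_type, message = self.queue.popleft()
            for handler, transform in self.handlers[event_type]:
                handler(transform(message))

bus = MiniBus()
# A hypothetical alignment service publishes raw results; an analysis service
# consumes them after a format transformation (FASTA header -> plain id).
bus.subscribe("sequence.aligned",
              handler=lambda m: print("analysis received:", m),
              transform=lambda m: {"id": m["header"].lstrip(">"), "score": m["score"]})
bus.publish("sequence.aligned", {"header": ">seq_001", "score": 42.0})
bus.dispatch()
```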

  10. An open, component-based information infrastructure for integrated health information networks.

    Science.gov (United States)

    Tsiknakis, Manolis; Katehakis, Dimitrios G; Orphanoudakis, Stelios C

    2002-12-18

    A fundamental requirement for achieving continuity of care is the seamless sharing of multimedia clinical information. Different technological approaches can be adopted for enabling the communication and sharing of health record segments. In the context of the emerging global information society, the creation of and access to the integrated electronic health record (I-EHR) of a citizen has been assigned high priority in many countries. This requirement is complementary to an overall requirement for the creation of a health information infrastructure (HII) to support the provision of a variety of health telematics and e-health services. In developing a regional or national HII, the components or building blocks that make up the overall information system ought to be defined and an appropriate component architecture specified. This paper discusses current international priorities and trends in developing the HII. It presents technological challenges and alternative approaches towards the creation of an I-EHR, being the aggregation of health data created during all interactions of an individual with the healthcare system. It also presents results from an ongoing Research and Development (R&D) effort towards the implementation of the HII in HYGEIAnet, the regional health information network of Crete, Greece, using a component-based software engineering approach. Critical design decisions and related trade-offs, involved in the process of component specification and development, are also discussed and the current state of development of an I-EHR service is presented. Finally, Human Computer Interaction (HCI) and security issues, which are important for the deployment and use of any I-EHR service, are considered.

  11. Market Integration, Choice of Technology and Welfare

    DEFF Research Database (Denmark)

    Hansen, Jørgen Drud; Nielsen, Jørgen Ulff-Møller

    2010-01-01

    technologies. Market integration may induce a technological restructuring where firms either diversify their technologies or switch to a homogeneous technology. In general, market integration improves welfare. However, a small decrease in trade costs which induces a switch from heterogeneous technologies ... to a homogeneous technology may locally reduce global welfare. The model also shows that productivity differences lead to intra-industry firm heterogeneity in size and exports, similar to the "new-new" trade models with monopolistic competition...

  12. Digital Microdroplet Ejection Technology-Based Heterogeneous Objects Prototyping

    Science.gov (United States)

    Yang, Jiquan; Feng, Chunmei; Yang, Jianfei; Zhu, Liya; Guo, Aiqing

    2016-01-01

    An integrated fabrication framework is presented for building heterogeneous objects (HEO) using digital microdroplet ejection technology and rapid prototyping. A design and manufacturing method for heterogeneous-material parts, covering both structure and material, is used in place of the traditional process. The net node method is used for digital modeling and can configure multiple materials in time. The relationship between material, color, and jetting nozzle is established. The main contributions are the combination of structure, material, and visualization in one process and the provision of a digital model for manufacture. From the given model, it is concluded that the method is effective for HEO: using microdroplet rapid prototyping and the model given in the paper, HEO can essentially be fabricated. The model could be used in 3D biomanufacturing. PMID:26981110
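
    The node-based material model and the material-colour-nozzle relationship described above can be pictured with a small data structure; the grid, materials, colours and nozzle ids below are invented for illustration.

```python
# Toy sketch of a node-based heterogeneous-object (HEO) model: each node of a
# voxel grid carries a material id, and a lookup table maps material -> colour
# and jetting nozzle. Materials, colours and nozzle ids are hypothetical.
import numpy as np

# material id per node on a small 3-D grid (0 = support, 1 = scaffold, 2 = cells)
nodes = np.zeros((10, 10, 10), dtype=np.uint8)
nodes[:, :, 5:] = 1
nodes[4:6, 4:6, 4:6] = 2

material_table = {
    0: {"name": "support gel",    "colour": "clear", "nozzle": 0},
    1: {"name": "scaffold resin", "colour": "white", "nozzle": 1},
    2: {"name": "cell-laden ink", "colour": "red",   "nozzle": 2},
}

# Per-layer print plan: which nozzle fires at which voxel of layer z
def layer_plan(z):
    layer = nodes[:, :, z]
    return [(int(i), int(j), material_table[int(m)]["nozzle"])
            for (i, j), m in np.ndenumerate(layer)]

print(len(layer_plan(5)), "droplet commands in layer 5")
```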

  15. Heterogeneous ice slurry flow and concentration distribution in horizontal pipes

    International Nuclear Information System (INIS)

    Wang, Jihong; Zhang, Tengfei; Wang, Shugang

    2013-01-01

    Highlights: • A Mixture CFD model is applied to describe heterogeneous ice slurry flow. • The ice slurry rheological behavior is considered piecewise. • The coupled flow and concentration profiles in heterogeneous slurry flow are acquired. • The current numerical model achieves a good balance between precision and universality. -- Abstract: Ice slurry is an energy-intensive solid-liquid mixture which may play an important role in various cooling applications. Knowing detailed flow information is important from the system design point of view. However, heterogeneous ice slurry flow is difficult to quantify because of its complex two-phase flow characteristics. The present study applies a Mixture computational fluid dynamics (CFD) model, based on different rheological behaviors, to characterize heterogeneous ice slurry flow. The Mixture CFD model was first validated against three different experiments. The validated model was then applied to solve the isothermal ice slurry flow by considering the rheological behavior piecewise. Finally, the numerical solutions display the coupled flow information, such as slurry velocity, ice particle concentration and pressure drop distribution. The results show that the ice slurry flow distribution exhibits varying degrees of asymmetry under different operating conditions, and the rheological behavior is affected by the asymmetric flow distributions. When the mean flow velocity is high, the Thomas equation is appropriate for describing the ice slurry viscosity, while with decreasing mean flow velocity the ice slurry exhibits Bingham rheology. Compared with experimental pressure drop results, the relative errors of the numerical computation are almost within ±15%. The Mixture CFD model is thus validated as an effective model for describing heterogeneous ice slurry flow and can supply plentiful flow information.
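
    The piecewise rheology described above can be sketched as a simple viscosity model: a Thomas-type correlation at high mean velocity and a Bingham apparent viscosity at low velocity. The switching velocity, yield stress, plastic viscosity and solid fraction in the sketch are placeholders, not values from the paper.

```python
# Hedged sketch of the piecewise rheology described in the abstract. The Thomas
# correlation constants follow the commonly cited form; the switching velocity,
# yield stress and plastic viscosity are placeholders, not values from the paper.
import math

def thomas_viscosity(mu_carrier, phi):
    """Thomas relative-viscosity correlation for a suspension of solid fraction phi."""
    return mu_carrier * (1 + 2.5 * phi + 10.05 * phi**2
                         + 0.00273 * math.exp(16.6 * phi))

def bingham_apparent_viscosity(tau_y, mu_plastic, shear_rate):
    """Apparent viscosity of a Bingham fluid: tau = tau_y + mu_p * gamma_dot."""
    return tau_y / shear_rate + mu_plastic

def slurry_viscosity(mean_velocity, phi, shear_rate,
                     mu_carrier=2.0e-3, v_switch=1.0,
                     tau_y=1.5, mu_plastic=5.0e-3):
    if mean_velocity >= v_switch:             # high velocity: Newtonian-like slurry
        return thomas_viscosity(mu_carrier, phi)
    return bingham_apparent_viscosity(tau_y, mu_plastic, shear_rate)

print(slurry_viscosity(1.5, phi=0.15, shear_rate=500))   # Thomas branch
print(slurry_viscosity(0.4, phi=0.15, shear_rate=50))    # Bingham branch
```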

  16. Enterprise wide transparent information access

    International Nuclear Information System (INIS)

    Brown, J.

    1995-05-01

    The information management needs of the Department of Energy (DOE) represent a fertile domain for the development of highly sophisticated yet intuitive enterprise-wide computing solutions. These solutions must support business operations, research agendas, technology development efforts, decision support, and other application areas with a user base ranging from technical staff to the highest levels of management. One area of primary interest is in the Environmental Restoration and Waste Management Branch of DOE. In this arena, the issue of tracking and managing nuclear waste related to the long legacy of prior defense production and research programs is one of high visibility and great concern. The Tank Waste Information Network System (TWINS) application has been created by the Pacific Northwest Laboratory (PNL) for the DOE to assist in managing and accessing the information related to this mission. The TWINS solution addresses many of the technical issues faced by other efforts to provide integrated information access to a wide variety of stakeholders. TWINS provides secure, transparent access to distributed heterogeneous multimedia information sources from around the DOE complex. The users interact with the information through a consistent user interface that presents the desired data in a common format regardless of the structure of the source information. The solutions developed by the TWINS project represent an integration of several technologies and products that can be applied to other mission areas within DOE and other government agencies. These solutions are now being applied to public and private sector problem domains as well. The successful integration and interoperation of both commercial and custom modules into a flexible and extensible information architecture will help ensure that new problems facing DOE and other clients can be addressed more rapidly in the future by re-use of existing tools and techniques proven viable through the TWINS efforts.
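
    The core idea of presenting heterogeneous sources through one consistent format is essentially the adapter pattern; the sketch below is generic and does not reflect TWINS's actual interfaces, record formats or data sources.

```python
# Generic adapter sketch (not TWINS's actual design): each heterogeneous source
# exposes the same query() interface and maps its native records to a common
# format, so the user interface never sees source-specific structure.
from abc import ABC, abstractmethod

class WasteRecordSource(ABC):
    @abstractmethod
    def query(self, tank_id: str) -> dict:
        """Return a record in the common format: {'tank', 'waste_type', 'volume_L'}."""

class LegacyCsvSource(WasteRecordSource):
    def __init__(self, rows):                 # rows: (tank, type, gallons)
        self.rows = {r[0]: r for r in rows}
    def query(self, tank_id):
        tank, wtype, gallons = self.rows[tank_id]
        return {"tank": tank, "waste_type": wtype, "volume_L": gallons * 3.785}

class JsonApiSource(WasteRecordSource):
    def __init__(self, docs):                 # docs: {'id', 'category', 'litres'}
        self.docs = {d["id"]: d for d in docs}
    def query(self, tank_id):
        d = self.docs[tank_id]
        return {"tank": d["id"], "waste_type": d["category"], "volume_L": d["litres"]}

sources = [LegacyCsvSource([("TX-101", "sludge", 5000)]),
           JsonApiSource([{"id": "TX-102", "category": "supernate", "litres": 12000}])]
for source in sources:
    for tank in ("TX-101", "TX-102"):
        try:
            print(source.query(tank))         # same common format from every source
        except KeyError:
            pass
```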

  17. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
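
    HeNCE's graph language (nodes as subroutines, arcs as data and control dependencies) maps naturally onto a topologically ordered task graph; the graph in the sketch below is invented for illustration.

```python
# Toy sketch of the HeNCE idea: an application is a directed graph whose nodes
# are subroutines and whose arcs are data/control dependencies; any topological
# order of the graph is a valid execution order. The graph here is invented.
from graphlib import TopologicalSorter   # Python 3.9+

# node -> set of nodes it depends on
task_graph = {
    "read_input":   set(),
    "partition":    {"read_input"},
    "solve_left":   {"partition"},       # these two could run on different hosts
    "solve_right":  {"partition"},
    "merge":        {"solve_left", "solve_right"},
    "write_output": {"merge"},
}

order = list(TopologicalSorter(task_graph).static_order())
print("one valid execution order:", order)
```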

  18. Design and Applications of a GeoSemantic Framework for Integration of Data and Model Resources in Hydrologic Systems

    Science.gov (United States)

    Elag, M.; Kumar, P.

    2016-12-01

    Hydrologists today have to integrate resources such as data and models that originate and reside in multiple autonomous and heterogeneous repositories over the Web. Several resource management systems have emerged within geoscience communities for sharing long-tail data, which are collected by individuals or small research groups, and long-tail models, which are developed by scientists or small modeling communities. While these systems have increased the availability of resources within geoscience domains, deficiencies remain because of the heterogeneity of the methods used to describe, encode, and publish information about resources over the Web. This heterogeneity limits our ability to access the right information in the right context so that it can be efficiently retrieved and understood without the hydrologist's mediation. A primary challenge of today's Web is the lack of semantic interoperability among the massive number of resources that already exist and are continually being generated at rapid rates. To address this challenge, we have developed a decentralized GeoSemantic (GS) framework, which provides three sets of micro web services to support (i) semantic annotation of resources, (ii) semantic alignment between the metadata of two resources, and (iii) semantic mediation among Standard Names. Here we present the design of the framework and demonstrate its application for semantic integration between data and models used in the IML-CZO. First we show how the IML-CZO data are annotated using the Semantic Annotation Services. Then we illustrate how the Resource Alignment Services and Knowledge Integration Services are used to create a semantic workflow between the TopoFlow model, a spatially distributed hydrologic model, and the annotated data. Results of this work are (i) a demonstration of how the GS framework advances the integration of heterogeneous data and models of water-related disciplines by seamless handling of their semantic
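
    A minimal picture of the three GS micro-services (annotation, alignment, mediation) can be given with plain dictionaries; the variable names, standard names and functions below are invented and are not the framework's actual vocabulary or API.

```python
# Minimal, invented illustration of the three GeoSemantic-style services named
# in the abstract: (i) annotate a resource's variables with standard names,
# (ii) align two resources through those names, (iii) mediate between
# vocabularies. None of the names below come from the actual GS framework.

# (i) annotation: local variable names -> community standard names (invented)
czo_data = {"Q_cms": "channel_water__volume_flow_rate",
            "P_mm_hr": "atmosphere_water__precipitation_rate"}
topoflow_model = {"vol_flux": "channel_water__volume_flow_rate",
                  "precip_rate": "atmosphere_water__precipitation_rate"}

# (ii) alignment: pair up variables annotated with the same standard name
def align(data_vars, model_vars):
    by_name = {std: var for var, std in data_vars.items()}
    return {model_var: by_name[std]
            for model_var, std in model_vars.items() if std in by_name}

# (iii) mediation: bridge vocabularies that name the same concept differently
synonyms = {"rainfall__rate": "atmosphere_water__precipitation_rate"}
def mediate(name):
    return synonyms.get(name, name)

print(align(czo_data, topoflow_model))   # {'vol_flux': 'Q_cms', 'precip_rate': 'P_mm_hr'}
print(mediate("rainfall__rate"))         # 'atmosphere_water__precipitation_rate'
```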

  19. SCSODC: Integrating Ocean Data for Visualization Sharing and Application

    International Nuclear Information System (INIS)

    Xu, C; Xie, Q; Li, S; Wang, D

    2014-01-01

    The South China Sea Ocean Data Center (SCSODC) was founded in 2010 in order to improve the collection and management of ocean data at the South China Sea Institute of Oceanology (SCSIO). The mission of SCSODC is to ensure the long-term scientific stewardship of ocean data, information and products – collected through research groups, monitoring stations and observation cruises – and to facilitate their efficient use and distribution to possible users. However, data sharing and applications were limited by the distributed and heterogeneous character of the data, which made integration difficult. To surmount those difficulties, the Data Sharing System has been developed by the SCSODC using the most appropriate information management and information technology. The Data Sharing System uses open standards and tools to promote the capability to integrate ocean data and to interact with other data portals or users, and includes a full range of processes such as data discovery, evaluation and access, combining C/S and B/S modes. It provides a visualized management interface for data managers and a transparent and seamless data access and application environment for users. Users are allowed to access data using the client software and to access the interactive visualization application interface via a web browser. The architecture, key technologies and functionality of the system are discussed briefly in this paper. It is shown that the SCSODC system is able to implement web visualization sharing and seamless access to ocean data in a distributed and heterogeneous environment.

  20. Extending Current Theories of Cross-Boundary Information Sharing and Integration: A Case Study of Taiwan e-Government

    Science.gov (United States)

    Yang, Tung-Mou

    2011-01-01

    Information sharing and integration has long been considered an important approach for increasing organizational efficiency and performance. With advancements in information and communication technologies, sharing and integrating information across organizations becomes more attractive and practical to organizations. However, achieving…