WorldWideScience

Sample records for integrated analysis based

  1. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    The GNSS-based Train Integrity Monitoring System (TIMS) is an effective, low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by the external environment, such as the uncertainty of wireless communication channels, which can lead to communication and positioning failures. To guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM against field data. Finally, we discuss how these risk factors influence the train integrity detection process.
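
    PRISM itself models such risks as Markov chains. As a rough illustration of the underlying idea (not the authors' model), the Python sketch below uses purely hypothetical transition probabilities for a GNSS link and computes the probability of reaching an absorbing "integrity lost" state within n steps:

```python
import numpy as np

# Hypothetical 3-state DTMC for the GNSS link: 0 = OK, 1 = degraded, 2 = lost
# (absorbing). Transition probabilities are illustrative, not from the paper.
P = np.array([
    [0.95, 0.04, 0.01],
    [0.60, 0.30, 0.10],
    [0.00, 0.00, 1.00],
])

def p_lost_within(n_steps, start=0):
    """Probability of having reached the absorbing 'lost' state within n steps."""
    dist = np.zeros(3)
    dist[start] = 1.0
    for _ in range(n_steps):
        dist = dist @ P   # one step of the chain
    return dist[2]

p10 = p_lost_within(10)
```

A PRISM model would express the same chain symbolically and check a PCTL property such as "P=? [ F<=10 lost ]"; the matrix-power loop above is the numerical core of that query.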

  2. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an overall framework for the estimation of the failure probability of pipelines based on: the results of deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via a Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach to the estimation of integrated failure probability combines several different analyses, yielding: the critical crack length and depth, the failure probability of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. A model uncertainty analysis and an uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with crack. • Stress evaluation of the pipe with critical crack. • Deterministic-probabilistic structural integrity analysis of gas pipeline. • Integrated estimation of pipeline failure probability by Bayesian method.
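
    The Bayesian step that folds failure data into such a model can be sketched with the standard Gamma-Poisson conjugate update; the prior parameters and exposure below are illustrative, not the paper's values:

```python
# Conjugate Gamma-Poisson update: a Gamma(a, b) prior on the failure rate
# lambda (failures per km-year) combined with k observed failures over an
# exposure T gives the posterior Gamma(a + k, b + T). All numbers illustrative.
def posterior_mean_rate(a, b, k, T):
    """Posterior mean of the failure rate after observing k failures in exposure T."""
    return (a + k) / (b + T)

prior_mean = 2.0 / 100.0   # Gamma(2, 100): prior mean 0.02 failures per km-year
# One observed failure over 400 km-years of network exposure:
post_mean = posterior_mean_rate(2.0, 100.0, k=1, T=400.0)
```

The posterior mean (0.006 here) is pulled below the prior because the observed exposure contained fewer failures than the prior expected; this is the mechanism by which field failure data tempers the purely structural estimate.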

  3. Harmonic analysis in integrated energy system based on compressed sensing

    International Nuclear Information System (INIS)

    Yang, Ting; Pen, Haibo; Wang, Dan; Wang, Zhaoxia

    2016-01-01

    Highlights: • We propose a harmonic/inter-harmonic analysis scheme based on compressed sensing theory. • The sparsity of harmonic signals in electrical power systems is proved. • The ratio formula of fundamental and harmonic component sparsity is presented. • The Spectral Projected Gradient with Fundamental Filter reconstruction algorithm is proposed. • SPG-FF enhances the precision of harmonic detection and signal reconstruction. - Abstract: The advent of Integrated Energy Systems has enabled various distributed energy resources to access the system through different power electronic devices, making the harmonic environment more complex. Harmonic detection and analysis methods of low complexity and high precision are therefore needed to improve power quality. To overcome the large data storage requirements and high compression complexity of sampling under the Nyquist framework, this research paper presents a harmonic analysis scheme based on compressed sensing theory. The proposed scheme performs compressive sampling, signal reconstruction and harmonic detection simultaneously. In the proposed scheme, the sparsity of the harmonic signals in the basis of the Discrete Fourier Transform (DFT) is numerically calculated first, followed by a proof that the necessary conditions for compressed sensing are satisfied. Binary sparse measurement is then leveraged to reduce the storage space in the sampling unit. In the recovery process, a novel reconstruction algorithm, the Spectral Projected Gradient with Fundamental Filter (SPG-FF) algorithm, is proposed to enhance the reconstruction precision. An actual microgrid system is used as a simulation example. The experimental results show that the proposed scheme effectively enhances the precision of harmonic and inter-harmonic detection with low computing complexity, and has good
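
    The sparsity property the scheme relies on is easy to verify numerically: a power-system waveform containing a fundamental and a few harmonics concentrates essentially all of its energy in a handful of DFT bins. A small Python sketch with an illustrative signal (not the paper's data):

```python
import numpy as np

fs, f0, N = 3200.0, 50.0, 256        # sample rate, fundamental, samples (illustrative)
t = np.arange(N) / fs
# Fundamental plus 3rd and 5th harmonics -- a typical power-system waveform.
x = (np.sin(2 * np.pi * f0 * t)
     + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)
     + 0.1 * np.sin(2 * np.pi * 5 * f0 * t))

mag = np.abs(np.fft.rfft(x))          # DFT magnitudes (one-sided)
dominant = np.sort(mag)[::-1]         # largest coefficients first
# Fraction of spectral energy captured by the 6 largest bins:
energy_top6 = np.sum(dominant[:6] ** 2) / np.sum(mag ** 2)
```

With the frequencies chosen to fall exactly on DFT bins (50 Hz lands on bin 4 at this rate and length), the top few bins carry virtually all of the energy, which is exactly the sparsity that makes compressed sampling below the Nyquist rate feasible.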

  4. Sensitivity Analysis Based on Markovian Integration by Parts Formula

    Directory of Open Access Journals (Sweden)

    Yongsheng Hang

    2017-10-01

    Full Text Available Sensitivity analysis is widely applied in financial risk management and engineering; it describes the variations brought about by changes of parameters. Since the integration by parts technique for Markov chains has been well developed in recent years, in this paper we apply it to the computation of sensitivity and derive closed-form expressions for two commonly used continuous-time Markovian models. By comparison, we conclude that our approach outperforms the existing technique for computing sensitivity on Markovian models.
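
    For a flavour of what a closed-form sensitivity on a Markovian model looks like (a textbook two-state chain, not one of the paper's models): a continuous-time chain with rates a (0 to 1) and b (1 to 0) has stationary probability pi1 = a/(a+b), whose sensitivity to a is b/(a+b)^2; a finite-difference check confirms the formula:

```python
# Two-state CTMC with transition rates a (0 -> 1) and b (1 -> 0).
# Stationary probability of state 1 and its closed-form sensitivity in a.
def pi1(a, b):
    return a / (a + b)

def sensitivity_closed_form(a, b):
    return b / (a + b) ** 2          # d(pi1)/da

def sensitivity_fd(a, b, h=1e-6):
    """Central finite-difference estimate, for cross-checking the formula."""
    return (pi1(a + h, b) - pi1(a - h, b)) / (2 * h)

a, b = 0.3, 0.7                      # illustrative rates
exact = sensitivity_closed_form(a, b)
fd = sensitivity_fd(a, b)
```

The paper's contribution is obtaining such closed forms via integration by parts where naive differentiation or simulation-based estimators would be costlier or noisier.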

  5. Geospatial analysis based on GIS integrated with LADAR.

    Science.gov (United States)

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
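
    The line-of-sight (LOS) step can be sketched as a simple sight-line test over a height grid; the terrain, eye height, and sampling density below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def line_of_sight(height, a, b, eye=2.0):
    """Check LOS over a 2-D height grid between cells a and b.

    Samples the straight segment densely and fails if the terrain ever
    rises above the straight sight line between the two (raised) endpoints.
    """
    (r0, c0), (r1, c1) = a, b
    n = max(abs(r1 - r0), abs(c1 - c0)) * 4 + 1   # dense sampling along the segment
    h0 = height[r0, c0] + eye
    h1 = height[r1, c1] + eye
    for i in range(1, n):
        t = i / n
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        terrain = height[int(round(r)), int(round(c))]
        sight = h0 + t * (h1 - h0)                # height of the sight line here
        if terrain > sight:
            return False
    return True

# Flat terrain with one tall "building" between observer and target (illustrative).
h = np.zeros((10, 10))
h[5, 5] = 20.0
blocked = line_of_sight(h, (5, 0), (5, 9))   # building in the way
clear = line_of_sight(h, (0, 0), (0, 9))     # unobstructed row
```

A LADAR-derived digital surface model plays the role of `height` in practice; path planning then searches over cells with the desired LOS properties.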

  6. Integrated vehicle-based safety systems (IVBSS) : light vehicle platform field operational test data analysis plan.

    Science.gov (United States)

    2009-12-22

    This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progr...

  7. Integrated vehicle-based safety systems (IVBSS) : heavy truck platform field operational test data analysis plan.

    Science.gov (United States)

    2009-11-23

    This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progra...

  8. Performance analysis of IMS based LTE and WIMAX integration architectures

    Directory of Open Access Journals (Sweden)

    A. Bagubali

    2016-12-01

    Full Text Available In the current networking field, much research is ongoing into the integration of different wireless technologies, with the aim of providing uninterrupted connectivity to the user anywhere, at the high data rates that increasing demand requires. Meanwhile, the number of objects connected by a wireless interface, such as smart devices, industrial machines and smart homes, is increasing dramatically due to the evolution of cloud computing and Internet of Things technology. This paper begins with the challenges involved in such integrations and then explains the role of different couplings and different architectures. It also proposes further improvements to the LTE and WiMAX integration architectures to provide seamless vertical handover and flexible quality of service for voice, video and multimedia services over IP networks, with mobility management supported by IMS networks. Various parameters, such as handover delay, signalling cost and packet loss, are evaluated, and the performance of the interworking architecture is analysed from the simulation results. Finally, it is concluded that the cross-layer scenario outperforms the non-cross-layer scenario.

  9. Integrating forest inventory and analysis data into a LIDAR-based carbon monitoring system

    Science.gov (United States)

    Kristofer D. Johnson; Richard Birdsey; Andrew O Finley; Anu Swantaran; Ralph Dubayah; Craig Wayson; Rachel. Riemann

    2014-01-01

    Forest Inventory and Analysis (FIA) data may be a valuable component of a LIDAR-based carbon monitoring system, but integration of the two observation systems is not without challenges. To explore integration methods, two wall-to-wall LIDAR-derived biomass maps were compared to FIA data at both the plot and county levels in Anne Arundel and Howard Counties in Maryland...

  10. Research on Integrated Analysis Method for Equipment and Tactics Based on Intervention Strategy Discussion

    Institute of Scientific and Technical Information of China (English)

    陈超; 张迎新; 毛赤龙

    2012-01-01

    As the complexity of information warfare increases, its intervention strategy needs to be designed in an integrated environment. However, current research tends to break the internal relation between equipment and tactics, making it difficult to meet the requirements of their integrated analysis. In this paper, the status quo of research on the integrated analysis of equipment and tactics is discussed first, some shortcomings of current methods are then summarized, and an evolvement mechanism of the integrated analysis of equipment and tactics is finally given. Based on these, a framework of integrated analysis is proposed. The method's effectiveness is validated by an example.

  11. An Analysis of Delay-based and Integrator-based Sequence Detectors for Grid-Connected Converters

    DEFF Research Database (Denmark)

    Khazraj, Hesam; Silva, Filipe Miguel Faria da; Bak, Claus Leth

    2017-01-01

    Detecting and separating positive- and negative-sequence components of the grid voltage or current is of vital importance in the control of grid-connected power converters, HVDC systems, etc. To this end, several techniques have been proposed in recent years. These techniques can be broadly classified into two main classes: integrator-based techniques and delay-based techniques. The complex-coefficient filter-based technique, the dual second-order generalized integrator-based method, and the multiple reference frame approach are the main members of the integrator-based sequence detectors; the delay-signal cancellation operators are the main members of the delay-based sequence detectors. The aim of this paper is to provide a theoretical and experimental comparative study between integrator- and delay-based sequence detectors. The theoretical analysis is conducted based on small-signal modelling.

  12. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    International Nuclear Information System (INIS)

    Festa, G; Andreani, C; Pietropaolo, A; Grazzi, F; Scherillo, A; Barzagli, E; Sutton, L F; Bognetti, L; Bini, A; Schooneveld, E

    2013-01-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of the approach, using multiple and integrated neutron-based techniques, for musical instruments. Such samples, in the broad scenario of cultural heritage, constitute an exciting research field and offer an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics. (paper)

  13. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

    Full Text Available This paper investigates a framework for the real-time formation of autonomous vehicles using a potential field and a variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by a potential field can meet the former requirement, but they result in a nonlinear system whose stability analysis is difficult. Our stability analysis is carried out in the error dynamic system: transforming coordinates from the inertial frame to the body frame lets the analysis focus on the structure instead of particular coordinates, after which the Jacobian of the reduced system can be calculated. It can be proved that the formation is stable at the equilibrium point of the error dynamic system under the effect of a damping force. For efficient computation, a variational integrator is introduced, which reduces the integration to solving algebraic equations. The forced Euler-Lagrange equation in discrete form is used to construct a forced variational integrator for vehicles in a potential field and obstacle environment. By applying the forced variational integrator to the computation of vehicle motion, real-time formation of vehicles in an obstacle environment can be implemented. An algorithm based on the forced variational integrator is designed for a leader-follower formation.
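
    A minimal example of a variational integrator is the Störmer-Verlet scheme, shown here for a unit mass in a quadratic potential as a stand-in for the paper's potential-field forces (all parameters illustrative); the hallmark of such integrators is long-run near-conservation of energy:

```python
# Stormer-Verlet step, derivable from the discrete Euler-Lagrange equations,
# for a unit mass in the harmonic potential V(q) = 0.5 * k * q**2.
def verlet(q0, v0, k, dt, steps):
    a = lambda q: -k * q          # force from the potential field
    q, v = q0, v0
    acc = a(q)
    for _ in range(steps):
        q += dt * v + 0.5 * dt * dt * acc
        new_acc = a(q)
        v += 0.5 * dt * (acc + new_acc)
        acc = new_acc
    return q, v

q, v = verlet(1.0, 0.0, k=1.0, dt=0.01, steps=1000)   # integrate to t = 10
energy = 0.5 * v * v + 0.5 * q * q                    # stays near the initial 0.5
```

Unlike a generic Runge-Kutta step, the discrete-mechanics derivation makes the energy error bounded rather than secular, which is what makes such integrators attractive for long-running real-time formation control.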

  14. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  15. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
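
    The data-reduction step (PCA) that both Automics records mention can be sketched in a few lines on synthetic "spectra"; the matrix sizes, latent structure, and noise level below are illustrative, not Automics' internals:

```python
import numpy as np

# PCA by SVD on a toy spectral matrix (rows = samples, columns = bins),
# mimicking the data-reduction step applied to binned 1D NMR spectra.
rng = np.random.default_rng(0)
scores_true = rng.normal(size=(20, 2))       # two latent metabolic factors
loadings = rng.normal(size=(2, 100))         # 100 spectral bins
X = scores_true @ loadings + 0.01 * rng.normal(size=(20, 100))

Xc = X - X.mean(axis=0)                      # mean-center each bin
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)        # variance share per component
pc_scores = U[:, :2] * s[:2]                 # sample coordinates on PC1/PC2
```

With two planted factors, the first two components recover nearly all the variance; in a real metabonomics workflow the `pc_scores` plot is what separates, say, diabetic from control profiles.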

  16. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data.

    Science.gov (United States)

    Tomescu, Oana A; Mattanovich, Diethard; Thallinger, Gerhard G

    2014-01-01

    Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis (CIA) is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition (GSVD) has shown its potential in the analysis of two transcriptome data sets. Integrative biclustering (IBC) applies biclustering to gene and protein data. Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms, in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life cycle stage specific GO terms: sporozoites are associated with transcription and transport; merozoites with entry into the host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages
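
    The computational core of co-inertia analysis is an SVD of the cross-covariance between the two column-centred tables measured on the same samples; the sketch below uses random stand-in data (not the Plasmodium measurements) to show the shape of the computation:

```python
import numpy as np

# Core of co-inertia analysis (CIA): SVD of the cross-covariance between
# an mRNA table X and a protein table Y observed on the same samples.
# Data are random stand-ins for the six life cycle stages.
rng = np.random.default_rng(1)
stages = 6
X = rng.normal(size=(stages, 30))                          # 30 transcript features
Y = 0.5 * X[:, :20] + 0.1 * rng.normal(size=(stages, 20))  # correlated proteins

Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
# Paired co-inertia axes come from the singular vectors of the cross-covariance:
U, s, Vt = np.linalg.svd(Yc.T @ Xc / stages, full_matrices=False)
cov_explained = s ** 2 / np.sum(s ** 2)   # share of shared co-structure per axis
```

Projecting the samples onto the leading axis pair is what produces the 2D stage plot described in the abstract; GSVD and IBC decompose the same paired tables differently.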

  17. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    Science.gov (United States)

    Yang, F.; Dong, Z. H.

    2017-10-01

    The Integrated Information Decision System (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software and providing various services, such as email, short messages, drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during the setup, configuration, and operation stages, which seriously affects usage. Because the errors are varied and may occur in different operation phases, stages, TCP/IP protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, providing strong theoretical and technical support for the running and communication of IIDS.

  18. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  19. Human reliability analysis of performing tasks in plants based on fuzzy integral

    International Nuclear Information System (INIS)

    Washio, Takashi; Kitamura, Yutaka; Takahashi, Hideaki

    1991-01-01

    Effective improvement of the human working conditions in nuclear power plants can enhance operational safety. Human reliability analysis (HRA) provides a methodological basis for such improvement, based on the evaluation of human reliability under various working conditions. This study investigates some difficulties of human reliability analysis using conventional linear models and recent fuzzy integral models, and provides solutions to those difficulties. The following practical features of the proposed methods are confirmed in comparison with the conventional methods: (1) applicability to various types of tasks; (2) capability of evaluating complicated dependencies among working condition factors; (3) a priori human reliability evaluation based on a systematic task analysis of human action processes; (4) a conversion scheme from indices representing human reliability to probability. (author)
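
    Fuzzy-integral HRA models typically aggregate working-condition factor scores with a Choquet integral over a fuzzy measure, which is what lets them capture the dependencies among factors that linear models miss. The factors and measure below are made-up illustrations; a real study would elicit them from experts:

```python
# Choquet (fuzzy) integral of factor scores f with respect to a fuzzy
# measure mu defined on subsets of the factors. Non-additivity of mu
# (e.g. mu({stress, training}) > mu({stress}) + mu({training}) is allowed)
# is what encodes interaction between working-condition factors.
def choquet(f, mu):
    items = sorted(f, key=f.get)              # factors, ascending by score
    total, prev = 0.0, 0.0
    for i, x in enumerate(items):
        coalition = frozenset(items[i:])      # factors scoring >= f[x]
        total += (f[x] - prev) * mu[coalition]
        prev = f[x]
    return total

factors = {"stress": 0.6, "training": 0.9, "interface": 0.3}  # illustrative scores
mu = {
    frozenset(): 0.0,
    frozenset({"stress"}): 0.3,
    frozenset({"training"}): 0.4,
    frozenset({"interface"}): 0.2,
    frozenset({"stress", "training"}): 0.8,
    frozenset({"stress", "interface"}): 0.5,
    frozenset({"training", "interface"}): 0.6,
    frozenset({"stress", "training", "interface"}): 1.0,
}
score = choquet(factors, mu)
```

The aggregate always lies between the smallest and largest factor score; feature (4) in the abstract, converting such an index to a probability, would be a further calibration step.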

  20. Integrating model checking with HiP-HOPS in model-based safety analysis

    International Nuclear Information System (INIS)

    Sharvia, Septavera; Papadopoulos, Yiannis

    2015-01-01

    The ability to perform an effective and robust safety analysis on the design of modern safety-critical systems is crucial. Model-based safety analysis (MBSA) has been introduced in recent years to support the assessment of complex system design by focusing on the system model as the central artefact, and by automating the synthesis and analysis of failure-extended models. Model checking and failure logic synthesis and analysis (FLSA) are two prominent MBSA paradigms. Extensive research has placed emphasis on the development of these techniques, but discussion on their integration remains limited. In this paper, we propose a technique in which model checking and Hierarchically Performed Hazard Origin and Propagation Studies (HiP-HOPS), an advanced FLSA technique, can be applied synergistically with benefit for the MBSA process. The application of the technique is illustrated through an example of a brake-by-wire system. - Highlights: • We propose technique to integrate HiP-HOPS and model checking. • State machines can be systematically constructed from HiP-HOPS. • The strengths of different MBSA techniques are combined. • Demonstrated through modeling and analysis of brake-by-wire system. • Root cause analysis is automated and system dynamic behaviors analyzed and verified.

  1. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.

  2. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    Science.gov (United States)

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high-fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. Such a knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, a statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include research into the molecular mode of action of disease, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects of novel small molecule compounds, and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
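
    Gene/protein list enrichment analysis of the kind mentioned here is conventionally a hypergeometric over-representation test; the self-contained sketch below uses illustrative counts and is not MetaCore's actual implementation:

```python
from math import comb

# Over-representation p-value for one annotation term: the probability of
# drawing >= k term-annotated items in a list of n, sampled without
# replacement from a population of N containing K annotated items.
def enrichment_p(N, K, n, k):
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Illustrative numbers: 40 of 1000 background proteins carry the term,
# and our list of 50 proteins contains 10 of them.
p = enrichment_p(N=1000, K=40, n=50, k=10)
```

Here the expected count under chance is only 2, so observing 10 yields a very small p-value; in practice such p-values are further corrected for testing many terms at once.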

  3. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    Science.gov (United States)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based workstations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  4. A methodology for developing high-integrity knowledge base using document analysis and ECPN matrix analysis with backward simulation

    International Nuclear Information System (INIS)

    Park, Joo Hyun

    1999-02-01

    When transitions occur in large systems such as nuclear power plants (NPPs) or industrial process plants, it is often difficult to diagnose them. Various computer-based operator-aiding systems have been developed to help operators diagnose plant transitions. In the development of knowledge base systems such as operator-aiding systems, knowledge acquisition and knowledge base verification are the core activities. This dissertation describes a knowledge acquisition method and a knowledge base verification method for developing high-integrity knowledge base systems for NPP expert systems. Knowledge acquisition is one of the most difficult and time-consuming activities in developing knowledge base systems. There are two kinds of knowledge acquisition methods with respect to knowledge sources. One is acquisition from human experts; this method, however, is not adequate for acquiring the knowledge of NPP expert systems because the number of experts is not sufficient. In this work, we propose a novel knowledge acquisition method based on document analysis, through which the knowledge base can be built correctly, rapidly, and partially automatically. This method is especially useful when it is difficult to find domain experts. The reliability of knowledge base systems depends on the quality of their knowledge base. Petri Nets have been used to verify knowledge bases owing to their formal outputs. Methods using Petri Nets, however, are difficult to apply to large and complex knowledge bases because the net becomes very large and complex. Also, with Petri Nets, it is difficult to find the proper input patterns that make anomalies occur. To overcome this difficulty, in this work, anomaly candidate detection methods are developed based on Extended CPN (ECPN) matrix analysis. This work also defines the backward simulation of CPN to find compact input patterns for anomaly detection, which starts simulation from the anomaly candidates

  5. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine

    International Nuclear Information System (INIS)

    Nery, Emilia Witkowska; Kubota, Lauro T.

    2016-01-01

    The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles, and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled from readily available materials such as paper, wax, lamination sheets, and bleach. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue composed of electrodes sensitive to sodium, calcium, and ammonia and a cross-sensitive, anion-selective electrode was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict pH and, in part, the alcohol content of the tested beers. In the next step, sample volume was minimized by the use of paper sample pads and measurement under flow conditions. In order to test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca²⁺/Mg²⁺, K⁺/Na⁺) electrodes was applied to the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only describes new

  6. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine

    Energy Technology Data Exchange (ETDEWEB)

    Nery, Emilia Witkowska, E-mail: ewitkowskanery@ichf.edu.pl [Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, P.O. Box 6154, 13084-971 Campinas, SP (Brazil); National Institute of Science and Technology in Bioanalytics, Institute of Chemistry – UNICAMP, P.O. Box 6154, Campinas (Brazil); Kubota, Lauro T. [Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, P.O. Box 6154, 13084-971 Campinas, SP (Brazil); National Institute of Science and Technology in Bioanalytics, Institute of Chemistry – UNICAMP, P.O. Box 6154, Campinas (Brazil)

    2016-04-28

    The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles, and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled from readily available materials such as paper, wax, lamination sheets, and bleach. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue composed of electrodes sensitive to sodium, calcium, and ammonia and a cross-sensitive, anion-selective electrode was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict pH and, in part, the alcohol content of the tested beers. In the next step, sample volume was minimized by the use of paper sample pads and measurement under flow conditions. In order to test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca²⁺/Mg²⁺, K⁺/Na⁺) electrodes was applied to the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only

  7. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Full Text Available Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, physical reasonableness, and reliability of PCA in dealing with redundant objectives are verified by the typical DTLZ5 test function and a multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into an aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth, and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is effective and that the multi-point design requirements of the passenger aircraft are met. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
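    The redundant-objective idea behind the PCA step can be sketched on toy data (hypothetical samples, not the paper's airfoil or aircraft cases): if one objective is nearly a linear function of another, the leading principal components of the sampled objective values capture almost all of the variance, flagging the extra objective as a candidate for conversion into a constraint.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy samples of 50 candidate designs with 3 objectives; f2 is almost a
# linear copy of f0, i.e. a redundant objective (invented data).
f0 = rng.random(50)
f1 = rng.random(50)
f2 = 2.0 * f0 + 0.01 * rng.random(50)
F = np.column_stack([f0, f1, f2])

# Centre the objective matrix and run PCA via SVD.
Fc = F - F.mean(axis=0)
_, s, Vt = np.linalg.svd(Fc, full_matrices=False)

# Fraction of variance explained by each principal component.
explained = s**2 / np.sum(s**2)

# Two components already capture nearly all the variance, so one of the
# three objectives is redundant and can be treated as a constraint.
print(np.sum(explained[:2]) > 0.99)  # -> True
```

    In the paper's setting the same diagnosis is made on Pareto-front samples of the real objectives; here the two-component threshold is only an illustrative cut-off.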

  8. WebGimm: An integrated web-based platform for cluster analysis, functional analysis, and interactive visualization of results.

    Science.gov (United States)

    Joshi, Vineet K; Freudenberg, Johannes M; Hu, Zhen; Medvedovic, Mario

    2011-01-17

    Cluster analysis methods have been extensively researched, but the adoption of new methods is often hindered by technical barriers in their implementation and use. WebGimm is a free cluster analysis web-service and an open-source, general-purpose clustering web-server infrastructure designed to facilitate easy deployment of integrated cluster analysis servers based on clustering and functional annotation algorithms implemented in R. Integrated functional analyses and interactive browsing of both clustering structure and functional annotations provide a complete analytical environment for cluster analysis and interpretation of results. The Java Web Start client-based interface is modeled after the familiar cluster/treeview packages, making its use intuitive to a wide array of biomedical researchers. For biomedical researchers, WebGimm provides an avenue to access state-of-the-art clustering procedures. For bioinformatics methods developers, WebGimm offers a convenient avenue to deploy their newly developed clustering methods. The WebGimm server, software, and manuals can be freely accessed at http://ClusterAnalysis.org/.

  9. Thermodynamic analysis and optimization of IT-SOFC-based integrated coal gasification fuel cell power plants

    NARCIS (Netherlands)

    Romano, M.C.; Campanari, S.; Spallina, V.; Lozza, G.

    2011-01-01

    This work discusses the thermodynamic analysis of integrated gasification fuel cell plants, where a simple-cycle gas turbine works in a hybrid cycle with a pressurized intermediate-temperature solid oxide fuel cell (SOFC), integrated with a coal gasification and syngas cleanup island and a bottoming

  10. Network Based Integrated Analysis of Phenotype-Genotype Data for Prioritization of Candidate Symptom Genes

    Directory of Open Access Journals (Sweden)

    Xing Li

    2014-01-01

    Full Text Available Background. Symptoms and signs (symptoms in brief) are the essential clinical manifestations for individualized diagnosis and treatment in traditional Chinese medicine (TCM). To gain insights into the molecular mechanism of symptoms, we develop a computational approach to identify the candidate genes of symptoms. Methods. This paper presents a network-based approach for the integrated analysis of multiple phenotype-genotype data sources and the prediction of prioritized genes for the associated symptoms. The method first calculates the similarities between symptoms and diseases based on the symptom-disease relationships retrieved from the PubMed bibliographic database. Then the disease-gene associations and protein-protein interactions are utilized to construct a phenotype-genotype network. The PRINCE algorithm is finally used to rank the potential genes for the associated symptoms. Results. The proposed method produces a reliable gene ranking list with an AUC (area under curve) of 0.616 in classification. Some novel genes, such as CALCA, ESR1, and MTHFR, were predicted to be associated with headache symptoms; they are not recorded in the benchmark data set but have been reported in recently published literature. Conclusions. Our study demonstrated that integrating phenotype-genotype relationships into a complex network framework provides an effective approach to identify candidate genes of symptoms.
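    PRINCE-style network propagation can be sketched minimally as follows (toy five-gene network and a single seed score; the real method runs on a genome-scale protein-protein interaction network with disease-similarity priors). A known association is propagated over the degree-normalised network with a restart term, and genes are ranked by the converged score.

```python
import numpy as np

# Toy protein-protein interaction network (5 genes, invented edges).
W = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

# Degree-normalise: W' = D^{-1/2} W D^{-1/2}, as in PRINCE-style propagation.
d = W.sum(axis=1)
Dinv = np.diag(1.0 / np.sqrt(d))
Wn = Dinv @ W @ Dinv

# Prior knowledge: gene 0 is a known symptom-associated gene.
Y = np.array([1.0, 0.0, 0.0, 0.0, 0.0])

# Iterate F <- alpha * Wn @ F + (1 - alpha) * Y until convergence.
alpha = 0.8
F = Y.copy()
for _ in range(100):
    F = alpha * (Wn @ F) + (1 - alpha) * Y

# Rank genes by propagated score: neighbours of the seed score higher
# than genes that are farther away in the network.
ranking = np.argsort(-F)
print(ranking[0])  # -> 0 (the seed gene itself ranks first)
```

    With alpha below 1 the iteration converges, and candidate genes close to the seed in the interaction network inherit part of its score, which is the basis for the prioritization described above.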

  11. Nature-based integration

    DEFF Research Database (Denmark)

    Pitkänen, Kati; Oratuomi, Joose; Hellgren, Daniela

    Increased attention to, and careful planning of, the integration of migrants into Nordic societies is ever more important. Nature-based integration is a new solution to respond to this need. This report presents the results of a Nordic survey and workshop and illustrates current practices of nature-based integration through case study descriptions from Denmark, Sweden, Norway and Finland. Across the Nordic countries, several practical projects and initiatives have been launched to promote the benefits of nature in integration, and there is also growing academic interest in the topic. The Nordic countries have the potential of becoming real forerunners in nature-based integration even at the global scale.

  12. Driving pattern analysis of Nordic region based on the national travel surveys for electric vehicle integration

    DEFF Research Database (Denmark)

    Liu, Zhaoxi; Wu, Qiuwei; Christensen, Linda

    2015-01-01

    EVs show great potential to cope with the intermittency of renewable energy sources (RES) and provide the demand-side flexibility required by the smart grid. On the other hand, EVs will increase electricity consumption, and large-scale integration of EVs will probably have substantial impacts to the power system. This paper presents a methodology to transform the driving behavior of persons into that of cars in order to analyze the driving pattern of electric vehicles (EVs) based on the National Travel Surveys. In the proposed methodology, a statistical process is used to obtain the driving behavior of cars by grouping the survey respondents according to the driving license number and car number and mapping the households with similar characteristics. The proposed methodology was used to carry out the driving pattern analysis in the Nordic region. The detailed driving requirements and the charging...

  13. CAD-Based Modeling of Advanced Rotary Wing Structures for Integrated 3-D Aeromechanics Analysis

    Science.gov (United States)

    Staruk, William

    This dissertation describes the first comprehensive use of integrated 3-D aeromechanics modeling, defined as the coupling of 3-D solid finite element method (FEM) structural dynamics with 3-D computational fluid dynamics (CFD), for the analysis of a real helicopter rotor. The development of this new methodology (a departure from how rotor aeroelastic analysis has been performed for 40 years), its execution on a real rotor, and the fundamental understanding of aeromechanics gained from it, are the key contributions of this dissertation. This work also presents the first CFD/CSD analysis of a tiltrotor in edgewise flight, revealing many of its unique loading mechanisms. The use of 3-D FEM, integrated with a trim solver and aerodynamics modeling, has the potential to enhance the design of advanced rotors by overcoming fundamental limitations of current generation beam-based analysis tools and offering integrated internal dynamic stress and strain predictions for design. Two primary goals drove this research effort: 1) developing a methodology to create 3-D CAD-based brick finite element models of rotors including multibody joints, controls, and aerodynamic interfaces, and 2) refining X3D, the US Army's next generation rotor structural dynamics solver featuring 3-D FEM within a multibody formulation with integrated aerodynamics, to model a tiltrotor in the edgewise conversion flight regime, which drives critical proprotor structural loads. Prior tiltrotor analysis has primarily focused on hover aerodynamics with rigid blades or forward flight whirl-flutter stability with simplified aerodynamics. The first goal was met with the development of a detailed methodology for generating multibody 3-D structural models, starting from CAD geometry, continuing to higher-order hexahedral finite element meshing, to final assembly of the multibody model by creating joints, assigning material properties, and defining the aerodynamic interface. Several levels of verification and

  14. Establishing community-based integrated care for elderly patients through interprofessional teamwork: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Asakawa T

    2017-10-01

    Full Text Available Tomohiro Asakawa,1 Hidenobu Kawabata,1 Kengo Kisa,2 Takayoshi Terashita,3 Manabu Murakami,4 Junji Otaki1 1Department of Medical Education and General Medicine, Graduate School of Medicine, Hokkaido University, Sapporo, 2Kutchan-Kosei General Hospital, Kutchan, Hokkaido, 3Graduate School of Radiological Technology Gunma Prefectural College of Health Sciences, Kamioki-machi, Maebashi, Gunma, 4International Relations Office, Graduate School of Medicine, Hokkaido University, Sapporo, Hokkaido, Japan Background: Working in multidisciplinary teams is indispensable for ensuring high-quality care for elderly people in Japan’s rapidly aging society. However, health professionals often experience difficulty collaborating in practice because of their different educational backgrounds, ideas, and the roles of each profession. In this qualitative descriptive study, we reveal how to build interdisciplinary collaboration in multidisciplinary teams. Methods: Semi-structured interviews were conducted with a total of 26 medical professionals, including physicians, nurses, public health nurses, medical social workers, and clerical personnel. Each participant worked as a team member of community-based integrated care. The central topic of the interviews was what the participants needed to establish collaboration during the care of elderly residents. Each interview lasted for about 60 minutes. All the interviews were recorded, transcribed verbatim, and subjected to content analysis. Results: The analysis yielded the following three categories concerning the necessary elements of building collaboration: 1) two types of meeting configuration; 2) building good communication; and 3) effective leadership. The two meetings described in the first category – “community care meetings” and “individual care meetings” – were aimed at bringing together the disciplines and discussing individual cases, respectively. Building good communication referred to the activities

  15. A new method to identify the foot of continental slope based on an integrated profile analysis

    Science.gov (United States)

    Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin

    2017-06-01

    A new method is proposed to automatically identify the foot of the continental slope (FOS) based on the integrated analysis of topographic profiles. Based on the extremum points of the second derivative and the Douglas-Peucker algorithm, the method simplifies the topographic profiles and then calculates the second derivative of the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. Meanwhile, multiple identification methods are proposed to determine the FOS points, including the gradient, water depth, and second derivative values of the data points, as well as the concavity and convexity, continuity, and segmentation of the topographic profiles. This method can comprehensively and intelligently analyze the topographic profiles and their derived slopes, second derivatives, and D-P profiles, on the basis of which it is capable of analyzing the essential properties of every single data point in the profile. Furthermore, it is proposed to remove the concave points of the curve and, in addition, to implement six FOS judgment criteria.
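    The Douglas-Peucker simplification used in the profile analysis can be sketched as follows (toy depth profile; the actual method combines D-P simplification with second-derivative extrema and the six FOS judgment criteria mentioned above):

```python
import math

def douglas_peucker(points, eps):
    """Simplify a polyline, keeping points that deviate more than eps
    from the chord between the first and last point (recursive form)."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    norm = math.hypot(x2 - x1, y2 - y1)
    # Perpendicular distance of each interior point to the chord.
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        x0, y0 = points[i]
        d = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    # Split at the farthest point and simplify each half recursively.
    left = douglas_peucker(points[: idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right

# Toy depth profile (distance, depth): a flat shelf, a steep slope, and
# a flat abyssal plain; the simplification keeps the two slope breaks.
profile = [(0, 0), (1, 0), (2, 0), (3, -3), (4, -6), (5, -6), (6, -6)]
print(douglas_peucker(profile, eps=0.1))
# -> [(0, 0), (2, 0), (4, -6), (6, -6)]
```

    The retained break points (2, 0) and (4, -6) are the kind of candidate locations at which second-derivative and concavity tests would then be applied to pick the FOS.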

  16. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine.

    Science.gov (United States)

    Nery, Emilia Witkowska; Kubota, Lauro T

    2016-04-28

    The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles, and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled from readily available materials such as paper, wax, lamination sheets, and bleach. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue composed of electrodes sensitive to sodium, calcium, and ammonia and a cross-sensitive, anion-selective electrode was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict pH and, in part, the alcohol content of the tested beers. In the next step, sample volume was minimized by the use of paper sample pads and measurement under flow conditions. In order to test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca²⁺/Mg²⁺, K⁺/Na⁺) electrodes was applied to the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only describes new

  17. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm, applied after geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor with locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.

  18. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm, applied after geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor with locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of the proposed method.

  19. iTRAQ-Based Proteomics Analysis and Network Integration for Kernel Tissue Development in Maize

    Science.gov (United States)

    Dong, Yongbin; Wang, Qilei; Du, Chunguang; Xiong, Wenwei; Li, Xinyu; Zhu, Sailan; Li, Yuling

    2017-01-01

    Grain weight is one of the most important yield components and a developmentally complex structure comprised of two major compartments (endosperm and pericarp) in maize (Zea mays L.); however, very little is known concerning the coordinated accumulation of the numerous proteins involved. Herein, we used an isobaric tags for relative and absolute quantitation (iTRAQ)-based comparative proteomic method to analyze the dynamic proteomics of endosperm and pericarp during grain development. In total, 9539 proteins were identified for both compartments at four development stages, among which 1401 proteins were non-redundant, 232 proteins were specific to pericarp, and 153 proteins were specific to endosperm. A functional annotation of the identified proteins revealed the importance of metabolic and cellular processes, and of binding and catalytic activities, for tissue development. Three and 76 proteins involved in 49 Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways were integrated for the specific endosperm and pericarp proteins, respectively, reflecting their complex metabolic interactions. In addition, four proteins with important functions and different expression levels were chosen for gene cloning and expression analysis. Different concordance between mRNA level and protein abundance was observed across proteins, stages, and tissues, as in previous research. These results provide useful information for understanding the developmental mechanisms of grain development in maize. PMID:28837076

  20. GAUSS Market Analysis for Integrated Satellite Communication and Navigation Location Based services

    Science.gov (United States)

    Di Fazio, Antonella; Dricot, Fabienne; Tata, Francesco

    2003-07-01

    The demand for mobile information services coupled with positioning technologies for delivering value-added services that depend on a user's location has increased rapidly in recent years. In particular, services and applications related to improved mobility safety and transport efficiency look very attractive. Solutions for location services vary with respect to positioning accuracy, the technical infrastructure required, and the associated investment in terminals and networks. From the analysis of the state of the art, it emerges that various technologies are currently available on the European market, while the mobile industry is gearing up to launch a wide variety of location services such as tracking, alarming, and locating. Nevertheless, when addressing safety-of-life as well as security applications, severe hurdles arise with existing technologies. Existing navigation (e.g. GPS) and communication systems are not able to completely satisfy the needs and requirements of safety-of-life-critical applications. As a matter of fact, the GPS system's main weakness today is its lack of integrity, i.e. its inability to warn users of a malfunction in a reasonable time, while the other positioning techniques do not provide satisfactory accuracy either, and terrestrial communication networks are not capable of coping with stringent requirements in terms of service reliability and coverage. In this context, GAUSS proposes an innovative satellite-based solution using novel technology and effective tools for addressing mobility challenges in a cost-efficient manner, improving safety and effectiveness. GAUSS (Galileo And UMTS Synergetic System) is a Research and Technological Development project co-funded by the European Commission within the frame of the 5th IST Programme. The project lasted two years and was successfully completed in November 2002. GAUSS's key concept is the integration of Satellite Navigation GNSS and UMTS communication technology, to

  1. Proportional-integral controller based small-signal analysis of hybrid distributed generation systems

    International Nuclear Information System (INIS)

    Ray, Prakash K.; Mohanty, Soumya R.; Kishor, Nand

    2011-01-01

    Research highlights: → We aim to minimize the frequency deviation in a system integrating energy resources such as offshore wind, photovoltaic (PV), fuel cell (FC), and diesel engine generator (DEG) units along with energy storage elements such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). → Further, an ultracapacitor (UC) as an alternative energy storage element and a proportional-integral (PI) controller are addressed in order to achieve improvements in the frequency deviation profiles. → A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high-voltage direct current (HVDC) link and a high-voltage alternating current (HVAC) line. → In the study, both qualitative and quantitative analysis reflects the improvements in the frequency deviation profiles with the use of an ultracapacitor (UC) as the energy storage element. -- Abstract: Large variations in wind speed and unpredictable solar radiation cause remarkable fluctuations of output power in offshore wind and photovoltaic systems respectively, which leads to large deviations in the system frequency. In this context, to minimize the deviation in frequency, this paper presents the integration of different energy resources such as offshore wind, photovoltaic (PV), fuel cell (FC), and diesel engine generator (DEG) units along with energy storage elements such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). Further, an ultracapacitor (UC) as an alternative energy storage element and a proportional-integral (PI) controller are addressed in order to achieve improvements in the frequency deviation profiles. A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high-voltage direct current (HVDC) link and a high-voltage alternating current (HVAC) line. Frequency deviations for different isolated hybrid systems are presented graphically as well as in terms of
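    The role of the PI controller in removing steady-state frequency error can be sketched with a minimal first-order frequency model (all parameters here are assumed for illustration, not taken from the paper's hybrid system): a step load disturbance pulls the frequency down, and the integral term drives the deviation back to zero.

```python
# Minimal sketch: power-system frequency model M * d(df)/dt =
# dP_gen - dP_load - D * df, with a PI controller setting dP_gen
# from the frequency deviation df. All constants are assumed values.
M, D = 0.2, 0.01          # inertia and damping (per-unit, assumed)
Kp, Ki = 0.5, 2.0         # PI gains (assumed)
dt, steps = 0.01, 5000    # 50 s of simulated time, forward Euler

df = 0.0                  # frequency deviation
integ = 0.0               # integral of the control error
dP_load = 0.1             # step load disturbance of 0.1 pu

for _ in range(steps):
    err = -df                          # drive the deviation to zero
    integ += err * dt
    dP_gen = Kp * err + Ki * integ     # PI control action
    df += dt * (dP_gen - dP_load - D * df) / M

# The integral term absorbs the load step, so the steady-state
# frequency deviation is (near) zero despite the disturbance.
print(abs(df) < 1e-3)  # -> True
```

    A proportional-only controller would leave a constant offset proportional to the load step; the integral gain is what restores the nominal frequency, which is the behaviour the abstract's comparative assessment examines across storage options.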

  2. Integration Strategy Is a Key Step in Network-Based Analysis and Dramatically Affects Network Topological Properties and Inferring Outcomes

    Science.gov (United States)

    Jin, Nana; Wu, Deng; Gong, Yonghui; Bi, Xiaoman; Jiang, Hong; Li, Kongning; Wang, Qianghu

    2014-01-01

    An increasing number of experiments have been designed to detect intracellular and intercellular molecular interactions. Based on these molecular interactions (especially protein interactions), molecular networks have been built for use in several typical applications, such as the discovery of new disease genes and the identification of drug targets and molecular complexes. Because the data are incomplete and a considerable number of false-positive interactions exist, protein interactions from different sources are commonly integrated in network analyses to build a stable molecular network. Although various types of integration strategies are applied in current studies, the topological properties of the networks produced by these different integration strategies, and especially the typical applications based on them, have not been rigorously evaluated. In this paper, systematic analyses were performed to evaluate 11 frequently used methods spanning two types of integration strategy: empirical and machine learning methods. The topological properties of the networks from these different integration strategies were found to differ significantly. Moreover, these networks were found to dramatically affect the outcomes of typical applications, such as disease gene prediction, drug target detection, and molecular complex identification. The analysis presented in this paper could provide an important basis for future network-based biological research. PMID:25243127
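    How the integration strategy reshapes network topology can be seen in miniature with two of the simplest empirical strategies (the two toy interaction sources below are invented; the paper evaluates 11 real methods): taking the union of sources keeps every reported edge, while taking the intersection keeps only corroborated ones, giving networks of very different size and degree distribution.

```python
# Two hypothetical protein-interaction sources, as edge sets.
src_a = {("A", "B"), ("B", "C"), ("C", "D")}
src_b = {("A", "B"), ("B", "D")}

# Union: maximal coverage, but unsupported edges survive.
union = src_a | src_b
# Intersection: only edges reported by both sources remain.
inter = src_a & src_b

def degrees(edges):
    """Node degree in an undirected edge set."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

# The two strategies yield networks of very different size, which in
# turn changes downstream results such as disease-gene predictions.
print(len(union), len(inter))  # -> 4 1
```

    Real integration strategies (weighted voting, machine-learning scoring) sit between these two extremes, which is why the paper argues the choice must be evaluated rather than assumed.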

  3. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.
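    The mutual-information idea used above to assess dependence between IES variables can be sketched with a histogram estimator on synthetic data (the two variables and their coupling are invented; the paper works with actual generator outputs and weather data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-ins for two coupled IES variables, e.g. wind power
# output and a correlated demand series (invented coupling).
x = rng.normal(size=20000)
y = 0.9 * x + rng.normal(scale=0.5, size=20000)

def mutual_information(a, b, bins=20):
    """Histogram estimate of I(A;B) in nats from a discretised joint."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of A
    py = pxy.sum(axis=0, keepdims=True)   # marginal of B
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

mi_dep = mutual_information(x, y)
mi_ind = mutual_information(x, rng.normal(size=20000))

# Coupled variables share far more information than independent ones,
# which is the criterion used to judge the dependence structure.
print(mi_dep > mi_ind)  # -> True
```

    Higher mutual information between, say, power and heat flows indicates stronger coupling, which is exactly the kind of dependence the copula-based model in the abstract is meant to capture.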

  4. PLACE-BASED GREEN BUILDING: INTEGRATING LOCAL ENVIRONMENTAL AND PLANNING ANALYSIS INTO GREEN BUILDING GUIDELINES

    Science.gov (United States)

    This project will develop a model for place-based green building guidelines based on an analysis of local environmental, social, and land use conditions. The ultimate goal of this project is to develop a methodology and model for placing green buildings within their local cont...

  5. Integral equation based stability analysis of short wavelength drift modes in tokamaks

    International Nuclear Information System (INIS)

    Hirose, A.; Elia, M.

    2003-01-01

    Linear stability of electron skin-size drift modes in collisionless tokamak discharges has been investigated in terms of electromagnetic, kinetic integral equations in which neither ions nor electrons are assumed to be adiabatic. A slab-like ion temperature gradient mode persists in such a short wavelength regime. However, toroidicity has a strong stabilizing influence on this mode. In the electron branch, the toroidicity induced skin-size drift mode previously predicted in terms of local kinetic analysis has been recovered. The mode is driven by positive magnetic shear and strongly stabilized for negative shear. The corresponding mixing length anomalous thermal diffusivity exhibits favourable isotope dependence. (author)

  6. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

    The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents incredible potential for the enhancement and expansion of educational and training programs: starting from university students in their early years, passing to new ATLAS PhD students and postdoctoral researchers, up to senior analysers and professors who want to renew their contact with data analysis or to include a friendly yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility to go a step forward in the ATLAS effort to integrate several CERN projects in the field of education and training, developing new computing solutions on the way.

  7. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  8. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  9. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Science.gov (United States)

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
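    A minimal sketch of the transfer-entropy construction described above, for discrete series with history length 1 (a generic plug-in estimator; the significance tests and the weighted-sum aggregation of the paper are omitted):

```python
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """TE(src -> dst) in bits for discrete series, history length 1:
    how much knowing src[t] improves prediction of dst[t+1] beyond dst[t]."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))    # (y_next, y, x)
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((y, x) for _, y, x in triples)
    c_yy = Counter((yn, y) for yn, y, _ in triples)
    c_y = Counter(y for _, y, _ in triples)
    return sum((c / n) * log2((c / c_yx[(y, x)]) / (c_yy[(yn, y)] / c_y[y]))
               for (yn, y, x), c in c_yyx.items())

# y copies x with a one-step lag, so information flows from x to y.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
y = [0] + x[:-1]
print(transfer_entropy(x, y) > transfer_entropy(y, x))    # True
```

    The asymmetry TE(x→y) versus TE(y→x) is what makes the measure directional, unlike correlation or mutual information.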

  10. A solar reserve methodology for renewable energy integration studies based on sub-hourly variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Eduardo; Brinkman, Gregory; Hummon, Marissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lew, Debra

    2012-07-01

    Increasing penetration of wind and solar energy is raising concerns among electric system operators because of the variability and uncertainty associated with these power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of the reserves necessary for high penetrations of photovoltaic power and compares it to the wind-based methodology. The solar reserve methodology was applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included. (orig.)

  11. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    OpenAIRE

    Daniel-Petru GHENCEA; Miron ZAPCIU; Claudiu-Florinel BISU; Elena-Iuliana BOTEANU; Elena-Luminiţa OLTEANU

    2017-01-01

    The paper proposes a prediction model of spindle behavior from the point of view of thermal deformations and vibration levels by highlighting and processing the characteristic equations. This model analysis for shafts with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms - artificial neural networks - fuzzy logic). The paper presents a prediction mode obtaining valid range of values f...

  12. Influencing Factors and Development Trend Analysis of China Electric Grid Investment Demand Based on a Panel Co-Integration Model

    OpenAIRE

    Jinchao Li; Lin Chen; Yuwei Xiang; Jinying Li; Dong Peng

    2018-01-01

    Electric grid investment demand analysis is important for reasonably arranging construction funds for the electric grid and reducing costs. This paper used the panel data of electric grid investment from 23 provinces of China between 2004 and 2016 as samples to analyze the relationships between electric grid investment demand and GDP, population scale, social electricity consumption, installed electrical capacity, and peak load based on co-integration tests. We find that GDP and peak load have pos...

  13. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support

    Directory of Open Access Journals (Sweden)

    García-Alonso Carlos

    2010-09-01

    Full Text Available Abstract Background Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. Method This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods, and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) Bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) Case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases comparisons towards classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by Intraclass Correlation Coefficient in case "1" and kappa in both cases. Results EbCA is a new methodology composed of 6 steps: 1) Data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledge (IK) might be elicited); 5) incorporation of elicited IK in PKB and repeat till a satisfactory solution; 6) post-processing results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for the decision making in both cases. Discussion. This

  14. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support.

    Science.gov (United States)

    Gibert, Karina; García-Alonso, Carlos; Salvador-Carulla, Luis

    2010-09-30

    Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods, and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) Bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) Case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases comparisons towards classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by Intraclass Correlation Coefficient in case "1" and kappa in both cases. EbCA is a new methodology composed of 6 steps: 1) Data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledge (IK) might be elicited); 5) incorporation of elicited IK in PKB and repeat till a satisfactory solution; 6) post-processing results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for the decision making in both cases. This paper presents EbCA and shows the convenience of

  15. Analysis of e-learning implementation readiness based on integrated elr model

    Science.gov (United States)

    Adiyarta, K.; Napitupulu, D.; Rahim, R.; Abdullah, D.; Setiawan, MI

    2018-04-01

    E-learning nowadays has become a requirement for institutions to support their learning activities. To adopt e-learning, an institution requires a well-planned strategy and resources for optimal application. Unfortunately, not all institutions that have used e-learning got the desired results. This study aims to identify the level of readiness for e-learning implementation in institution X. The degree of institutional readiness will determine the success of future e-learning utilization. In addition, institutional readiness measurements are needed to evaluate the effectiveness of strategies in e-learning development. The research method used is a survey with a questionnaire designed based on the integration of 8 best-practice ELR (e-learning readiness) models. The results showed that of the 13 factors of the integrated ELR model being measured, there are 3 readiness factors in the category of not ready and needing a lot of work: human resource (2.57), technology skill (2.38) and content (2.41). In general, e-learning implementation in the institution is in the category of not ready but needing some work (3.27). Therefore, the institution should consider which ELR factors are still not ready and need improvement in the future.
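    The readiness categories quoted above can be reproduced with the cut-off scale that many ELR studies adopt (assumed here to be the Aydin & Tasci intervals with a 3.4 readiness threshold; the questionnaire itself is not reproduced):

```python
def elr_category(score):
    """Map a 1-5 Likert mean onto an e-learning readiness category.
    Cut-offs follow the assessment scale of Aydin & Tasci, which many
    ELR studies adopt (an assumption here, not quoted from this paper)."""
    if score < 2.6:
        return "not ready, needs a lot of work"
    if score < 3.4:
        return "not ready, needs some work"
    if score < 4.2:
        return "ready, but needs a few improvements"
    return "ready, go ahead"

# Factor scores reported in the abstract above:
for name, s in [("human resource", 2.57), ("technology skill", 2.38),
                ("content", 2.41), ("overall", 3.27)]:
    print(name, "->", elr_category(s))
```

    With these cut-offs the overall score of 3.27 lands just below the 3.4 readiness threshold, matching the abstract's "not ready but needs some work" reading.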

  16. Power Loss Analysis for Wind Power Grid Integration Based on Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Al Ameri

    2017-04-01

    Full Text Available The growth of electrical demand increases the need for renewable energy sources, such as wind energy, to meet that need. Electrical power losses are an important factor when wind farm location and size are selected. The capitalized cost of constant power losses during the life of a wind farm can reach high levels. During the operation period, a method to determine whether the losses meet the requirements of the design is greatly needed. This article presents a Simulink simulation of wind farm integration into the grid; the aim is to achieve a better understanding of the impact of wind variation on grid losses. The real power losses are set as a function of the annual variation, considering a Weibull distribution. An analytical method has been used to select the size and placement of a wind farm, taking into account active power loss reduction. It proposes a fast linear model estimation to find the optimal capacity of a wind farm based on DC power flow and graph theory. The results show that the analytical approach is capable of predicting the optimal size and location of wind turbines. Furthermore, it revealed that the annual variation of wind speed can have a strong effect on real power loss calculations. In addition to helping to improve utility efficiency, the proposed method can develop specific designs to speed up the integration of wind farms into grids.
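    How a Weibull wind-speed model feeds expected-value calculations such as annualized loss estimates can be sketched as follows (the shape and scale values are illustrative, not taken from the study):

```python
from math import exp, gamma

def weibull_pdf(v, k, lam):
    """Weibull wind-speed density with shape k and scale lam (m/s)."""
    return (k / lam) * (v / lam) ** (k - 1) * exp(-((v / lam) ** k))

def expected_value(f, k, lam, v_max=60.0, dv=0.01):
    """E[f(V)] for Weibull wind speed V, by the composite trapezoidal rule."""
    vs = [i * dv for i in range(int(v_max / dv) + 1)]
    ys = [f(v) * weibull_pdf(v, k, lam) for v in vs]
    return dv * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

k, lam = 2.0, 8.0       # illustrative shape/scale, not values from the study
mean_speed = expected_value(lambda v: v, k, lam)
# The numerical mean should match the closed form lam * Gamma(1 + 1/k):
print(mean_speed, lam * gamma(1 + 1 / k))    # both are about 7.09 m/s
```

    Replacing `lambda v: v` with a turbine power curve, or with a loss model evaluated at the corresponding injection, gives the annualized expectation that loss calculations of this kind rely on.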

  17. Traffic Multiresolution Modeling and Consistency Analysis of Urban Expressway Based on Asynchronous Integration Strategy

    Directory of Open Access Journals (Sweden)

    Liyan Zhang

    2017-01-01

    Full Text Available The paper studies a multiresolution traffic flow simulation model of an urban expressway. Firstly, compared with a two-level hybrid model, a three-level multiresolution hybrid model has been chosen. Then, the multiresolution simulation framework and integration strategies are introduced. Thirdly, the paper proposes an urban expressway multiresolution traffic simulation model with an asynchronous integration strategy based on Set Theory, which includes three submodels: macromodel, mesomodel, and micromodel. After that, the applicable conditions and derivation process of the three submodels are discussed in detail. In addition, in order to simulate and evaluate the multiresolution model, a “simple simulation scenario” of the North-South Elevated Expressway in Shanghai has been established. The simulation results showed the following. (1) Volume-density relationships of the three submodels are consistent with detector data. (2) When traffic density is high, the macromodel has high precision, smaller error, and smaller dispersion of results. Compared with the macromodel, simulation accuracies of the micromodel and mesomodel are lower and their errors are larger. (3) The multiresolution model can simulate characteristics of traffic flow, capture traffic waves, and keep the consistency of traffic state transitions. Finally, the results showed that the novel multiresolution model achieves higher simulation accuracy and is feasible and effective in a real traffic simulation scenario.

  18. Comparative analysis of the influence of creep of concrete composite beams of steel - concrete model based on Volterra integral equation

    Directory of Open Access Journals (Sweden)

    Partov Doncho

    2017-01-01

    Full Text Available The paper presents an analysis of the stress-strain behaviour and deflection changes due to creep in a statically determinate composite steel-concrete beam according to the EUROCODE 2, ACI209R-92 and Gardner & Lockman models. The mathematical model involves the equations of equilibrium, compatibility and the constitutive relationship, i.e. an elastic law for the steel part and an integral-type creep law of Boltzmann - Volterra for the concrete part considering the above-mentioned models. On the basis of the theory of viscoelastic body of Maslov-Arutyunian-Trost-Zerna-Bažant for determining the redistribution of stresses in the beam section between the concrete plate and the steel beam with respect to time 't', two independent Volterra integral equations of the second kind have been derived. A numerical method based on linear approximation of the singular kernel function in the integral equation is presented. An example with the proposed model is investigated.
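    Volterra integral equations of the second kind, such as those derived in the paper, are typically solved numerically by step-by-step quadrature. A minimal trapezoidal scheme (a generic textbook method, not the authors' exact linear-kernel approximation) looks like:

```python
from math import e

def solve_volterra2(g, K, T, n):
    """Solve f(t) = g(t) + ∫_0^t K(t,s) f(s) ds on [0,T] with n steps,
    using the composite trapezoidal rule."""
    h = T / n
    t = [i * h for i in range(n + 1)]
    f = [g(t[0])]
    for i in range(1, n + 1):
        acc = 0.5 * h * K(t[i], t[0]) * f[0]
        acc += h * sum(K(t[i], t[j]) * f[j] for j in range(1, i))
        # The diagonal term is implicit: f_i = (g_i + acc) / (1 - h/2 * K_ii)
        f.append((g(t[i]) + acc) / (1.0 - 0.5 * h * K(t[i], t[i])))
    return t, f

# With K ≡ 1 and g ≡ 1 the exact solution is f(t) = e^t.
t, f = solve_volterra2(lambda x: 1.0, lambda ti, s: 1.0, T=1.0, n=1000)
print(abs(f[-1] - e) < 1e-4)    # True
```

    In the creep problem, K would be built from the compliance function of the chosen model (EUROCODE 2, ACI209R-92 or Gardner & Lockman) and f would be the sought stress history.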

  19. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods.

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E; Re, Matteo

    2014-06-01

    In the context of "network medicine", gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different "informativeness" embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further enhance disease gene ranking results, by adopting both
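    The random walk with restart algorithm mentioned above can be sketched on a toy gene network (plain power iteration; the kernelized score functions and network-weighting schemes of the paper are beyond this sketch):

```python
def random_walk_with_restart(adj, seeds, restart=0.3, iters=200):
    """Score nodes by a random walk with restart from seed genes."""
    n = len(adj)
    deg = [sum(row) for row in adj]    # assumes no isolated nodes
    p0 = [1.0 / len(seeds) if i in seeds else 0.0 for i in range(n)]
    p = p0[:]
    for _ in range(iters):
        # p <- (1 - r) * W^T p + r * p0, with W the row-normalized adjacency
        nxt = [sum(p[i] * adj[i][j] / deg[i] for i in range(n))
               for j in range(n)]
        p = [(1 - restart) * nxt[j] + restart * p0[j] for j in range(n)]
    return p

# Toy network: gene 1 neighbors the seed gene 0; gene 3 is two hops away.
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
scores = random_walk_with_restart(adj, seeds={0})
print(scores[1] > scores[3])    # True: proximity to the seed raises the score
```

    Ranking the unannotated genes by these steady-state scores is the guilt-by-association step; candidate disease genes are those that end up closest to the known seeds.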

  20. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E.; Re, Matteo

    2014-01-01

    Objective In the context of “network medicine”, gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. Materials and methods We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. Results The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different “informativeness” embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Conclusions Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further

  1. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    Directory of Open Access Journals (Sweden)

    Daniel-Petru GHENCEA

    2017-06-01

    Full Text Available The paper proposes a prediction model of spindle behavior from the point of view of thermal deformations and vibration levels by highlighting and processing the characteristic equations. This model analysis for shafts with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms - artificial neural networks - fuzzy logic). The paper presents a prediction model for obtaining a valid range of values for spindles with similar characteristics, based on data sets measured from a few spindle tests, without additional measurements being required. Extracting polynomial functions from the graphs resulting from simultaneous measurements and predicting the dynamics of the two features with a multi-objective criterion are the main advantages of this method.

  2. Integration of ROOT notebook as an ATLAS analysis web-based tool in outreach and public data release projects

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00237353; The ATLAS collaboration

    2017-01-01

    Integration of the ROOT data analysis framework with the Jupyter Notebook technology presents the potential of enhancement and expansion of educational and training programs. It can be beneficial for university students in their early years, new PhD students and post-doctoral researchers, as well as for senior researchers and teachers who want to refresh their data analysis skills or to introduce a more friendly and yet very powerful open source tool in the classroom. Such tools have been already tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility to go a step forward in the ATLAS quest of making use of several CERN projects in the field of the education and training, developing new computing solutions on the way.

  3. Bending analysis of embedded nanoplates based on the integral formulation of Eringen's nonlocal theory using the finite element method

    Science.gov (United States)

    Ansari, R.; Torabi, J.; Norouzzadeh, A.

    2018-04-01

    Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of the structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on an elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of the elastic foundation and to compare the predictions of the integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of the proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.
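    For reference, the two forms of Eringen's constitutive relation contrasted above can be written in standard notation (a textbook statement, not equations copied from the paper):

```latex
% Integral (original) form: the nonlocal stress t at point x is a
% kernel-weighted average of the local Hookean stress sigma over the body V
t_{ij}(\mathbf{x}) = \int_{V} \alpha\left(\lvert \mathbf{x}-\mathbf{x}'\rvert,\tau\right)\,
  \sigma_{ij}(\mathbf{x}')\,\mathrm{d}V(\mathbf{x}'),
\qquad \sigma_{ij} = C_{ijkl}\,\varepsilon_{kl}

% Differential form, equivalent only for the special Helmholtz-type kernel,
% with nonlocal parameter \mu = (e_0 a)^2:
\left(1-\mu\nabla^{2}\right) t_{ij} = \sigma_{ij}
```

    The paradoxes mentioned in the abstract arise precisely where the differential form is used with boundary conditions under which the two forms stop being equivalent.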

  4. Wing aeroelasticity analysis based on an integral boundary-layer method coupled with Euler solver

    Directory of Open Access Journals (Sweden)

    Ma Yanfeng

    2016-10-01

    Full Text Available An interactive boundary-layer method, which solves the unsteady flow, is developed for aeroelastic computation in the time domain. The coupled method combines the Euler solver with the integral boundary-layer solver (Euler/BL) in a “semi-inverse” manner to compute flows with the inviscid and viscous interaction. Unsteady boundary conditions on moving surfaces are taken into account by utilizing the approximate small-perturbation method without moving the computational grids. The steady and unsteady flow calculations for the LANN wing are presented. The wing tip displacement of the high Reynolds number aero-structural dynamics (HIRENASD) Project is simulated under different angles of attack. The flutter-boundary predictions for the AGARD 445.6 wing are provided. The results of the interactive boundary-layer method are compared with those of the Euler method and experimental data. The study shows that viscous effects are significant for these cases and the further data analysis confirms the validity and practicability of the coupled method.

  5. Evaluating ecommerce websites cognitive efficiency: an integrative framework based on data envelopment analysis.

    Science.gov (United States)

    Lo Storto, Corrado

    2013-11-01

    This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts driven from theories of information processing and cognition and considers the website efficiency as a measure of its quality and performance. When the users interact with the website interfaces to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either a sense of satisfaction or dissatisfaction for that. The amount of ambiguity and uncertainty, and the search (over-)time during navigation that they perceive determine the effort size - and, as a consequence, the cognitive cost amount - they have to bear to perform their task. On the contrary, task performing and result achievement provide the users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified in a set of 3 website macro-dimensions (user experience, site navigability and structure). The framework is implemented to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess the influence of cognitive costs and benefits that mostly affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
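    DEA in general requires solving a linear program per decision-making unit; in the single-input, single-output special case it collapses to a simple ratio, which is enough to illustrate the idea (the cost/benefit figures below are hypothetical, not taken from the study):

```python
def ccr_efficiency(inputs, outputs):
    """CCR-DEA efficiency scores for the single-input, single-output case,
    where the linear program collapses to each unit's output/input ratio
    divided by the best ratio in the sample."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical websites: cognitive cost as input, task success as output.
cost = [4.0, 2.0, 5.0]       # e.g. search time spent per task
benefit = [8.0, 6.0, 5.0]    # e.g. tasks completed
print(ccr_efficiency(cost, benefit))    # [0.666..., 1.0, 0.333...]
```

    With the 9 variables of the framework, each website needs its own linear program that chooses the weights most favourable to it; the ratio view above is only the degenerate case that conveys the intuition of efficiency as distance from the best performer.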

  6. Cost Analysis of Integrative Inpatient Treatment Based on DRG Data: The Example of Anthroposophic Medicine

    Science.gov (United States)

    Heinz, Jürgen; Fiori, Wolfgang; Heusser, Peter

    2013-01-01

    Background. Much work has been done to evaluate the outcome of integrative inpatient treatment but scarcely the costs. This paper evaluates the costs for inpatient treatment in three anthroposophic hospitals (AHs). Material and Methods. Cost and performance data from a total of 23,180 cases were analyzed and compared to national reference data. Subgroup analysis was performed between the cases with and without anthroposophic medical complex (AMC) treatment. Results. Costs and length of stay in the cases without AMC displayed no relevant differences compared to the national reference data. In contrast the inlier cases with AMC caused an average of € 1,394 more costs. However costs per diem were not higher than those in the national reference data. Hence, the delivery of AMC was associated with a prolonged length of stay. 46.6% of the cases with AMC were high outliers. Only 10.6% of the inlier cases with AMC were discharged before reaching the mean length of stay of each DRG. Discussion. Treatment in an AH is not generally associated with an increased use of resources. However, the provision of AMC leads to a prolonged length of stay and cannot be adequately reimbursed by the current G-DRG system. Due to the heterogeneity of the patient population, an additional payment should be negotiated individually. PMID:23431346

  7. Cost Analysis of Integrative Inpatient Treatment Based on DRG Data: The Example of Anthroposophic Medicine

    Directory of Open Access Journals (Sweden)

    Jürgen Heinz

    2013-01-01

    Full Text Available Background. Much work has been done to evaluate the outcomes of integrative inpatient treatment, but scarcely its costs. This paper evaluates the costs of inpatient treatment in three anthroposophic hospitals (AHs). Material and Methods. Cost and performance data from a total of 23,180 cases were analyzed and compared to national reference data. Subgroup analysis was performed between cases with and without anthroposophic medical complex (AMC) treatment. Results. Costs and length of stay in the cases without AMC displayed no relevant differences compared to the national reference data. In contrast, the inlier cases with AMC incurred on average €1,394 more in costs. However, costs per diem were not higher than those in the national reference data; hence, the delivery of AMC was associated with a prolonged length of stay. 46.6% of the cases with AMC were high outliers, and only 10.6% of the inlier cases with AMC were discharged before reaching the mean length of stay of their DRG. Discussion. Treatment in an AH is not generally associated with increased use of resources. However, the provision of AMC leads to a prolonged length of stay and cannot be adequately reimbursed by the current G-DRG system. Due to the heterogeneity of the patient population, an additional payment should be negotiated individually.

  8. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interactions and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, the traditional economic model of rational agents; (2) prospect theory, which accounts for bounded rationality; and (3) a prospect theory model that accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study amounts to an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic, adaptive behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
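The contrast between the first two decision models can be sketched with a toy household choice; the flood probability, damage, measure cost, and probability-weighting parameter below are hypothetical illustrations, not the study's calibration.

```python
def expected_cost(p, damage, measure_cost=0.0, damage_reduction=0.0):
    """Annual expected cost to a household (risk-neutral expected utility)."""
    return measure_cost + p * damage * (1.0 - damage_reduction)

def prospect_weight(p, gamma=0.69):
    """Tversky-Kahneman probability weighting: rare events are overweighted."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

p, damage = 0.01, 100_000                 # 1/100-year flood, damage per event
annual_measure_cost, reduction = 300.0, 0.8

# (1) expected utility: invest if the measure lowers expected annual cost
invests_eu = (expected_cost(p, damage, annual_measure_cost, reduction)
              < expected_cost(p, damage))

# (2) prospect theory: the same comparison under a distorted probability
pw = prospect_weight(p)
invests_pt = annual_measure_cost + pw * damage * (1 - reduction) < pw * damage
print(invests_eu, invests_pt, round(pw, 4))
```

Because the weighting function inflates the perceived probability of the rare flood, a boundedly rational agent can be even more inclined to invest than the rational one; the third (Bayesian updating) model would additionally evolve `p` over time.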

  9. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z.; Xiao, T.; Li, D.

    2016-07-01

    Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the relevant internal logic between these two factors. The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis on the mutual logical relationship of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, it verifies the availability of this evaluation and analysis model using a Chinese SUV manufacturer as an example. (Author)

  10. An Integrated Circuit for Chip-Based Analysis of Enzyme Kinetics and Metabolite Quantification.

    Science.gov (United States)

    Cheah, Boon Chong; Macdonald, Alasdair Iain; Martin, Christopher; Streklas, Angelos J; Campbell, Gordon; Al-Rawhani, Mohammed A; Nemeth, Balazs; Grant, James P; Barrett, Michael P; Cumming, David R S

    2016-06-01

    We have created a novel chip-based diagnostic tool based upon quantification of metabolites using enzymes specific for their chemical conversion. Using this device, we show for the first time that a solid-state circuit can be used to measure enzyme kinetics and calculate the Michaelis-Menten constant; substrate-concentration dependency of enzyme reaction rates is central to this aim. Ion-sensitive field effect transistors (ISFETs) are excellent transducers for biosensing applications that rely on enzyme assays, especially since they can be fabricated using mainstream microelectronics technology to ensure low unit cost, mass manufacture, scaling to many sensors and straightforward miniaturisation for use in point-of-care devices. Here, we describe an integrated ISFET array comprising 2^16 sensors. The device was fabricated with a complementary metal oxide semiconductor (CMOS) process. Unlike traditional CMOS ISFET sensors that use the foundry's Si3N4 passivation for ion detection, the device reported here was processed with a layer of Ta2O5 that increased the detection sensitivity to 45 mV/pH unit at the sensor readout. Drift was reduced to 0.8 mV/hour, with a linear pH response between pH 2-12. A high-speed instrumentation system capable of acquiring nearly 500 fps was developed to stream out the data. The device was then used to measure glucose concentration through the activity of hexokinase in the range of 0.05 mM-231 mM, encompassing glucose's physiological range in blood. Localised and temporal enzyme kinetics of hexokinase were studied in detail. These results present a roadmap towards a viable personal metabolome machine.
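Extracting the Michaelis-Menten constant from rate measurements, as done here for hexokinase, can be sketched with a classic Lineweaver-Burk fit; the Vmax, Km and substrate concentrations below are made up for the illustration and are noise-free, not the device's data.

```python
import numpy as np

# Hypothetical "true" kinetics used to generate illustrative rate data
Vmax_true, Km_true = 2.0, 0.5                       # arbitrary units, mM
S = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0, 10.0])   # substrate (mM)
v = Vmax_true * S / (Km_true + S)                   # Michaelis-Menten rates

# Lineweaver-Burk linearisation: 1/v = (Km/Vmax)(1/S) + 1/Vmax
slope, intercept = np.polyfit(1.0 / S, 1.0 / v, 1)
Vmax_est = 1.0 / intercept
Km_est = slope * Vmax_est
print(Vmax_est, Km_est)
```

With real, noisy sensor readouts a direct nonlinear least-squares fit of the hyperbola is usually preferred, since the double-reciprocal transform amplifies error at low substrate concentrations.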

  11. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Directory of Open Access Journals (Sweden)

    Federica Villanova

    Full Text Available Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  12. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Science.gov (United States)

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R; Nestle, Frank O

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  13. Electromagnetic Field Analysis of an Electric Dipole Antenna Based on a Surface Integral Equation in Multilayered Dissipative Media

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-07-01

    Full Text Available In this paper, a novel method based on the Poggio–Miller–Chang–Harrington–Wu–Tsai (PMCHWT) integral equation is presented to study the electromagnetic fields excited by vertical or horizontal electric dipoles in the presence of a layered region consisting of K-layered dissipative media and the air above. To transform the continuous integral equation into a block tridiagonal matrix that can be solved conveniently, the Rao–Wilton–Glisson (RWG) functions are introduced as expansion and testing functions. The electromagnetic fields excited by an electric dipole are calculated and compared with available results, where the electric dipole antenna is buried in non-planar air–sea–seabed, air–rock–earth–mine, and multilayered sphere structures. The analysis and computations demonstrate that the method achieves high accuracy and solving performance in the near-field propagation region.
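The convenient solution the abstract attributes to the block tridiagonal structure is typically a block Thomas algorithm (block LU forward elimination plus back substitution); the generic numerical sketch below, with a small random diagonally dominant system, is not the paper's solver.

```python
import numpy as np

def solve_block_tridiag(lower, diag, upper, rhs):
    """Block Thomas algorithm. lower[i], diag[i], upper[i] are the (b, b)
    blocks of block-row i (lower[0] and upper[-1] unused); rhs[i] is (b,)."""
    n = len(diag)
    D = [d.copy() for d in diag]
    R = [r.copy() for r in rhs]
    for i in range(1, n):                          # forward elimination
        m = lower[i] @ np.linalg.inv(D[i - 1])
        D[i] = D[i] - m @ upper[i - 1]
        R[i] = R[i] - m @ R[i - 1]
    x = [None] * n
    x[-1] = np.linalg.solve(D[-1], R[-1])          # back substitution
    for i in range(n - 2, -1, -1):
        x[i] = np.linalg.solve(D[i], R[i] - upper[i] @ x[i + 1])
    return np.concatenate(x)

# Small diagonally dominant example (4 block rows of 2x2 blocks)
rng = np.random.default_rng(0)
n, b = 4, 2
diag = [4.0 * np.eye(b) + 0.1 * rng.normal(size=(b, b)) for _ in range(n)]
lower = [0.1 * rng.normal(size=(b, b)) for _ in range(n)]
upper = [0.1 * rng.normal(size=(b, b)) for _ in range(n)]
rhs = [rng.normal(size=b) for _ in range(n)]
x = solve_block_tridiag(lower, diag, upper, rhs)
```

The cost is linear in the number of block rows, versus cubic for a dense solve of the assembled matrix, which is why banded structure is worth preserving in method-of-moments formulations.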

  14. [Integrative study of Guangdong ecological-economic system based on emergy analysis].

    Science.gov (United States)

    Sui, Chunhua; Lu, Hongfang; Zheng, Fengying

    2006-11-01

    Based on the theories and methodologies of emergy, a quantitative analysis of the development sustainability of Guangdong Province in 1990-2003 was made from the aspects of environment, society, and economy, at both system and subsystem levels. The results showed that Guangdong was one of the developed provinces in China and depended highly on the input of feedback emergy. Though pollution control was fruitful, the increasing environmental loading was not relieved on the whole, and the development sustainability was relatively low. The Province relied increasingly on the international market and was effectively at a disadvantage, exporting primary products while importing high-tech products. To improve its development sustainability, more attention should be paid to increasing the added value of products, making full use of the natural and labor resources of its underdeveloped areas, and further bringing the economic advantages of its developed areas into play.
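The standard indices behind such an emergy-based sustainability assessment (emergy yield ratio, environmental loading ratio, and their quotient, the emergy sustainability index) are simple ratios of emergy flows; the flow values below are invented for the sketch, not Guangdong's accounts.

```python
# Illustrative emergy flows (solar emjoules per year); numbers are made up.
R = 5.0e22    # locally renewable emergy input
N = 2.0e22    # locally non-renewable emergy input
F = 8.0e22    # feedback emergy (purchased/imported goods, fuels, services)

Y = R + N + F                 # total emergy use of the system
EYR = Y / F                   # emergy yield ratio: yield per unit of feedback
ELR = (N + F) / R             # environmental loading ratio
ESI = EYR / ELR               # emergy sustainability index

print(EYR, ELR, ESI)
```

A large feedback share F depresses EYR and inflates ELR, so ESI falls below 1, the qualitative pattern of high feedback dependence and low sustainability this abstract reports for the province.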

  15. Integrated Rudder/Fin Concise Control Based on Frequency Domain Analysis

    OpenAIRE

    W. Guan; Z. J. Su; G. Q. Zhang

    2013-01-01

    This paper describes a concise robust controller design for an integrated rudder and fin control system using the closed loop gain shaping algorithm (CGSA). Compared with the arbitrary selection of weighting functions in integrated rudder and fin H∞ mixed-sensitivity control design procedures, the CGSA method provides a relatively more straightforward and concise design. Simulations demonstrated the overall performance of each CGSA rudder and fin control loop and the in...

  16. Integrating Stakeholder Preferences and GIS-Based Multicriteria Analysis to Identify Forest Landscape Restoration Priorities

    Directory of Open Access Journals (Sweden)

    David Uribe

    2014-02-01

    Full Text Available A pressing question that arises during the planning of an ecological restoration process is: where to restore first? Answering this question is a complex task; it requires a multidimensional approach to account for economic constraints and the preferences of stakeholders. The problem being spatial in nature, it may be explored effectively through Multicriteria Decision Analysis (MCDA) performed in a Geographical Information System (GIS) environment. The proposed approach is based on the definition and weighting of multiple criteria for evaluating land suitability. An MCDA-based methodology was used to identify priority areas for Forest Landscape Restoration in the Upper Mixtec region, Oaxaca (Mexico), one of the most degraded areas of Latin America. Socioeconomic and environmental criteria were selected and evaluated. The opinions of four different stakeholder groups were considered: general public, academics, non-governmental organizations (NGOs), and governmental officers. The preferences of these groups were spatially modeled to identify their priorities. The final result was a map identifying the most preferable sites for restoration, where resources and efforts should be concentrated. MCDA proved to be a very useful tool in collective planning, when alternative sites have to be identified and prioritized to guide the restoration work.
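GIS-based MCDA of this kind commonly reduces to a weighted overlay: each criterion is normalised to a common suitability scale and combined with stakeholder-derived weights. The toy rasters, criterion names and weights below are hypothetical stand-ins, not the Upper Mixtec data.

```python
import numpy as np

# Toy 4x4 raster criteria, each normalised to [0, 1] (higher = more suitable)
rng = np.random.default_rng(42)
degradation = rng.random((4, 4))        # environmental criterion
accessibility = rng.random((4, 4))      # socioeconomic criterion
cost = rng.random((4, 4))               # 1 - normalised restoration cost

# Weights, e.g. elicited from stakeholder groups via pairwise comparison
weights = {"degradation": 0.5, "accessibility": 0.3, "cost": 0.2}

suitability = (weights["degradation"] * degradation
               + weights["accessibility"] * accessibility
               + weights["cost"] * cost)

priority_cell = np.unravel_index(np.argmax(suitability), suitability.shape)
print(priority_cell)
```

In practice each stakeholder group's weight vector yields its own suitability map, and the maps are compared or aggregated to locate consensus priority areas.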

  17. Integrated data base program

    International Nuclear Information System (INIS)

    Notz, K.J.

    1981-01-01

    The IDB Program provides direct support to the DOE Nuclear Waste Management and Fuel Cycle Programs and their lead sites and support contractors by providing and maintaining a current, integrated data base of spent fuel and radioactive waste inventories and projections. All major waste types (HLW, TRU, and LLW) and sources (government, commercial fuel cycle, and I/I) are included. A major data compilation was issued in September 1981: Spent Fuel and Radioactive Waste Inventories and Projections as of December 31, 1980, DOE/NE-0017. This report includes chapters on Spent Fuel, HLW, TRU Waste, LLW, Remedial Action Waste, Active Uranium Mill Tailings, and Airborne Waste, plus appendices with more detailed data in selected areas such as isotopics, radioactivity, thermal power, projections, and land usage. The LLW sections include volumes, radioactivity, thermal power, current inventories, projected inventories and characteristics, source terms, land requirements, and a breakdown in terms of government/commercial and defense/fuel cycle/I/I.

  18. An integrated approach for visual analysis of a multisource moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; van Hage, W.R.; de Vries, G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  19. An Integrated Approach for Visual Analysis of a Multi-Source Moving Objects Knowledge Base

    NARCIS (Netherlands)

    Willems, C.M.E.; van Hage, W.R.; de Vries, G.K.D.; Janssens, J.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  20. An integrated approach for visual analysis of a multi-source moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; Hage, van W.R.; Vries, de G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  1. Integrated Experimental and Model-based Analysis Reveals the Spatial Aspects of EGFR Activation Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Shankaran, Harish; Zhang, Yi; Chrisler, William B.; Ewald, Jonathan A.; Wiley, H. S.; Resat, Haluk

    2012-10-02

    The epidermal growth factor receptor (EGFR) belongs to the ErbB family of receptor tyrosine kinases, and controls a diverse set of cellular responses relevant to development and tumorigenesis. ErbB activation is a complex process involving receptor-ligand binding, receptor dimerization, phosphorylation, and trafficking (internalization, recycling and degradation), which together dictate the spatio-temporal distribution of active receptors within the cell. The ability to predict this distribution, and elucidation of the factors regulating it, would help to establish a mechanistic link between ErbB expression levels and the cellular response. Towards this end, we constructed mathematical models for deconvolving the contributions of receptor dimerization and phosphorylation to EGFR activation, and to examine the dependence of these processes on sub-cellular location. We collected experimental datasets for EGFR activation dynamics in human mammary epithelial cells, with the specific goal of model parameterization, and used the data to estimate parameters for several alternate models. Model-based analysis indicated that: 1) signal termination via receptor dephosphorylation in late endosomes, prior to degradation, is an important component of the response, 2) less than 40% of the receptors in the cell are phosphorylated at any given time, even at saturating ligand doses, and 3) receptor dephosphorylation rates at the cell surface and early endosomes are comparable. We validated the last finding by measuring EGFR dephosphorylation rates at various times following ligand addition both in whole cells, and in endosomes using ELISAs and fluorescent imaging. Overall, our results provide important information on how EGFR phosphorylation levels are regulated within cells. Further, the mathematical model described here can be extended to determine receptor dimer abundances in cells co-expressing various levels of ErbB receptors. This study demonstrates that an iterative cycle of
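The finding that under 40% of receptors are phosphorylated even at saturating ligand falls out of the simplest phosphorylation/dephosphorylation balance, where the steady-state phosphorylated fraction is kp/(kp + kd); the rate constants below are hypothetical and this two-state sketch omits the trafficking compartments of the actual model.

```python
import numpy as np

# Minimal sketch: ligand-bound receptors are phosphorylated at rate kp and
# dephosphorylated at rate kd; track the phosphorylated fraction over time.
kp, kd = 0.4, 0.7            # 1/min, hypothetical rate constants
dt, T = 0.01, 60.0           # Euler step (min), simulated time (min)

t = np.arange(0.0, T, dt)
fp = np.zeros_like(t)        # fraction of receptors phosphorylated
for i in range(1, len(t)):
    fp[i] = fp[i - 1] + dt * (kp * (1.0 - fp[i - 1]) - kd * fp[i - 1])

steady_state = kp / (kp + kd)    # analytic limit of the two-state model
print(round(steady_state, 3))
```

With dephosphorylation modestly faster than phosphorylation, the plateau sits below 0.4 regardless of ligand saturation, echoing the paper's second conclusion; the full model additionally distinguishes surface, early-endosome, and late-endosome pools.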

  2. Fuzzy knowledge bases integration based on ontology

    OpenAIRE

    Ternovoy, Maksym; Shtogrina, Olena

    2012-01-01

    The paper describes an approach for integrating fuzzy knowledge bases using an ontology. The approach is based on using a metadata-base to integrate different knowledge bases under a common ontology. The design process of the metadata-base is described.

  3. Integrating technical analysis and public values in risk-based decision making

    International Nuclear Information System (INIS)

    Bohnenblust, Hans; Slovic, Paul

    1998-01-01

    Simple technical analysis cannot capture the complex scope of preferences or values of society and individuals; however, decision making needs to be sustained by formal analysis. The paper describes a policy framework that incorporates both technical analysis and aspects of public values. The framework can be used as a decision-support tool and helps decision makers make more informed and more transparent decisions about safety issues.

  4. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of piping systems for nuclear power plants has grown in both scale and quantity. In addition, higher-quality analysis is now regarded as a major element of nuclear plant safety. To fulfill these requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a data base system, in which all information is concentrated. 2. To minimize manual steps in analysis, evaluation and documentation, in particular by applying the graphic system as much as possible. On the basis of this philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The data control subsystem controls all information in the data base. Piping structural analysis can be performed using the analysis subsystem. Isometric piping drawings, mode shapes, etc. can be plotted using the plotting subsystem. A complete analysis report can be produced without manual work through the report subsystem. (author)

  5. Integrating eQTL data with GWAS summary statistics in pathway-based analysis with application to schizophrenia.

    Science.gov (United States)

    Wu, Chong; Pan, Wei

    2018-04-01

    Many genetic variants affect complex traits through gene expression, which can be exploited to boost statistical power and enhance interpretation in genome-wide association studies (GWASs), as demonstrated by the transcriptome-wide association study (TWAS) approach. Furthermore, due to polygenic inheritance, a complex trait is often affected by multiple genes with similar functions, as annotated in gene pathways. Here, we extend TWAS from gene-based analysis to pathway-based analysis: we integrate public pathway collections, expression quantitative trait locus (eQTL) data and GWAS summary association statistics (or GWAS individual-level data) to identify gene pathways associated with complex traits. The basic idea is to weight the SNPs of the genes in a pathway based on their estimated cis-effects on gene expression, then adaptively test for association of the pathway with a GWAS trait by effectively aggregating possibly weak association signals across the genes in the pathway. The P values can be calculated analytically and are thus fast to compute. We applied our proposed test with the KEGG and GO pathways to two schizophrenia (SCZ) GWAS summary association data sets, denoted by SCZ1 and SCZ2, with about 20,000 and 150,000 subjects, respectively. Most of the significant pathways identified by analyzing the SCZ1 data were reproduced by the SCZ2 data. Importantly, we identified 15 novel pathways associated with SCZ, such as GABA receptor complex (GO:1902710), which could not be uncovered by the standard single SNP-based analysis or gene-based TWAS. The newly identified pathways may help us gain insights into the biological mechanism underlying SCZ. Our results showcase the power of incorporating gene expression information and gene functional annotations into pathway-based association testing for GWAS. © 2018 WILEY PERIODICALS, INC.
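The core idea of weighting SNP-level GWAS z-scores by estimated cis-eQTL effects and aggregating across the genes of a pathway can be sketched as follows; the weighted-sum statistic, the identity-LD assumption, and the random data are simplifications for illustration, not the authors' adaptive test.

```python
import numpy as np

def twas_gene_z(snp_z, eqtl_w, ld):
    """Gene-level TWAS-style z-score: GWAS SNP z-scores weighted by
    cis-eQTL effect sizes, standardised by the LD-adjusted variance."""
    num = eqtl_w @ snp_z
    den = np.sqrt(eqtl_w @ ld @ eqtl_w)
    return num / den

rng = np.random.default_rng(1)
gene_scores = []
for _ in range(5):                     # 5 genes in a toy pathway
    k = 4                              # cis-SNPs per gene
    snp_z = rng.normal(size=k)         # GWAS summary z-scores (null here)
    eqtl_w = rng.normal(size=k)        # estimated cis-eQTL weights
    ld = np.eye(k)                     # SNP independence assumed for the sketch
    gene_scores.append(twas_gene_z(snp_z, eqtl_w, ld))

# Simple pathway aggregate: sum of squared gene z-scores (chi-square-like)
pathway_stat = float(np.sum(np.square(gene_scores)))
print(round(pathway_stat, 3))
```

In the actual method the LD matrix comes from a reference panel and the aggregation is adaptive, so weak signals spread over many genes are not drowned out by a fixed-weight sum.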

  6. An activity-based cost analysis of the Honduras community-based, integrated child care (AIN-C) programme.

    Science.gov (United States)

    Fiedler, John L; Villalobos, Carlos A; De Mattos, Annette C

    2008-11-01

    The Honduras AIN-C programme is a preventive health and nutrition programme of the Honduras Ministry of Health (MOH) that relies on volunteers to help mothers monitor and maintain the adequate growth of young children. A quasi-experimental, design-based evaluation found that the programme achieved near-universal coverage and was effective in improving mothers' child-rearing knowledge, attitudes and practices, including feeding and appropriate care-giving and care-seeking practices for children with diarrhoea and acute respiratory illness. The programme is widely regarded as a model. This study was undertaken to provide the first comprehensive estimates of the cost of the AIN-C programme, with the goal of providing a programme and financial planning tool for Honduras. An additional comparison of study findings was also undertaken to determine the cost of the AIN-C programme's community-based services relative to a similar facility-based service. Expressed in mid-2005 US dollars, the study found that after the programme is phased-in: (1) the annual, recurrent cost per child under 2 years participating in the programme is $6.43; (2) the annual, incremental budget requirements per child under 2 years participating in the programme are $3.90; (3) the cost of an AIN-C monthly growth monitoring and counselling session per child is 11% of the cost of a traditional MOH, facility-based growth and development consultation per child; and (4) the effect of mothers substituting AIN-C monitor care for MOH facility-based care 'saves' 203 000 outpatient visits a year, with a potential cost saving of $1.66 million, the equivalent of 60% of the recurrent cost of the programme and roughly equal to the annual incremental budget requirements of the programme. Sensitivity analysis of the cost estimates is performed to provide insight, for countries considering introducing a similar programme, into how modifications of key characteristics of the programme affect its costs.

  7. Protein Analysis Using Real-Time PCR Instrumentation: Incorporation in an Integrated, Inquiry-Based Project

    Science.gov (United States)

    Southard, Jonathan N.

    2014-01-01

    Instrumentation for real-time PCR is used primarily for amplification and quantitation of nucleic acids. The capability to measure fluorescence while controlling temperature in multiple samples can also be applied to the analysis of proteins. Conformational stability and changes in stability due to ligand binding are easily assessed. Protein…

  8. Research on the Reliability Analysis of the Integrated Modular Avionics System Based on the AADL Error Model

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2018-01-01

    Full Text Available In recent years, the integrated modular avionics (IMA) concept has been introduced to replace traditional federated avionics. Different avionics functions are hosted on a shared IMA platform, and IMA adopts partition technologies to provide logical isolation among functions. The IMA architecture provides more sophisticated and powerful avionics functionality; meanwhile, its failure propagation patterns are more complex, because resource sharing introduces unintended interconnections among different functions. Therefore, this paper proposes an architecture analysis and design language (AADL)-based method to establish a reliability model of the IMA platform. The error behavior of individual software and hardware components in the IMA system is modeled, and the corresponding AADL error model of failure propagation among components, and between software and hardware, is given. Finally, the display function of an IMA platform is taken as an example to illustrate the effectiveness of the proposed method.

  9. An integrated one-chip-sensor system for microRNA quantitative analysis based on digital droplet polymerase chain reaction

    Science.gov (United States)

    Tsukuda, Masahiko; Wiederkehr, Rodrigo Sergio; Cai, Qing; Majeed, Bivragh; Fiorini, Paolo; Stakenborg, Tim; Matsuno, Toshinobu

    2016-04-01

    A silicon microfluidic chip was developed for quantitative microRNA (miRNA) analysis. It sequentially performs reverse transcription and polymerase chain reaction in a digital droplet format. The individual processes take place in different cavities, and reagent and sample mixing is carried out on the chip prior to entering each compartment. Droplets are generated in a T-junction channel before the polymerase chain reaction step. A miniaturized fluorescence detector was also developed, based on the optical pick-up head of a digital versatile disc (DVD) player and a micro-photomultiplier tube. The chip, integrated with the detection system, was tested using synthetic miRNA of known concentrations ranging from 300 to 3,000 templates/µL. The results proved the functionality of the system.
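Quantification in digital droplet PCR rests on Poisson statistics over droplet counts: from the fraction of fluorescence-positive droplets one recovers the mean number of templates per droplet and hence the concentration. The droplet volume and counts below are hypothetical, not the chip's specifications.

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Poisson estimate of template concentration (copies/µL) from the
    number of positive droplets out of the total analysed."""
    p = positive / total                 # fraction of positive droplets
    lam = -math.log(1.0 - p)             # mean template copies per droplet
    return lam / droplet_volume_ul

# Hypothetical run: 4,800 of 20,000 droplets fluoresce
conc = ddpcr_concentration(positive=4800, total=20000)
print(round(conc, 1))
```

The logarithmic correction matters because a positive droplet may hold more than one template; simply dividing positives by volume would underestimate the concentration, increasingly so as the positive fraction grows.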

  10. A Sparsity-based Framework for Resolution Enhancement in Optical Fault Analysis of Integrated Circuits

    Science.gov (United States)

    2015-01-01


  11. Integrated computer-aided forensic case analysis, presentation, and documentation based on multimodal 3D data.

    Science.gov (United States)

    Bornik, Alexander; Urschler, Martin; Schmalstieg, Dieter; Bischof, Horst; Krauskopf, Astrid; Schwark, Thorsten; Scheurer, Eva; Yen, Kathrin

    2018-06-01

    Three-dimensional (3D) crime scene documentation using 3D scanners and medical imaging modalities like computed tomography (CT) and magnetic resonance imaging (MRI) are increasingly applied in forensic casework. Together with digital photography, these modalities enable comprehensive and non-invasive recording of forensically relevant information regarding injuries/pathologies inside the body and on its surface. Furthermore, it is possible to capture traces and items at crime scenes. Such digitally secured evidence has the potential to similarly increase case understanding by forensic experts and non-experts in court. Unlike photographs and 3D surface models, images from CT and MRI are not self-explanatory. Their interpretation and understanding requires radiological knowledge. Findings in tomography data must not only be revealed, but should also be jointly studied with all the 2D and 3D data available in order to clarify spatial interrelations and to optimally exploit the data at hand. This is technically challenging due to the heterogeneous data representations including volumetric data, polygonal 3D models, and images. This paper presents a novel computer-aided forensic toolbox providing tools to support the analysis, documentation, annotation, and illustration of forensic cases using heterogeneous digital data. Conjoint visualization of data from different modalities in their native form and efficient tools to visually extract and emphasize findings help experts to reveal unrecognized correlations and thereby enhance their case understanding. Moreover, the 3D case illustrations created for case analysis represent an efficient means to convey the insights gained from case analysis to forensic non-experts involved in court proceedings like jurists and laymen. The capability of the presented approach in the context of case analysis, its potential to speed up legal procedures and to ultimately enhance legal certainty is demonstrated by introducing a number of

  12. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Owen, D; Anderson, C; Mayo, C; El Naqa, I; Ten Haken, R; Cao, Y; Balter, J; Matuszak, M [University of Michigan, Ann Arbor, MI (United States)

    2016-06-15

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response
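The voxel-by-voxel dose/image correlation described above can be sketched independently of any TPS API. The following is a minimal illustration with hypothetical numpy arrays assumed to lie on a common registered grid (the function and parameter names are invented for illustration, not taken from the paper's plug-in):

```python
import numpy as np

def dose_intensity_metrics(dose, intensity, dose_levels):
    """Correlate a dose grid with a registered functional image on a
    common voxel grid (illustrative arrays, not the TPS API itself)."""
    dose = np.asarray(dose, dtype=float).ravel()
    intensity = np.asarray(intensity, dtype=float).ravel()
    # Intensity-weighted mean dose: each voxel's dose weighted by its
    # relative functional intensity (e.g. SPECT perfusion counts).
    weighted_mean_dose = float(np.sum(dose * intensity) / np.sum(intensity))
    # Cumulative dose-intensity histogram: fraction of total intensity
    # in voxels receiving at least each dose level.
    total = np.sum(intensity)
    histogram = [float(np.sum(intensity[dose >= d]) / total) for d in dose_levels]
    return weighted_mean_dose, histogram
```

Once the dose grid and the functional image share a grid, an intensity-weighted mean dose and a cumulative dose-intensity histogram fall out of a few array operations.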

  13. Energy saving analysis and management modeling based on index decomposition analysis integrated energy saving potential method: Application to complex chemical processes

    International Nuclear Information System (INIS)

    Geng, Zhiqiang; Gao, Huachao; Wang, Yanqing; Han, Yongming; Zhu, Qunxiong

    2017-01-01

    Highlights: • An integrated framework that combines IDA with the energy-saving potential method is proposed. • An energy saving analysis and management framework for complex chemical processes is obtained. • The proposed method is efficient for energy optimization and carbon emission reduction in complex chemical processes. - Abstract: Energy saving and management of complex chemical processes play a crucial role in sustainable development. In order to analyze the effects that technology, management level, and production structure have on energy efficiency and energy saving potential, this paper proposes a novel integrated framework that combines index decomposition analysis (IDA) with the energy saving potential method. The IDA method can effectively obtain the energy activity, energy hierarchy and energy intensity levels in a data-driven manner to reflect the impact of energy usage. The energy saving potential method can verify the correctness of the improvement direction proposed by the IDA method. Meanwhile, energy efficiency improvement, energy consumption reduction and energy savings can be visually discovered through the proposed framework. A demonstration analysis of ethylene production verified the practicality of the proposed method. Moreover, corresponding improvements for ethylene production can be obtained based on the demonstration analysis. The energy efficiency index and the energy saving potential of the worst-performing months can be increased by 6.7% and 7.4%, respectively, and carbon emissions can be reduced by 7.4–8.2%.
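The abstract does not name the specific IDA variant; as an illustration, an additive LMDI-I decomposition (a common IDA choice) splits a change in total energy use into one contribution per driving factor. The sketch below assumes energy in each subsector is the product of the listed factors (function names are illustrative):

```python
import math

def logmean(a, b):
    """Logarithmic mean, the weighting function of LMDI-I."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_additive(e0, eT, factors0, factorsT):
    """Additive LMDI-I decomposition of the energy change eT - e0.
    e0/eT: per-subsector energy use at the base and current period;
    factors0/factorsT: dict mapping factor name (activity, structure,
    intensity, ...) to its per-subsector values. Returns the additive
    effect of each factor; the effects sum to the total change when
    the factors multiply to the subsector energy."""
    effects = {}
    for name in factors0:
        effects[name] = sum(
            logmean(eT[i], e0[i]) * math.log(factorsT[name][i] / factors0[name][i])
            for i in range(len(e0)))
    return effects
```

For a single subsector where energy = activity × intensity, doubling activity while intensity is flat attributes the whole change to the activity effect and zero to intensity.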

  14. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.H.P.

    2015-01-01

    Let $A$ be a Dedekind domain, $K$ the fraction field of $A$, and $f\in A[x]$ a monic irreducible separable polynomial. For a given non-zero prime ideal $\mathfrak{p}$ of $A$ we present in this paper a new method to compute a $\mathfrak{p}$-integral basis of the extension of $K$ determined by $f$.
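For context, the object being computed can be stated as follows (the usual definition from the literature, not quoted from the paper):

```latex
Let $\theta$ be a root of $f$, let $L = K(\theta)$, $n = \deg f$, and let
$\mathcal{O}$ denote the integral closure of $A$ in $L$. A
$\mathfrak{p}$-integral basis is a family
$\alpha_1, \dots, \alpha_n \in \mathcal{O}$ that forms an
$A_{\mathfrak{p}}$-basis of the localization of $\mathcal{O}$ at
$\mathfrak{p}$, i.e.
\[
  \mathcal{O} \otimes_A A_{\mathfrak{p}}
    = A_{\mathfrak{p}}\,\alpha_1 \oplus \dots \oplus A_{\mathfrak{p}}\,\alpha_n .
\]
```

Gluing $\mathfrak{p}$-integral bases over the finitely many primes dividing the discriminant of $f$ recovers a global integral basis.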

  15. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.D.

    2016-01-01

    Let A be a Dedekind domain, K the fraction field of A, and f ∈ A[x] a monic irreducible separable polynomial. For a given non-zero prime ideal p of A we present in this paper a new characterization of a p-integral basis of the extension of K determined by f. This characterization yields in an

  16. Analysis of Hybrid-Integrated High-Speed Electro-Absorption Modulated Lasers Based on EM/Circuit Co-simulation

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Krozer, Viktor; Kazmierski, C.

    2009-01-01

    An improved electromagnetic simulation (EM) based approach has been developed for optimization of the electrical to optical (E/O) transmission properties of integrated electro-absorption modulated lasers (EMLs) aiming at 100 Gbit/s Ethernet applications. Our approach allows for an accurate analysis of the EML performance in a hybrid microstrip assembly. The established EM-based approach provides a design methodology for the future hybrid integration of the EML with its driving electronics.

  17. Analysis of transient electromagnetic wave interactions on graphene-based devices using integral equations

    KAUST Repository

    Shi, Yifei

    2015-10-26

    Graphene is a monolayer of carbon atoms structured in the form of a honeycomb lattice. Recent experimental studies have revealed that it can support surface plasmons at Terahertz frequencies thanks to its dispersive conductivity. Additionally, characteristics of these plasmons can be dynamically adjusted via electrostatic gating of the graphene sheet (K. S. Novoselov, et al., Science, 306, 666–669, 2004). These properties suggest that graphene can be a building block for novel electromagnetic and photonic devices for applications in the fields of photovoltaics, bio-chemical sensing, all-optical computing, and flexible electronics. Simulation of electromagnetic interactions on graphene-based devices is not an easy task. The thickness of the graphene sheet is orders of magnitude smaller than any other geometrical dimension of the device. Consequently, discretization of such a device leads to a significantly large number of unknowns and/or ill-conditioned matrix systems.
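The dispersive conductivity mentioned above is commonly modeled at THz frequencies by the intraband Drude term of graphene's sheet conductivity. A sketch under that standard assumption (T → 0, E_F ≫ k_BT; the form is from the general literature, not from this paper):

```python
import math

# Physical constants (SI)
E_CHARGE = 1.602176634e-19   # elementary charge, C
HBAR = 1.054571817e-34       # reduced Planck constant, J*s

def graphene_drude_sigma(omega, e_fermi_ev, tau):
    """Intraband (Drude-limit) sheet conductivity of graphene,
    sigma(w) = (e^2 E_F / (pi hbar^2)) * i / (w + i/tau),
    with omega in rad/s, Fermi level in eV, scattering time tau in s."""
    e_f = e_fermi_ev * E_CHARGE
    drude_weight = E_CHARGE ** 2 * e_f / (math.pi * HBAR ** 2)
    return drude_weight * 1j / (omega + 1j / tau)
```

Electrostatic gating enters through the Fermi level E_F, which rescales the Drude weight and hence the plasmon characteristics, which is why they can be tuned dynamically.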

  18. Velocity Model Analysis Based on Integrated Well and Seismic Data of East Java Basin

    Science.gov (United States)

    Mubin, Fathul; Widya, Aviandy; Eka Nurcahya, Budi; Nurul Mahmudah, Erma; Purwaman, Indro; Radityo, Aryo; Shirly, Agung; Nurwani, Citra

    2018-03-01

    Time to depth conversion is an important process of seismic interpretation to identify hydrocarbon prospectivity. The main objectives of this research are to minimize the risk of error in geometry and time to depth conversion. Since it uses a large amount of data and covers a large study area, this research can be classified as a regional-scale study. The research focused on time interpretation of three horizons: Top Kujung I, Top Ngimbang and Basement, located in the offshore and onshore areas of the East Java basin. These three horizons were selected because they are assumed to be equivalent to the rock formations that have always been the main objectives of oil and gas exploration in the East Java Basin. As additional value, there were no previous works on velocity modeling at a regional scale using geological parameters in the East Java basin. Lithology and interval thickness were identified as geological factors that affected the velocity distribution in the East Java Basin. Therefore, a three-layer geological model was generated, defined by lithology type: carbonate (layer 1: Top Kujung I), shale (layer 2: Top Ngimbang) and Basement. A statistical method using three horizons is able to predict the velocity distribution from sparse well data at a regional scale. The average velocity range for Top Kujung I is 400 m/s - 6000 m/s, Top Ngimbang is 500 m/s - 8200 m/s and Basement is 600 m/s - 8000 m/s. Some velocity anomalies were found in the Madura sub-basin area, caused by geological factors identified as thick shale deposits and high density values in the shale. Results of the velocity and depth modeling analysis can be used to define the volume range deterministically and to build geological models for detailed prospect generation based on geological concepts.
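The time-to-depth conversion at the core of this workflow can be sketched with two hypothetical helper functions, one for a single average velocity and one for a layer-cake of interval velocities (names and signatures are illustrative):

```python
def twt_to_depth(twt_s, v_avg):
    """Convert two-way travel time (s) to depth (m) using an average
    velocity (m/s): the wave travels down and back, so depth = v*t/2."""
    return v_avg * twt_s / 2.0

def layered_depth(interval_twt_s, interval_v):
    """Layer-cake conversion: each layer contributes v_i * dt_i / 2,
    with dt_i the two-way time spent in layer i (s) and v_i its
    interval velocity (m/s)."""
    return sum(v * dt / 2.0 for dt, v in zip(interval_twt_s, interval_v))
```

A velocity model of the kind described in the abstract supplies the per-horizon velocities that feed such a conversion; errors in those velocities map directly into depth (and hence volume) errors.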

  19. Model-Based Integrated High Penetration Renewables Planning and Control Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bank, Jason [Electrical Distribution Design, Blacksburg, VA (United States); Broadwater, Robert [Electrical Distribution Design, Blacksburg, VA (United States); Cheng, Danling [Electrical Distribution Design, Blacksburg, VA (United States); Costyk, David [Electrical Distribution Design, Blacksburg, VA (United States); Leyo, Mark [Electrical Distribution Design, Blacksburg, VA (United States); Seguin, Richard [Electrical Distribution Design, Blacksburg, VA (United States); Woyak, Jeremy [Electrical Distribution Design, Blacksburg, VA (United States); Acharya-Menon, Amrita [Pepco Holdings, Inc. (PHI), Washington, DC (United States); Steffel, Steve [Pepco Holdings, Inc. (PHI), Washington, DC (United States); Dise, John [Clean Power Research, Napa, CA (United States); Athawale, Rasika [Rutgers Univ., New Brunswick, NJ (United States); Felder, Frank [Rutgers Univ., New Brunswick, NJ (United States)

    2015-12-14

    Increasing adoption of Solar Photovoltaic (PV) generation at the distribution level poses several challenges for the reliable operation of electrical power distribution systems. The addition of a significant amount of PV to a distribution network can introduce a variety of operational problems, including steady-state overvoltages, reverse flows, voltage flicker and excessive controller movement among others. These adverse impacts can be mitigated through a variety of equipment upgrades which represent a cost to either the electric utility or the owner of the PV site. The study performed here aims to quantify the levels of PV generation that present operational problems on a distribution circuit and how those problems might be alleviated. The study included 20 distribution feeders selected from Pepco Holdings, Inc. (PHI) service territory. These feeders are located in the states of Delaware, Maryland and New Jersey. A hosting capacity study was performed on each feeder to determine how much additional PV it could support in its current configuration. Several improvements were then performed on these circuits including phase balancing, capacitor redesign, reducing the voltage regulator set points, fixed power factor operation on the PV inverters and the installation of battery storage. After each of these improvements the hosting capacity of the circuit was reevaluated in order to determine how that particular improvement impacted the amount of PV that could be hosted by the circuit. Each of these improvements represents a real cost in terms of labor and equipment in order to be implemented. They are expected to provide a benefit in terms of the amount of additional PV generation which can be safely interconnected to the distribution feeder. A cost benefit analysis was performed in order to evaluate the expected costs of each feeder improvement and how each one was able to increase the PV hosting capacity of each feeder. It is hoped that these results

  20. [Integration of pharmacokinetics and pharmacodynamics based on the in vivo analysis of drug-receptor binding].

    Science.gov (United States)

    Yamada, Shizuo

    2015-01-01

      As I was deeply interested in the effects of drugs on the human body, I chose pharmacology as the subject of special study when I became a 4th year student at Shizuoka College of Pharmacy. I studied abroad as a postdoctoral fellow for two years, from 1978, under the tutelage of Professor Henry I. Yamamura (pharmacology) in the College of Medicine at the University of Arizona, USA. He taught me a variety of valuable skills such as the radioreceptor binding assay, which represented the most advanced technology developed in the US at that time. After returning home, I engaged in clarifying receptor abnormalities in pathological conditions, as well as in drug action mechanisms, by making the best use of this radioreceptor binding assay. In 1989, following the founding of the University of Shizuoka, I was invited by Professor Ryohei Kimura to join the Department of Pharmacokinetics. This switch in discipline provided a good opportunity for me to broaden my perspectives in pharmaceutical sciences. I worked on evaluating drug-receptor binding in vivo as a combined index for pharmacokinetics and pharmacological effect manifestation, with the aim of bridging pharmacology and pharmacokinetics. In fact, by focusing on data from in vivo receptor binding, it became possible to clearly rationalize the important consideration of drug dose-concentration-action relationships, and to study quantitative and kinetic analyses of relationships among pharmacokinetics, receptor binding and pharmacological effects. Based on this concept, I was able to demonstrate the utility of dynamic analyses of drug-receptor binding in drug discovery, drug fostering, and the proper use of pharmacokinetics with regard to many drugs.

  1. Vertical integration and market power: A model-based analysis of restructuring in the Korean electricity market

    International Nuclear Information System (INIS)

    Bunn, Derek W.; Martoccia, Maria; Ochoa, Patricia; Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom

    2010-01-01

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration.
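The abstract does not name the computational learning scheme used by the agents. As an illustration only, a minimal simplified Roth-Erev reinforcement learner, a common choice in agent-based electricity market simulation, for a generator selecting a bid markup (class and parameter names are invented):

```python
import random

class RothErevAgent:
    """Generator agent choosing a markup from a discrete set via a
    simplified Roth-Erev reinforcement rule: profitable choices gain
    propensity, all propensities decay slowly (recency)."""
    def __init__(self, markups, recency=0.1, seed=0):
        self.markups = markups
        self.q = [1.0] * len(markups)   # initial propensities
        self.recency = recency          # forgetting parameter
        self.rng = random.Random(seed)
        self.last = 0

    def choose(self):
        # Roulette-wheel selection proportional to propensity.
        r = self.rng.uniform(0.0, sum(self.q))
        acc = 0.0
        for i, p in enumerate(self.q):
            acc += p
            if r <= acc:
                self.last = i
                return self.markups[i]
        self.last = len(self.q) - 1
        return self.markups[-1]

    def update(self, profit):
        # Decay all propensities, then reinforce the chosen action.
        for i in range(len(self.q)):
            self.q[i] *= (1.0 - self.recency)
        self.q[self.last] += max(profit, 0.0)
```

Over repeated market rounds, profitable markups accumulate propensity and are chosen increasingly often, which is how such models let strategic bidding behavior emerge rather than being imposed.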

  2. Vertical integration and market power. A model-based analysis of restructuring in the Korean electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Bunn, Derek W.; Martoccia, Maria; Ochoa, Patricia [London Business School, London (United Kingdom); Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom [Korean Electric Power Corporation, Seoul (Korea)

    2010-07-15

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration. (author)

  3. Vertical integration and market power: A model-based analysis of restructuring in the Korean electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Bunn, Derek W., E-mail: dbunn@london.ed [London Business School, London (United Kingdom); Martoccia, Maria; Ochoa, Patricia [London Business School, London (United Kingdom); Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom [Korean Electric Power Corporation, Seoul (Korea, Republic of)

    2010-07-15

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration.

  4. Influencing Factors and Development Trend Analysis of China Electric Grid Investment Demand Based on a Panel Co-Integration Model

    Directory of Open Access Journals (Sweden)

    Jinchao Li

    2018-01-01

    Full Text Available Electric grid investment demand analysis is significant for reasonably arranging construction funds for the electric grid and reducing costs. This paper used panel data on electric grid investment from 23 provinces of China between 2004 and 2016 as samples to analyze, based on co-integration tests, the influence of GDP, population scale, social electricity consumption, installed electrical capacity, and peak load on electric grid investment demand. We find that GDP and peak load have positive influences on electric grid investment demand, but the impact of population scale, social electricity consumption, and installed electrical capacity on electric grid investment is not remarkable. We divide China into eastern, central, and western regions to analyze the influencing factors of electric grid investment in each, finally obtaining the key factors for the eastern, central, and western regions. Finally, according to the analysis of the key factors, we make a prediction of China’s electric grid investment for 2020 under different scenarios. The results offer a certain understanding of the development trend of China’s electric grid investment and contribute to the future development of electric grid investment.
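Panel co-integration tests build on single-equation residual-based tests. As a minimal sketch in the same spirit (illustrative only, not the panel methodology such as Pedroni tests that a study like this would use), a two-series Engle-Granger check: regress one series on the other, then run a Dickey-Fuller regression on the residuals:

```python
import numpy as np

def engle_granger_stat(y, x):
    """Minimal Engle-Granger check: OLS of y on x, then a no-constant
    Dickey-Fuller regression of the residual differences on the lagged
    residuals. Returns the DF t-statistic; more negative values give
    stronger evidence of cointegration (compare to tabulated critical
    values, not standard t-tables)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta                 # long-run equilibrium errors
    de, lag = np.diff(e), e[:-1]
    rho = (lag @ de) / (lag @ lag)   # DF slope estimate
    resid = de - rho * lag
    se = np.sqrt(resid @ resid / (len(de) - 1) / (lag @ lag))
    return rho / se
```

For two series sharing a common stochastic trend, the residuals are stationary and the statistic is strongly negative; for two independent random walks it hovers near values consistent with a unit root.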

  5. Integrated analysis on static/dynamic aeroelasticity of curved panels based on a modified local piston theory

    Science.gov (United States)

    Yang, Zhichun; Zhou, Jian; Gu, Yingsong

    2014-10-01

    A flow field modified local piston theory, applied to the integrated analysis of static/dynamic aeroelastic behaviors of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by CFD techniques, which have the advantage of simulating the steady flow field accurately. This flow field modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stability of curved panels in hypersonic flow. In addition, comparisons are made between results obtained using the present method and the curvature modified method. They show that when the curvature of the curved panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by the two methods differ little, while for curved panels with larger curvatures, the static aeroelastic deformation obtained by the present method is larger and the flutter stability boundary is smaller than those obtained by the curvature modified method, and the discrepancy increases with increasing panel curvature. Therefore, the existing curvature modified method is non-conservative compared to the proposed flow field modified method from the standpoint of hypersonic flight vehicle safety, and the proposed flow field modified local piston theory for curved panels enlarges the application range of piston theory.
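The first-order local piston theory underlying this class of analyses can be stated compactly (a standard textbook form; the paper's flow-field modification supplies the local quantities from CFD):

```latex
\Delta p \;=\; \rho_l\, a_l \left( \frac{\partial w}{\partial t}
  \;+\; U_l\, \frac{\partial w}{\partial x} \right),
```

where $w$ is the panel's transverse deflection and $\rho_l$, $a_l$, $U_l$ are the local density, speed of sound, and streamwise velocity taken from the steady CFD solution. Classical piston theory uses the freestream values $\rho_\infty$, $a_\infty$, $U_\infty$ instead, which is what limits its accuracy for curved panels.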

  6. Decision making based on data analysis and optimization algorithm applied for cogeneration systems integration into a grid

    Science.gov (United States)

    Asmar, Joseph Al; Lahoud, Chawki; Brouche, Marwan

    2018-05-01

    Cogeneration and trigeneration systems can contribute to the reduction of primary energy consumption and greenhouse gas emissions in the residential and tertiary sectors, by reducing fossil fuel demand and grid losses with respect to conventional systems. Cogeneration systems are characterized by very high energy efficiency (80 to 90%) as well as lower pollution compared to conventional energy production. The integration of these systems into the energy network must simultaneously take into account their economic and environmental challenges. In this paper, a decision-making strategy is introduced that is divided into two parts: the first is a strategy based on a multi-objective optimization tool with data analysis, and the second is based on an optimization algorithm. The power dispatching of the Lebanese electricity grid is then simulated and considered as a case study in order to prove the compatibility of the cogeneration power calculated by our decision-making technique. In addition, the thermal energy produced by the cogeneration systems whose capacity is selected by our technique is shown to be compatible with the thermal demand for district heating.

  7. A fundamental numerical analysis for noninvasive thermometry integrated in a heating applicator based on the reentrant cavity

    International Nuclear Information System (INIS)

    Ohwada, Hiroshi; Ishihara, Yasutoshi

    2010-01-01

    To improve the efficacy of hyperthermia treatment, a novel method for noninvasive measurement of body temperature change is proposed. The proposed thermometry is based on changes in the electromagnetic field distribution inside the heating applicator with temperature, arising from the temperature dependence of the dielectric constant. In addition, an image of the temperature change distribution inside a body is reconstructed by applying a computed tomography (CT) algorithm. The proposed thermometry can serve as a noninvasive method to monitor the temperature change distribution inside the body without large-scale equipment such as magnetic resonance imaging (MRI). Furthermore, this temperature monitoring method can easily be combined with a heating applicator based on a cavity resonator, and the resulting integrated treatment system could be used to treat cancer effectively while noninvasively monitoring the heating effect. In this paper, the phase change distributions of the electromagnetic field with temperature changes are simulated by numerical analysis using the finite-difference time-domain (FDTD) method. Moreover, to estimate the phase change distributions inside a target body, the phase change distributions with temperature changes are reconstructed by filtered back-projection. In addition, the reconstruction accuracy of the temperature change distribution converted from the phase change is evaluated. (author)
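The FDTD analysis mentioned above rests on leapfrog updates of staggered electric and magnetic fields. A generic 1-D kernel in normalized units (a sketch only, far simpler than a 3-D model of a reentrant-cavity applicator; all names and defaults are illustrative):

```python
import math

def fdtd_1d(steps=400, nz=200, src=50, courant=0.5):
    """Minimal 1-D FDTD leapfrog loop (normalized units, PEC-like
    ends): a Gaussian pulse is soft-injected at one cell and the E
    and H fields are updated alternately on a staggered grid."""
    ez = [0.0] * nz  # electric field at integer cells
    hy = [0.0] * nz  # magnetic field at half cells
    for n in range(steps):
        for k in range(1, nz):
            ez[k] += courant * (hy[k - 1] - hy[k])
        ez[src] += math.exp(-((n - 40) / 12.0) ** 2)  # soft Gaussian source
        for k in range(nz - 1):
            hy[k] += courant * (ez[k] - ez[k + 1])
    return ez
```

The Courant factor of 0.5 keeps the scheme stable; in the thermometry application, the quantity of interest would be the phase of such fields as the dielectric constant shifts with temperature.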

  8. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    Directory of Open Access Journals (Sweden)

    Florian Schumacher

    2016-01-01

    Full Text Available Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth’s interior remains of high interest in Earth sciences. Here, we give a description, from a user’s and programmer’s perspective, of the highly modular, flexible and extendable software package ASKI–Analysis of Sensitivity and Kernel Inversion–recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are handled by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  9. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  10. Mathematical analysis of the boundary-integral based electrostatics estimation approximation for molecular solvation: exact results for spherical inclusions.

    Science.gov (United States)

    Bardhan, Jaydeep P; Knepley, Matthew G

    2011-09-28

    We analyze the mathematically rigorous BIBEE (boundary-integral based electrostatics estimation) approximation of the mixed-dielectric continuum model of molecular electrostatics, using the analytically solvable case of a spherical solute containing an arbitrary charge distribution. Our analysis, which builds on Kirkwood's solution using spherical harmonics, clarifies important aspects of the approximation and its relationship to generalized Born models. First, our results suggest a new perspective for analyzing fast electrostatic models: the separation of variables between material properties (the dielectric constants) and geometry (the solute dielectric boundary and charge distribution). Second, we find that the eigenfunctions of the reaction-potential operator are exactly preserved in the BIBEE model for the sphere, which supports the use of this approximation for analyzing charge-charge interactions in molecular binding. Third, a comparison of BIBEE to the recent GBε theory suggests a modified BIBEE model capable of predicting electrostatic solvation free energies to within 4% of a full numerical Poisson calculation. This modified model leads to a projection-framework understanding of BIBEE and suggests opportunities for future improvements. © 2011 American Institute of Physics
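For the spherical solute analyzed here, the l = 0 (central-charge) term of Kirkwood's series reduces to the Born expression for the electrostatic solvation free energy. A sketch, assuming a unit convention with Coulomb's constant in kcal·Å/(mol·e²); the function name and defaults are illustrative, not from the paper:

```python
def born_solvation_energy(q, radius, eps_in, eps_out, coulomb_k=332.06):
    """Electrostatic solvation free energy (kcal/mol) of a single
    charge q (elementary charges) at the center of a spherical cavity
    of the given radius (Angstrom), transferred from a medium of
    dielectric constant eps_in to one of eps_out:
    dG = (k q^2 / 2R) (1/eps_out - 1/eps_in)."""
    return 0.5 * coulomb_k * q * q * (1.0 / eps_out - 1.0 / eps_in) / radius
```

Generalized Born models replace the cavity radius with per-atom effective radii; the BIBEE analysis above concerns how well such fast approximations track the full mixed-dielectric (Poisson) reaction potential, including the higher-order Kirkwood terms this sketch omits.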

  11. Aespoe Hard Rock Laboratory. Analysis of fracture networks based on the integration of structural and hydrogeological observations on different scales

    Energy Technology Data Exchange (ETDEWEB)

    Bossart, P. [Geotechnical Inst. Ltd., Bern (Switzerland); Hermanson, Jan [Golder Associates, Stockholm (Sweden); Mazurek, M. [Univ. of Bern (Switzerland)

    2001-05-01

    Fracture networks at Aespoe have been studied for several rock types exhibiting different degrees of ductile and brittle deformation, as well as on different scales. Mesoscopic fault systems were characterised and classified in an earlier report; this report focuses mainly on fracture networks derived on smaller scales, but also includes mesoscopic and larger scales. The TRUE-1 block has been selected for detailed structural analysis on a small scale due to the high density of relevant information. In addition to the data obtained from core materials, structural maps, BIP data and the results of hydro tests were synthesised to derive a conceptual structural model. The approach used to derive this conceptual model is based on the integration of deterministic structural evidence, probabilistic information and both upscaling and downscaling of observations and concepts derived on different scales. Twelve fracture networks mapped at different sites and scales and exhibiting various styles of tectonic deformation were analysed for fractal properties and structural and hydraulic interconnectedness. It was shown that the analysed fracture networks are not self-similar. An important result is the structural and hydraulic interconnectedness of fracture networks on all scales in the Aespoe rocks, which is further corroborated by geochemical evidence. Due to this interconnectedness of fracture systems on all scales at Aespoe, contaminants from waste canisters placed in tectonically low-deformation environments would be transported - after having passed through the engineered barriers - from low-permeability fractures towards higher-permeability fractures and may thus eventually reach high-permeability features.
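The fractal-property analysis mentioned above is commonly done by box counting on mapped fracture traces. A minimal 2-D sketch (illustrative of the general technique, not the authors' exact procedure):

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the box-counting (fractal) dimension of a 2-D point
    set: count occupied boxes N(s) at each box size s, then fit the
    slope of log N(s) versus log(1/s)."""
    points = np.asarray(points, dtype=float)
    counts = []
    for s in sizes:
        # Assign each point to a box index and count distinct boxes.
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A space-filling pattern gives a slope near 2, a single trace near 1; a pattern is self-similar only if log N(s) versus log(1/s) is straight across scales, and departures from that line are one way a network like those above fails self-similarity.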

  12. Aespoe Hard Rock Laboratory. Analysis of fracture networks based on the integration of structural and hydrogeological observations on different scales

    International Nuclear Information System (INIS)

    Bossart, P.; Hermanson, Jan; Mazurek, M.

    2001-05-01

    Fracture networks at Aespoe have been studied for several rock types exhibiting different degrees of ductile and brittle deformation, as well as on different scales. Mesoscopic fault systems were characterised and classified in an earlier report; this report focuses mainly on fracture networks derived on smaller scales, but also includes mesoscopic and larger scales. The TRUE-1 block has been selected for detailed structural analysis on a small scale due to the high density of relevant information. In addition to the data obtained from core materials, structural maps, BIP data and the results of hydro tests were synthesised to derive a conceptual structural model. The approach used to derive this conceptual model is based on the integration of deterministic structural evidence, probabilistic information and both upscaling and downscaling of observations and concepts derived on different scales. Twelve fracture networks mapped at different sites and scales and exhibiting various styles of tectonic deformation were analysed for fractal properties and structural and hydraulic interconnectedness. It was shown that the analysed fracture networks are not self-similar. An important result is the structural and hydraulic interconnectedness of fracture networks on all scales in the Aespoe rocks, which is further corroborated by geochemical evidence. Due to this interconnectedness of fracture systems on all scales at Aespoe, contaminants from waste canisters placed in tectonically low-deformation environments would be transported - after having passed through the engineered barriers - from low-permeability fractures towards higher-permeability fractures and may thus eventually reach high-permeability features

  13. Modeling and Analysis of Hybrid Cellular/WLAN Systems with Integrated Service-Based Vertical Handoff Schemes

    Science.gov (United States)

    Xia, Weiwei; Shen, Lianfeng

    We propose two vertical handoff schemes for cellular network and wireless local area network (WLAN) integration: integrated service-based handoff (ISH) and integrated service-based handoff with queue capabilities (ISHQ). Compared with existing handoff schemes in integrated cellular/WLAN networks, the proposed schemes consider a more comprehensive set of system characteristics such as different features of voice and data services, dynamic information about the admitted calls, user mobility and vertical handoffs in two directions. The code division multiple access (CDMA) cellular network and IEEE 802.11e WLAN are taken into account in the proposed schemes. We model the integrated networks by using multi-dimensional Markov chains and the major performance measures are derived for voice and data services. The important system parameters such as thresholds to prioritize handoff voice calls and queue sizes are optimized. Numerical results demonstrate that the proposed ISHQ scheme can maximize the utilization of overall bandwidth resources with the best quality of service (QoS) provisioning for voice and data services.
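
    The threshold-based prioritisation of handoff voice calls described above can be illustrated, in a much-reduced single-cell form, by the classical guard-channel birth-death chain, in which the last g of C channels are reserved for handoff arrivals. This sketch is not the paper's multi-dimensional Markov model; all rates below are illustrative:

```python
def guard_channel_blocking(C, g, lam_new, lam_ho, mu):
    """Stationary blocking probabilities for one cell with C channels,
    the last g of which are reserved (guard channels) for handoff calls.

    Birth rate is lam_new + lam_ho below the threshold C - g and lam_ho
    above it; death rate in state n is n * mu.
    Returns (P_block_new, P_block_handoff).
    """
    thr = C - g
    # Unnormalised stationary probabilities via detailed balance.
    pi = [1.0]
    for n in range(1, C + 1):
        rate_in = (lam_new + lam_ho) if n - 1 < thr else lam_ho
        pi.append(pi[-1] * rate_in / (n * mu))
    Z = sum(pi)
    pi = [p / Z for p in pi]
    p_new = sum(pi[thr:])   # new calls are blocked at or above the threshold
    p_ho = pi[C]            # handoff calls are blocked only when the cell is full
    return p_new, p_ho
```

With g = 0 both classes see the ordinary Erlang-B blocking; increasing g trades higher new-call blocking for lower handoff dropping, which is exactly the tuning knob the paper optimises.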

  14. The Impact of Information System-Enabled Supply Chain Process Integration on Business Performance: A Resource-Based Analysis

    OpenAIRE

    Morteza Ghobakhloo; Sai Hong Tang; Mohammad Sadegh Sabouri; Norzima Zulkifli

    2014-01-01

    This paper seeks to develop and test a model to examine the relationships between technical aspects of IS resources (IS alignment, IS resources technical quality, IS advancement), supply chain process integration, and firm performance. A questionnaire-based survey was conducted to collect data from 227 supply chain, logistics, or procurement/purchasing managers of leading manufacturing and retail organizations. Drawing on the resource-based view of the firm, and through extending the concept of...

  15. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Effective software integration is therefore a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and with developing software integration architecture.

  16. Analysis of silicon-based integrated photovoltaic-electrochemical hydrogen generation system under varying temperature and illumination

    Institute of Scientific and Technical Information of China (English)

    Vishwa Bhatt; Brijesh Tripathi; Pankaj Yadav; Manoj Kumar

    2017-01-01

    The last decade witnessed tremendous research and development in the area of photo-electrolytic hydrogen generation using chemically stable nanostructured photo-cathode/anode materials. Due to intimately coupled charge separation and photo-catalytic processes, it is very difficult to optimize individual components of such a system, leading to a very low demonstrated solar-to-fuel efficiency (SFE) of less than 1%. Recently there has been growing interest in an integrated photovoltaic-electrochemical (PV-EC) system based on GaAs solar cells with a demonstrated SFE of 24.5% under concentrated illumination conditions. But the high cost of GaAs-based solar cells and the recent price drop of poly-crystalline silicon (pc-Si) solar cells motivated researchers to explore silicon-based integrated PV-EC systems. In this paper a theoretical framework is introduced to model a silicon-based integrated PV-EC device. The theoretical framework is used to analyze the coupling and kinetic losses of a silicon solar cell based integrated PV-EC water splitting system under varying temperature and illumination. The kinetic loss occurs in the range of 19.1%-27.9% and the coupling loss in the range of 5.45%-6.74% with respect to varying illumination in the range of 20-100 mW/cm2. Similarly, varying temperature has a severe impact on the performance of the system, wherein the coupling loss occurs in the range of 0.84%-21.51% for temperature variation from 25 to 50 °C.
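
    The coupling loss quantified above arises because the PV operating point, pinned by the electrolyser load curve, generally misses the PV maximum power point. A toy sketch of that calculation (idealised single-diode PV curve, linearised electrolyser; every parameter value below is assumed for illustration, not taken from the paper):

```python
import numpy as np

# Illustrative parameters (assumed): PV short-circuit current (A),
# diode saturation current (A), effective thermal voltage of the stack (V),
# electrolyser reversible voltage (V) and ohmic slope (ohm).
ISC, I0, VT = 3.0, 1e-9, 0.15
V_REV, R_EC = 1.48, 0.35

def pv_current(v):
    """Ideal single-diode PV curve; series/shunt resistances ignored."""
    return ISC - I0 * np.expm1(v / VT)

def coupling_loss():
    v = np.linspace(0.0, 5.0, 200001)
    i = np.clip(pv_current(v), 0.0, None)
    p_mpp = np.max(v * i)                        # PV maximum power point
    # Operating point: PV current equals electrolyser current (v - V_REV)/R_EC.
    i_ec = np.clip((v - V_REV) / R_EC, 0.0, None)
    idx = np.argmin(np.abs(i - i_ec))
    p_op = v[idx] * i[idx]
    return 1.0 - p_op / p_mpp

loss = coupling_loss()
```

With these assumed numbers the loss lands in the single-digit-percent range, the same order as the 5.45%-6.74% reported above for varying illumination.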

  17. From heat integration targets toward implementation – A TSA (total site analysis)-based design approach for heat recovery systems in industrial clusters

    International Nuclear Information System (INIS)

    Hackl, Roman; Harvey, Simon

    2015-01-01

    The European process industry is facing major challenges to decrease production costs. One strategy to achieve this is by increasing energy efficiency. Single chemical processes are often well-integrated and the tools to target and design such measures are well developed. Site-wide heat integration based on total site analysis tools can be used to identify opportunities to further increase energy efficiency. However, the methodology has to be developed further in order to enable identification of practical heat integration measures in a systematic way. Designing site-wide heat recovery systems across an industrial cluster is complex and involves aspects apart from thermal process and utility flows. This work presents a method for designing a roadmap of heat integration investments based on total site analysis. The method is applied to a chemical cluster in Sweden. The results of the case study show that application of the proposed method can achieve up to 42% of the previously targeted hot utility savings of 129 MW. A roadmap of heat integration systems is suggested, ranging from less complex systems that achieve a minor share of the heat recovery potential to sophisticated, strongly interdependent systems demanding large investments and a high level of collaboration. - Highlights: • Methodology focused on the practical implementation of site-wide heat recovery. • Algorithm to determine a roadmap of heat integration investments. • Case study: 42% hot utility savings potential at a pay-back period of 3.9y.
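
    The total-site targeting step that such a design method starts from rests on the problem-table (cascade) algorithm of pinch analysis: shift stream temperatures by half the minimum approach, cascade the net heat through the temperature intervals, and read the minimum hot-utility target off the largest deficit. A compact sketch, exercised on a standard four-stream textbook example rather than the paper's cluster data:

```python
def hot_utility_target(streams, dt_min):
    """Problem-table algorithm: minimum hot-utility target (kW).

    streams: list of (T_supply, T_target, CP) with CP in kW/K;
    hot streams have T_supply > T_target.
    """
    # Shift hot streams down and cold streams up by dt_min / 2.
    shifted = []
    for ts, tt, cp in streams:
        shift = -dt_min / 2 if ts > tt else dt_min / 2
        shifted.append((ts + shift, tt + shift, cp))
    temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(temps, temps[1:]):
        net_cp = 0.0
        for ts, tt, cp in shifted:
            if min(ts, tt) <= lo and max(ts, tt) >= hi:
                net_cp += cp if ts > tt else -cp   # hot streams release heat
        heat += net_cp * (hi - lo)
        cascade.append(heat)
    return max(0.0, -min(cascade))

# Classic four-stream example: two hot, two cold streams, dt_min = 10 K.
q_h_min = hot_utility_target(
    [(250, 40, 0.15), (200, 80, 0.25), (20, 180, 0.2), (140, 230, 0.3)], 10)
```

Site-wide targeting as used in the paper applies the same cascade idea across the utility system of the whole cluster rather than within one process.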

  18. International Space Station Configuration Analysis and Integration

    Science.gov (United States)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  19. A modified precise integration method based on Magnus expansion for transient response analysis of time varying dynamical structure

    International Nuclear Information System (INIS)

    Yue, Cong; Ren, Xingmin; Yang, Yongfeng; Deng, Wangqun

    2016-01-01

    This paper provides a precise and efficient methodology for computing the forced vibration response of a time-varying linear rotational structure subjected to unbalanced excitation. A modified algorithm based on the time-step precise integration method and the Magnus expansion is developed for instantaneous dynamic problems. The iterative solution is achieved through the ideas of the transition matrix and the dimensional increment matrix. Numerical examples on a typical accelerating rotation system considering gyroscopic moment and mass unbalance force demonstrate the validity, effectiveness and accuracy of the approach in comparison with the Newmark-β method. It is shown that the proposed algorithm achieves high accuracy without loss of efficiency.
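
    The precise integration method referred to above computes the transition matrix exp(A·τ) by Zhong's 2^N doubling algorithm, carrying the small increment exp(A·dt) − I rather than the full matrix so that adding the identity does not swamp the tiny terms. A minimal sketch for the constant-coefficient case only (the paper's time-varying extension via the Magnus expansion is not reproduced here):

```python
import numpy as np

def precise_integration_expm(A, tau, N=20, terms=4):
    """Precise integration (2^N) algorithm for the transition matrix exp(A*tau).

    The increment Ta = exp(A*dt) - I is built from a truncated Taylor series
    at dt = tau / 2**N, then doubled N times with the exact identity
    exp(2h) - I = 2*(exp(h) - I) + (exp(h) - I)^2.
    """
    n = A.shape[0]
    dt = tau / 2.0 ** N
    Ta = np.zeros_like(A, dtype=float)
    term = np.eye(n)
    for k in range(1, terms + 1):
        term = term @ A * dt / k      # k-th Taylor term (A*dt)^k / k!
        Ta += term
    for _ in range(N):
        Ta = 2.0 * Ta + Ta @ Ta       # doubling step on the increment
    return np.eye(n) + Ta
```

A quick check: for A = [[0, 1], [-1, 0]], exp(A·t) is a rotation by angle t, so the computed matrix should match [[cos t, sin t], [-sin t, cos t]] to near machine precision.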

  20. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    Science.gov (United States)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    A computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis is described. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.

  1. Thermodynamic analysis and optimization of an integrated Rankine power cycle and nano-fluid based parabolic trough solar collector

    International Nuclear Information System (INIS)

    Toghyani, Somayeh; Baniasadi, Ehsan; Afshari, Ebrahim

    2016-01-01

    Highlights: • The performance of an integrated nano-fluid based solar Rankine cycle is studied. • The effect of solar intensity, ambient temperature, and volume fraction is evaluated. • The concept of Finite Time Thermodynamics is applied. • It is shown that CuO/oil nano-fluid has the best performance from an exergy perspective. - Abstract: In this paper, the performance of an integrated Rankine power cycle with a parabolic trough solar system and a thermal storage system is simulated based on four different nano-fluids in the solar collector system, namely CuO, SiO_2, TiO_2 and Al_2O_3. The effects of solar intensity, dead state temperature, and volume fraction of different nano-particles on the performance of the integrated cycle are studied using the second law of thermodynamics. Also, a genetic algorithm is applied to optimize the net output power of the solar Rankine cycle. The solar thermal energy is stored in a two-tank system to improve the overall performance of the system when sunlight is not available. The concept of Finite Time Thermodynamics is applied for analyzing the performance of the solar collector and thermal energy storage system. This study reveals that by increasing the volume fraction of nano-particles, the exergy efficiency of the system increases. At higher dead state temperatures, the overall exergy efficiency is increased, and higher solar irradiation leads to a considerable increase in the output power of the system. It is shown that among the selected nano-fluids, CuO/oil has the best performance from an exergy perspective.
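
    A second-law evaluation of the collector portion of such a system typically divides the useful thermal exergy (heat weighted by the Carnot factor at the mean fluid temperature) by the exergy of the incident radiation (solar flux weighted by the Petela factor). A sketch with illustrative numbers only, not the paper's parameters:

```python
def collector_exergy_efficiency(q_useful, t_fluid, g, area,
                                t0=298.15, t_sun=5770.0):
    """Exergy efficiency of a solar collector.

    q_useful: useful heat gain (W); t_fluid: mean fluid temperature (K);
    g: solar irradiance (W/m^2); area: aperture area (m^2);
    t0: dead-state temperature (K); t_sun: apparent sun temperature (K).
    """
    # Useful thermal exergy: heat discounted by the Carnot factor.
    ex_useful = q_useful * (1.0 - t0 / t_fluid)
    # Exergy of solar radiation via the Petela factor.
    petela = 1.0 - (4.0 / 3.0) * (t0 / t_sun) + (1.0 / 3.0) * (t0 / t_sun) ** 4
    ex_solar = g * area * petela
    return ex_useful / ex_solar

eff = collector_exergy_efficiency(700.0, 500.0, 1000.0, 1.0)
```

Because the Carnot factor grows with fluid temperature, exergy efficiency can rise even when first-law efficiency falls, which is why the paper evaluates the nano-fluids from the exergy perspective.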

  2. Integration of Nanoparticle-Based Paper Sensors into the Classroom: An Example of Application for Rapid Colorimetric Analysis of Antioxidants

    Science.gov (United States)

    Sharpe, Erica; Andreescu, Silvana

    2015-01-01

    We describe a laboratory experiment that employs the Nanoceria Reducing Antioxidant Capacity (or NanoCerac) Assay to introduce students to portable nanoparticle-based paper sensors for rapid analysis and field detection of polyphenol antioxidants. The experiment gives students a hands-on opportunity to utilize nanoparticle chemistry to develop…

  3. Treatment Integrity of School-Based Interventions with Children in the "Journal of Applied Behavior Analysis" 1991-2005

    Science.gov (United States)

    McIntyre, Laura Lee; Gresham, Frank M.; DiGennaro, Florence D.; Reed, Derek D.

    2007-01-01

    We reviewed all school-based experimental studies with individuals 0 to 18 years published in the "Journal of Applied Behavior Analysis" (JABA) between 1991 and 2005. A total of 142 articles (152 studies) that met review criteria were included. Nearly all (95%) of these experiments provided an operational definition of the independent variable,…

  4. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents some of the methods used to incorporate the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships of component parts. This relation function has three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristic feature. This paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside the product were integrated. Therefore, the proposed algorithm was used to conduct research to improve the brake discs for bicycles. As a result, an improved product similar to the related function structure was actually created.

  5. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents some of the methods used to incorporate the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships of component parts. This relation function has three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristic feature. This paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside the product were integrated. Therefore, the proposed algorithm was used to conduct research to improve the brake discs for bicycles. As a result, an improved product similar to the related function structure was actually created.

  6. PopHR: a knowledge-based platform to support integration, analysis, and visualization of population health data.

    Science.gov (United States)

    Shaban-Nejad, Arash; Lavigne, Maxime; Okhmatovskaia, Anya; Buckeridge, David L

    2017-01-01

    Population health decision makers must consider complex relationships between multiple concepts measured with differential accuracy from heterogeneous data sources. Population health information systems are currently limited in their ability to integrate data and present a coherent portrait of population health. Consequently, these systems can provide only basic support for decision makers. The Population Health Record (PopHR) is a semantic web application that automates the integration and extraction of massive amounts of heterogeneous data from multiple distributed sources (e.g., administrative data, clinical records, and survey responses) to support the measurement and monitoring of population health and health system performance for a defined population. The design of the PopHR draws on theories of the determinants of health and evidence-based public health to harmonize and explicitly link information about a population with evidence about the epidemiology and control of chronic diseases. Organizing information in this manner and linking it explicitly to evidence is expected to improve decision making related to the planning, implementation, and evaluation of population health and health system interventions. In this paper, we describe the PopHR platform and discuss its architecture, design, and key modules, and its implementation and use. © 2016 New York Academy of Sciences.

  7. An integrated genetic data environment (GDE)-based LINUX interface for analysis of HIV-1 and other microbial sequences.

    Science.gov (United States)

    De Oliveira, T; Miller, R; Tarin, M; Cassol, S

    2003-01-01

    Sequence databases encode a wealth of information needed to develop improved vaccination and treatment strategies for the control of HIV and other important pathogens. To facilitate effective utilization of these datasets, we developed a user-friendly GDE-based LINUX interface that reduces input/output file formatting. GDE was adapted to the Linux operating system, bioinformatics tools were integrated with microbe-specific databases, and up-to-date GDE menus were developed for several clinically important viral, bacterial and parasitic genomes. Each microbial interface was designed for local access and contains Genbank, BLAST-formatted and phylogenetic databases. GDE-Linux is available for research purposes by direct application to the corresponding author. Application-specific menus and support files can be downloaded from (http://www.bioafrica.net).

  8. Design and System Analysis of Quad-Generation Plant Based on Biomass Gasification Integrated with District Heating

    DEFF Research Database (Denmark)

    Rudra, Souman

    alternative by upgrading existing district heating plant. It provides a generic modeling framework to design flexible energy system in near future. These frameworks address the three main issues arising in the planning and designing of energy system: a) socio impact at both planning and proses design level; b...... in this study. The overall aim of this work is to provide a complete assessment of the technical potential of biomass gasification for local heat and power supply in Denmark and replace of natural gas for the production. This study also finds and defines the future areas of research in the gasification......, it possible to lay a foundation for future gasification based power sector to produce flexible output such as electricity, heat, chemicals or bio-fuels by improving energy system of existing DHP(district heating plant) integrating gasification technology. The present study investigate energy system...

  9. Digital gene expression analysis based on integrated de novo transcriptome assembly of sweet potato [Ipomoea batatas (L.) Lam].

    Directory of Open Access Journals (Sweden)

    Xiang Tao

    BACKGROUND: Sweet potato (Ipomoea batatas (L.) Lam) ranks among the top six most important food crops in the world. It is widely grown throughout the world with high and stable yield, strong adaptability, rich nutrient content, and multiple uses. However, little is known about the molecular biology of this important non-model organism due to the lack of genomic resources. Hence, studies based on high-throughput sequencing technologies are needed to obtain a comprehensive and integrated genomic resource and a better understanding of gene expression patterns in different tissues and at various developmental stages. METHODOLOGY/PRINCIPAL FINDINGS: Illumina paired-end (PE) RNA-Sequencing was performed and generated 48.7 million 75-bp PE reads. These reads were de novo assembled into 128,052 transcripts (≥ 100 bp), corresponding to 41.1 million base pairs, by using a combined assembly strategy. Transcripts were annotated with Blast2GO: 51,763 transcripts got BLASTX hits, of which 39,677 transcripts have GO terms and 14,117 have ECs that are associated with 147 KEGG pathways. Furthermore, transcriptome differences among seven tissues were analyzed by using Illumina digital gene expression (DGE) tag profiling, and numerous differentially and specifically expressed transcripts were identified. Moreover, the expression characteristics of genes involved in viral genomes, starch metabolism and potential stress tolerance and insect resistance were also identified. CONCLUSIONS/SIGNIFICANCE: The combined de novo transcriptome assembly strategy can be applied to other organisms whose reference genomes are not available. The data provided here represent the most comprehensive and integrated genomic resources for cloning and identifying genes of interest in sweet potato. Characterization of the sweet potato transcriptome provides an effective tool for better understanding the molecular mechanisms of cellular processes including development of leaves and storage roots

  10. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran

    International Nuclear Information System (INIS)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-01-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which GIS can handle as an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate or analysis map layers, a final overlay map was obtained representing areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlaying, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and based on the numerical evaluation provided by the MCDA the most suitable sites were determined.
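
    The overlay stage of such a GIS/MCDA workflow reduces to masking out exclusionary layers and summing weighted, normalised criterion rasters. A minimal sketch on tiny invented rasters (the paper's 21 exclusionary and 14 non-exclusionary criteria are not reproduced):

```python
import numpy as np

def suitability_index(criteria, weights, exclusion_masks):
    """Weighted-overlay suitability for raster criterion layers.

    criteria: 2-D arrays scaled to [0, 1] (1 = most suitable);
    weights: relative importance values, normalised internally;
    exclusion_masks: boolean layers where True marks excluded land.
    Cells excluded by any mask receive a suitability of 0.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    # Weighted linear combination of the criterion layers.
    score = sum(wi * np.asarray(c, dtype=float) for wi, c in zip(w, criteria))
    excluded = np.zeros_like(score, dtype=bool)
    for m in exclusion_masks:
        excluded |= np.asarray(m, dtype=bool)
    return np.where(excluded, 0.0, score)

# Two 2x2 criterion rasters, one exclusion mask.
c1 = np.array([[1.0, 0.0], [0.5, 1.0]])
c2 = np.array([[0.0, 1.0], [0.5, 0.0]])
mask = np.array([[False, False], [False, True]])
s = suitability_index([c1, c2], [3, 1], [mask])
```

In the study the weights themselves come from expert judgement; methods such as pairwise comparison are the usual way to elicit them, but any normalised weight vector plugs into the same overlay.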

  11. Integration of Rural Community Pharmacies into a Rural Family Medicine Practice-Based Research Network: A Descriptive Analysis

    Directory of Open Access Journals (Sweden)

    Nicholas E. Hagemeier

    2015-01-01

    Purpose: Practice-based research networks (PBRNs) seek to shorten the gap between research and application in primary patient care settings. Inclusion of community pharmacies in primary care PBRNs is relatively unexplored. Such a PBRN model could improve care coordination and community-based research, especially in rural and underserved areas. The objectives of this study were to: (1) evaluate rural Appalachian community pharmacy key informants' perceptions of PBRNs and practice-based research; (2) explore key informants' perceptions of the perceived applicability of practice-based research domains; and (3) explore pharmacy key informant interest in PBRN participation. Methods: The sample consisted of community pharmacies within the city limits of all Appalachian Research Network (AppNET) PBRN communities in South Central Appalachia. A descriptive, cross-sectional, questionnaire-based study was conducted from November 2013 to February 2014. Bivariate and multivariate analyses were conducted to examine associations between key informant and practice characteristics, and PBRN interest and perceptions. Findings: A 47.8% response rate was obtained. Most key informants (88%) were very or somewhat interested in participating in AppNET. Enrichment of patient care (82.8%), improved relationships with providers in the community (75.9%), and professional development opportunities (69.0%) were perceived by more than two-thirds of respondents to be very beneficial outcomes of PBRN participation. Respondents ranked time constraints (63%) and workflow disruptions (20%) as the biggest barriers to PBRN participation. Conclusion: Key informants in rural Appalachian community pharmacies indicated interest in PBRN participation. Integration of community pharmacies into existing rural PBRNs could advance community-level care coordination and promote improved health outcomes in rural and underserved areas.

  12. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition.

    Science.gov (United States)

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2016-01-01

    MBE vision. Finally, it also seeks to explore the interaction between CAD and CMM processes and determine if the concept of feedback from CAM and CMM back to CAD is feasible. The main goal of our study is to test the hypothesis that model-based-data interoperability from CAD-to-CAM and CAD-to-CMM is feasible through standards-based integration. This paper presents several barriers to model-based-data interoperability. Overall, the project team demonstrated the exchange of product definition data between CAD, CAM, and CMM systems using standards-based methods. While gaps in standards coverage were identified, the gaps should not stop industry's progress toward MBE. The results of our study provide evidence in support of an open-standards method to model-based-data interoperability, which would provide maximum value and impact to industry.

  13. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database

    KAUST Repository

    Komatsu, Setsuko

    2017-05-10

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max ‘Enrei’). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. Biological significance: The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all

  14. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database.

    Science.gov (United States)

    Komatsu, Setsuko; Wang, Xin; Yin, Xiaojian; Nanjo, Yohei; Ohyanagi, Hajime; Sakata, Katsumi

    2017-06-23

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max 'Enrei'). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all predicted proteins from

  15. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database

    KAUST Repository

    Komatsu, Setsuko; Wang, Xin; Yin, Xiaojian; Nanjo, Yohei; Ohyanagi, Hajime; Sakata, Katsumi

    2017-01-01

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max ‘Enrei’). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. Biological significance: The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. It accounts for approximately 80% of all

  16. An integrated portfolio optimisation procedure based on data envelopment analysis, artificial bee colony algorithm and genetic programming

    Science.gov (United States)

    Hsu, Chih-Ming

    2014-12-01

    Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and the determination of the optimal timing for buying/selling those targets. In this study, an integrated procedure using data envelopment analysis (DEA), artificial bee colony (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market for 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus making profits in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.
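As a rough illustration of the bee-colony step in such a procedure, the sketch below hill-climbs portfolio weights toward a higher expected return. This is a toy objective only: the actual study couples DEA-based stock screening and GP-based timing rules, which are omitted here, and all function names and parameters are illustrative assumptions.

```python
import random

def portfolio_return(weights, asset_returns):
    """Expected portfolio return as the weighted sum of asset returns."""
    return sum(w * r for w, r in zip(weights, asset_returns))

def normalize(weights):
    """Rescale weights so they sum to one (fully invested portfolio)."""
    total = sum(weights)
    return [w / total for w in weights]

def abc_search(asset_returns, n_bees=20, n_iter=200, seed=42):
    """Toy ABC-style search: scouts propose random weight vectors,
    employed bees perturb them, and improving moves are kept."""
    rng = random.Random(seed)
    dim = len(asset_returns)
    sources = [normalize([rng.random() for _ in range(dim)]) for _ in range(n_bees)]
    best_w, best_f = None, float("-inf")
    for _ in range(n_iter):
        for i, w in enumerate(sources):
            # employed bee: perturb one dimension, keep if it improves
            cand = w[:]
            j = rng.randrange(dim)
            cand[j] = max(1e-9, cand[j] + rng.uniform(-0.1, 0.1))
            cand = normalize(cand)
            if portfolio_return(cand, asset_returns) > portfolio_return(w, asset_returns):
                sources[i] = cand
        for w in sources:
            f = portfolio_return(w, asset_returns)
            if f > best_f:
                best_w, best_f = w, f
    return best_w, best_f
```

A real implementation would also include the onlooker and scout phases of ABC and a risk term in the objective.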

  17. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Science.gov (United States)

    Carazzolle, Marcelo Falsarella; de Carvalho, Lucas Miguel; Slepicka, Hugo Henrique; Vidal, Ramon Oliveira; Pereira, Gonçalo Amarante Guimarães; Kobarg, Jörg; Meirelles, Gabriela Vaz

    2014-01-01

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We have developed IIS by integrating diverse databases in response to the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two

  18. Nondestructive Analysis of Tumor-Associated Membrane Protein Integrating Imaging and Amplified Detection in situ Based on Dual-Labeled DNAzyme.

    Science.gov (United States)

    Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi

    2018-01-01

    Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e. ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs, respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method, in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to the advantage of two-in-one, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive. Cells after analysis could retain their physiological activity and could be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, which offers potential application as a toolbox in the future.

  19. Integrative biological analysis for neuropsychopharmacology.

    Science.gov (United States)

    Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L

    2014-01-01

    Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches--proteomics, transcriptomics, metabolomics, and glycomics--have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.

  20. CORE-BASED INTEGRATED SEDIMENTOLOGIC, STRATIGRAPHIC, AND GEOCHEMICAL ANALYSIS OF THE OIL SHALE BEARING GREEN RIVER FORMATION, UINTA BASIN, UTAH

    Energy Technology Data Exchange (ETDEWEB)

    Lauren P. Birgenheier; Michael D. Vanden Berg,

    2011-04-11

    An integrated detailed sedimentologic, stratigraphic, and geochemical study of Utah's Green River Formation has found that Lake Uinta evolved in three phases: (1) a freshwater rising lake phase below the Mahogany zone, (2) an anoxic deep lake phase above the base of the Mahogany zone, and (3) a hypersaline lake phase within the middle and upper R-8. This long-term lake evolution was driven by tectonic basin development and the balance of sediment and water fill with the neighboring basins, as postulated by models developed from the Greater Green River Basin by Carroll and Bohacs (1999). Early Eocene abrupt global-warming events may have had significant control on deposition through the amount of sediment production and deposition rates, such that lean zones below the Mahogany zone record hyperthermal events and rich zones record periods between hyperthermals. This type of climatic control on short-term and long-term lake evolution and deposition has been previously overlooked. This geologic history contains key points relevant to oil shale development and engineering design including: (1) Stratigraphic changes in oil shale quality and composition are systematic and can be related to spatial and temporal changes in the depositional environment and basin dynamics. (2) The inorganic mineral matrix of oil shale units changes significantly from clay mineral/dolomite dominated to calcite above the base of the Mahogany zone. This variation may result in significant differences in pyrolysis products and geomechanical properties relevant to development and should be incorporated into engineering experiments. (3) This study includes a region in the Uinta Basin that would be highly prospective for application of in-situ production techniques. Stratigraphic targets for in-situ recovery techniques should extend above and below the Mahogany zone and include the upper R-6 and lower R-8.

  1. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion about the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  2. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  3. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance
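The cut set generation that a fault tree tool like IRRAS performs can be illustrated with a minimal MOCUS-style sketch. The gate representation, function names, and examples below are illustrative assumptions, not IRRAS code or its actual algorithms.

```python
def minimal_cut_sets(gates, top):
    """Expand a fault tree into minimal cut sets (MOCUS-style).
    `gates` maps a gate name to ("AND"|"OR", [inputs]); any name not
    present in `gates` is treated as a basic event."""
    def expand(node):
        if node not in gates:
            return [frozenset([node])]
        op, inputs = gates[node]
        child_sets = [expand(i) for i in inputs]
        if op == "OR":
            # an OR gate fails if any input's cut set occurs
            return [cs for sets in child_sets for cs in sets]
        # AND gate: cross-product union of the children's cut sets
        result = [frozenset()]
        for sets in child_sets:
            result = [r | cs for r in result for cs in sets]
        return result
    sets_ = expand(top)
    # drop non-minimal cut sets (strict supersets of another cut set)
    minimal = [s for s in sets_ if not any(t < s for t in sets_)]
    return sorted(set(minimal), key=sorted)
```

For the tree TOP = A OR (B AND C), this yields the cut sets {A} and {B, C}; production tools add quantification (probabilities, importance measures) on top of such an expansion.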

  4. Integrating fuzzy object based image analysis and ant colony optimization for road extraction from remotely sensed images

    Science.gov (United States)

    Maboudi, Mehdi; Amini, Jalal; Malihi, Shirin; Hahn, Michael

    2018-04-01

    An updated road network, as a crucial part of the transportation database, plays an important role in various applications. Thus, increasing the automation of road extraction approaches from remote sensing images has been the subject of extensive research. In this paper, we propose an object based road extraction approach from very high resolution satellite images. Based on object based image analysis, our approach incorporates various spatial, spectral, and textural object descriptors, the capabilities of the fuzzy logic system for handling the uncertainties in road modelling, and the effectiveness and suitability of the ant colony algorithm for optimization of network related problems. Four VHR optical satellite images acquired by the Worldview-2 and IKONOS satellites are used in order to evaluate the proposed approach. Evaluation of the extracted road networks shows that the average completeness, correctness, and quality of the results can reach 89%, 93%, and 83%, respectively, indicating that the proposed approach is applicable for urban road extraction. We also analyzed the sensitivity of our algorithm to different ant colony optimization parameter values. Comparison of the achieved results with the results of four state-of-the-art algorithms and quantifying the robustness of the fuzzy rule set demonstrate that the proposed approach is both efficient and transferable to other comparable images.
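The completeness, correctness, and quality figures quoted above follow the standard road-extraction definitions over matched (TP), spurious (FP), and missed (FN) road length or pixels; a minimal sketch (the function name and example counts are illustrative):

```python
def extraction_metrics(tp, fp, fn):
    """Standard road-extraction evaluation measures from matched (TP),
    spurious (FP) and missed (FN) road length/pixels."""
    completeness = tp / (tp + fn)       # share of the reference network found
    correctness  = tp / (tp + fp)       # share of the extraction that is road
    quality      = tp / (tp + fp + fn)  # combined measure, <= both above
    return completeness, correctness, quality
```

Quality is always the lowest of the three, which is consistent with the 89%/93%/83% averages reported in the abstract.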

  5. [Integrated health care organizations: guideline for analysis].

    Science.gov (United States)

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHO) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing the IHO's performance in relation to their final objectives of improving the efficiency and continuity of health care by an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of services' payment mechanisms); and internal elements (model of government, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  6. An Integrated Solution-Based Rapid Sample Preparation Procedure for the Analysis of N-Glycans From Therapeutic Monoclonal Antibodies.

    Science.gov (United States)

    Aich, Udayanath; Liu, Aston; Lakbub, Jude; Mozdzanowski, Jacek; Byrne, Michael; Shah, Nilesh; Galosy, Sybille; Patel, Pramthesh; Bam, Narendra

    2016-03-01

    Consistent glycosylation in therapeutic monoclonal antibodies is a major concern in the biopharmaceutical industry as it impacts the drug's safety and efficacy and manufacturing processes. Large numbers of samples are created for the analysis of glycans during various stages of recombinant protein drug development. Profiling and quantifying protein N-glycosylation is important but extremely challenging due to its microheterogeneity and, more importantly, the limitations of existing time-consuming sample preparation methods. Thus, a quantitative method with fast sample preparation is crucial for understanding, controlling, and modifying the glycoform variance in therapeutic monoclonal antibody development. Presented here is a rapid and highly quantitative method for the analysis of N-glycans from monoclonal antibodies. The method comprises a simple and fast solution-based sample preparation procedure that uses nontoxic reducing reagents for direct labeling of N-glycans. The complete work flow for the preparation of fluorescently labeled N-glycans takes a total of 3 h, with less than 30 min needed for the release of N-glycans from monoclonal antibody samples. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  7. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion about the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  8. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    Gmeiner, L.

    1984-06-01

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL)

  9. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios Ladoukakis

    2014-11-01

    The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e. Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering data management, and even storage, critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is aggravated further by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, making the utilization of cloud computing infrastructures the sole realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various major sequencing technologies and applications.

  10. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  11. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for public data releases and outreach

    CERN Document Server

    Banda, Tea; CERN. Geneva. EP Department

    2016-01-01

    The project consists in the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology and the CERN-ROOT computing service SWAN.

  12. Energy and carbon emissions analysis and prediction of complex petrochemical systems based on an improved extreme learning machine integrated interpretative structural model

    International Nuclear Information System (INIS)

    Han, Yongming; Zhu, Qunxiong; Geng, Zhiqiang; Xu, Yuan

    2017-01-01

    Highlights: • The ELM integrated ISM (ISM-ELM) method is proposed. • The proposed method is more efficient and accurate than the ELM on the UCI data set. • Energy and carbon emissions analysis and prediction of petrochemical industries based on ISM-ELM is obtained. • The proposed method is valid in improving energy efficiency and reducing carbon emissions of ethylene plants. - Abstract: Energy saving and carbon emissions reduction of the petrochemical industry are affected by many factors. Thus, it is difficult to analyze and optimize the energy use of complex petrochemical systems accurately. This paper proposes an energy and carbon emissions analysis and prediction approach based on an improved extreme learning machine (ELM) integrated with an interpretative structural model (ISM) (ISM-ELM). ISM based on the partial correlation coefficient is utilized to analyze key parameters that affect the energy and carbon emissions of the complex petrochemical system, and can denoise and reduce the dimensions of data to decrease the training time and errors of the ELM prediction model. Meanwhile, in terms of model accuracy and training time, the ISM-ELM model is more robust and effective than the ELM, as demonstrated on standard data sets from the University of California Irvine (UCI) repository. Moreover, a multi-input, single-output (MISO) model of energy and carbon emissions of complex ethylene systems is established based on the ISM-ELM. Finally, detailed analyses and simulations using real ethylene plant data demonstrate the effectiveness of the ISM-ELM and can guide the improvement direction of energy saving and carbon emissions reduction in complex petrochemical systems.
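A minimal single-hidden-layer ELM, the learner underlying ISM-ELM, can be sketched as follows: input weights are drawn at random and only the output weights are fit by least squares. This is a generic ELM sketch, not the paper's implementation, and all names are illustrative.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    """Train a basic ELM: random input weights W and biases b,
    hidden activations H = tanh(XW + b), output weights by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Predict by projecting through the fixed random hidden layer."""
    return np.tanh(X @ W + b) @ beta
```

Because only a linear system is solved, training is very fast, which is why the paper pairs ELM with an ISM-based dimension reduction rather than a slower iteratively trained network.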

  13. Integrated minicomputer alpha analysis system

    International Nuclear Information System (INIS)

    Vasilik, D.G.; Coy, D.E.; Seamons, M.; Henderson, R.W.; Romero, L.L.; Thomson, D.A.

    1978-01-01

    Approximately 1,000 stack and occupational air samples from plutonium and uranium facilities at LASL are analyzed daily. The concentrations of radionuclides in air are determined by measuring absolute alpha activities of particulates collected on air sample filter media. The Integrated Minicomputer Pulse system (IMPULSE) is an interface between many detectors of extremely simple design and a Digital Equipment Corporation (DEC) PDP-11/04 minicomputer. The detectors are photomultiplier tubes faced with zinc sulfide (ZnS). The average detector background is approximately 0.07 cpm. The IMPULSE system includes two mainframes, each of which can hold up to 64 detectors. The current hardware configuration includes 64 detectors in one mainframe and 40 detectors in the other. Each mainframe contains a minicomputer with 28K words of Random Access Memory. One minicomputer controls the detectors in both mainframes. A second computer was added for fail-safe redundancy and to support other laboratory computer requirements. The main minicomputer includes a dual floppy disk system and a dual DEC 'RK05' disk system for mass storage. The RK05 facilitates report generation and trend analysis. The IMPULSE hardware provides for passage of data from the detectors to the computer, and for passage of status and control information from the computer to the detector stations
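The conversion from filter counts to an airborne activity concentration that such a system performs can be sketched as follows. The parameter names and values are illustrative only, not IMPULSE specifics.

```python
def air_concentration(gross_counts, count_time_min, background_cpm,
                      detector_efficiency, air_volume_m3):
    """Activity concentration (Bq/m^3) from an alpha count of a filter:
    gross counts -> net count rate -> activity -> divide by sampled volume.
    All parameter names are illustrative, not from the IMPULSE system."""
    net_cpm = gross_counts / count_time_min - background_cpm
    activity_bq = (net_cpm / 60.0) / detector_efficiency  # counts/min -> decays/s
    return activity_bq / air_volume_m3
```

For example, 607 gross counts in a 10-minute count against a 0.7 cpm background, with 25% detection efficiency and 2 m^3 of sampled air, corresponds to 2 Bq/m^3. A production system would also correct for self-absorption in the filter and for naturally occurring radon progeny.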

  14. Integrity Analysis of Damaged Steam Generator Tubes

    International Nuclear Information System (INIS)

    Stanic, D.

    1998-01-01

    The variety of degradation mechanisms affecting steam generator tubes makes steam generators one of the critical components in nuclear power plants. Depending on their nature, degradation mechanisms cause different types of damages. This requires extensive integrity analysis in order to assess crack behavior under various operating and accidental conditions. Development and application of advanced eddy current techniques for steam generator examination provide good characterization of found damages. Damage characteristics (shape, orientation and dimensions) may be defined and used for further evaluation of damage influence on tube integrity. In comparison with experimental and analytical methods, numerical methods are also efficient tools for integrity assessment. Application of finite element methods provides relatively simple modeling of different types of damages and simulation of various operating conditions. The stress and strain analysis may be performed for elastic and elasto-plastic states with good ability for visual presentation of results. Furthermore, the fracture mechanics parameters may be calculated. Results obtained by numerical analysis, supplemented with experimental results, are the base for definition of alternative plugging criteria which may significantly reduce the number of plugged tubes. (author)

  15. Efficient strategies for the integration of renewable energy into future energy infrastructures in Europe – An analysis based on transnational modeling and case studies for nine European regions

    International Nuclear Information System (INIS)

    Boie, Inga; Fernandes, Camila; Frías, Pablo; Klobasa, Marian

    2014-01-01

    As a result of the current international climate change strategy, the European Commission has agreed on ambitious targets to reduce CO2 emissions by more than 80% by 2050 compared to 1990 levels and to increase the share of renewable energy and improve energy efficiency by 20% by 2020. Under this framework, renewable energy generation has increased considerably in the EU and it is expected to keep growing in future years. This paper presents long-term strategies for transmission infrastructure development to integrate increasing amounts of renewable generation in the time horizon of 2030–2050. These are part of the outcomes of the SUSPLAN project, which focuses on four possible future renewable deployment scenarios in different European regions taking into account the corresponding infrastructure needs, especially electricity and gas grids, both on the regional and transnational level. The main objective of the project is the development of guidelines for the integration of renewable energy into future energy infrastructures while taking account of national and regional characteristics. Therefore, the analysis is based on a two-track approach: a transnational modeling exercise (“top-down”) and in-depth case studies for nine representative European regions (“bottom-up”). - Highlights: • We present the main outcomes of the SUSPLAN EU project. • It assesses long-term energy infrastructure needs to integrate RES in Europe. • Regional and transnational analyses are performed for 4 RES scenarios until 2050. • Major barriers to the integration of RES into energy infrastructure are identified. • Efficient strategies to mitigate these barriers are proposed

  16. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for Public Data Releases and Outreach

    CERN Document Server

    Abah, Anthony

    2016-01-01

    The project worked on the development of a physics analysis and its software under the ROOT framework and Jupyter notebooks for the ATLAS Outreach and Naples teams. This analysis is created in the context of the release of data and Monte Carlo samples by the ATLAS collaboration. The project focuses on the enhancement of the recent opendata.atlas.cern web platform to be used as an educational resource for university students and new researchers. The generated analysis structure and tutorials will be used to extend the participation of students from other locations around the world. We conclude the project with the creation of a complete notebook representing the so-called W analysis in the C++ language for the mentioned platform.

  17. Ontology-based Vaccine and Drug Adverse Event Representation and Theory-guided Systematic Causal Network Analysis toward Integrative Pharmacovigilance Research.

    Science.gov (United States)

    He, Yongqun

    2016-06-01

    Compared with controlled terminologies (e.g., MedDRA, CTCAE, and WHO-ART), the community-based Ontology of AEs (OAE) has many advantages in adverse event (AE) classification. The OAE-derived Ontology of Vaccine AEs (OVAE) and Ontology of Drug Neuropathy AEs (ODNAE) serve as AE knowledge bases and support data integration and analysis. The Immune Response Gene Network Theory explains molecular mechanisms of vaccine-related AEs. The OneNet Theory of Life treats the whole life of an organism as a single complex and dynamic network (i.e., OneNet). A new "OneNet effectiveness" tenet is proposed here to expand the OneNet theory. Derived from the OneNet theory, the author hypothesizes that one human uses one single genotype-rooted mechanism to respond to different vaccinations and drug treatments, and that experimentally identified mechanisms are manifestations of the OneNet blueprint mechanism under specific conditions. The theories and ontologies interact as semantic frameworks to support integrative pharmacovigilance research.

  18. Integrated Case Based and Rule Based Reasoning for Decision Support

    OpenAIRE

    Eshete, Azeb Bekele

    2009-01-01

    This project is a continuation of my specialization project, which focused on studying theoretical concepts related to the case-based reasoning method, the rule-based reasoning method, and their integration. The integration of rule-based and case-based reasoning methods has shown a substantial improvement in performance over the individual methods. Verdande Technology AS wants to try integrating the rule-based reasoning method with an existing case-based system. This project focu...

  19. Analysis of Food Hub Commerce and Participation Using Agent-Based Modeling: Integrating Financial and Social Drivers.

    Science.gov (United States)

    Krejci, Caroline C; Stone, Richard T; Dorneich, Michael C; Gilbert, Stephen B

    2016-02-01

    Factors influencing long-term viability of an intermediated regional food supply network (food hub) were modeled using agent-based modeling techniques informed by interview data gathered from food hub participants. Previous analyses of food hub dynamics focused primarily on financial drivers rather than social factors and have not used mathematical models. Based on qualitative and quantitative data gathered from 22 customers and 11 vendors at a midwestern food hub, an agent-based model (ABM) was created with distinct consumer personas characterizing the range of consumer priorities. A comparison study determined if the ABM behaved differently than a model based on traditional economic assumptions. Further simulation studies assessed the effect of changes in parameters, such as producer reliability and the consumer profiles, on long-term food hub sustainability. The persona-based ABM model produced different and more resilient results than the more traditional way of modeling consumers. Reduced producer reliability significantly reduced trade; in some instances, a modest reduction in reliability threatened the sustainability of the system. Finally, a modest increase in price-driven consumers at the outset of the simulation quickly resulted in those consumers becoming a majority of the overall customer base. Results suggest that social factors, such as desire to support the community, can be more important than financial factors. An ABM of food hub dynamics, based on human factors data gathered from the field, can be a useful tool for policy decisions. Similar approaches can be used for modeling customer dynamics with other sustainable organizations. © 2015, Human Factors and Ergonomics Society.
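The persona-driven dynamic described above can be illustrated with a minimal agent-based sketch (not the authors' model; the personas, parameters, and update rules here are all hypothetical): price-driven consumers abandon the hub faster after failed deliveries than community-driven ones, so lowering producer reliability depresses total trade.

```python
import random

def simulate_hub(n_consumers=100, weeks=52, reliability=0.9,
                 price_driven_share=0.3, seed=42):
    """Toy ABM: each week a consumer patronizes the hub with a
    propensity ('loyalty') that rises after successful deliveries
    and falls after failures, more sharply for price-driven personas."""
    rng = random.Random(seed)
    personas = ['price' if rng.random() < price_driven_share else 'community'
                for _ in range(n_consumers)]
    loyalty = [1.0] * n_consumers
    trades = 0
    for _ in range(weeks):
        for i, persona in enumerate(personas):
            if rng.random() > loyalty[i]:
                continue                      # consumer skips the hub this week
            if rng.random() < reliability:    # vendor delivers
                trades += 1
                loyalty[i] = min(1.0, loyalty[i] + 0.01)
            else:                             # failed delivery erodes loyalty
                penalty = 0.3 if persona == 'price' else 0.05
                loyalty[i] = max(0.0, loyalty[i] - penalty)
    return trades

high = simulate_hub(reliability=0.95)   # reliable producers
low = simulate_hub(reliability=0.60)    # unreliable producers: less trade
```

Sweeping `reliability` or `price_driven_share` would mimic the paper's sensitivity studies, though the actual model was built from interview-derived personas.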

  20. Northeastern Brazilian margin: Regional tectonic evolution based on integrated analysis of seismic reflection and potential field data and modelling

    Science.gov (United States)

    Blaich, Olav A.; Tsikalas, Filippos; Faleide, Jan Inge

    2008-10-01

    Integration of regional seismic reflection and potential field data along the northeastern Brazilian margin, complemented by crustal-scale gravity modelling, is used to reveal and illustrate onshore-offshore crustal structure correlation, the character of the continent-ocean boundary, and the relationship of crustal structure to regional variation of potential field anomalies. The study reveals distinct along-margin structural and magmatic changes that are spatially related to a number of conjugate Brazil-West Africa transfer systems governing the margin segmentation and evolution. Several conceptual tectonic models are invoked to explain the structural evolution of the different margin segments in a conjugate margin context. Furthermore, the constructed transects, the observed and modelled Moho relief, and the potential field anomalies indicate that the Recôncavo, Tucano and Jatobá rift system may reflect a polyphase rifting mode associated with a complex time-dependent thermal structure of the lithosphere. The constructed transects and available seismic reflection profiles indicate that the northern part of the study area lacks major breakup-related magmatic activity, suggesting a rifted non-volcanic margin affinity. In contrast, the southern part of the study area is characterized by abrupt crustal thinning and evidence for breakup magmatic activity, suggesting that this region evolved, partially, with a rifted volcanic margin affinity and character.

  1. FRET based integrated pyrene-AgNPs system for detection of Hg (II) and pyrene dimer: Applications to environmental analysis

    Science.gov (United States)

    Walekar, Laxman S.; Hu, Peidong; Vafaei Molamahmood, Hamed; Long, Mingce

    2018-06-01

    An integrated system of pyrene and cetyltrimethylammonium bromide (CTAB)-capped silver nanoparticles (AgNPs), with a donor-acceptor distance (r) of 2.78 nm, has been developed for the detection of Hg(II) and the pyrene dimer. The interaction between pyrene and AgNPs results in fluorescence quenching of pyrene due to energy transfer, whose mechanism can be attributed to Förster Resonance Energy Transfer (FRET), as supported by experimental observation and theoretical calculations. The developed probe shows a highly selective and sensitive response toward Hg(II), probably due to amalgam formation, which results in fluorescence recovery (90%) of pyrene and a color change of the solution from yellowish brown to colorless. The addition of Hg(II) presumably increases the distance between pyrene and the AgNPs, switching the system to the 'FRET OFF' state. The system gives a selective response toward Hg(II) over other competing metal ions. Under optimal conditions, the system offers good linearity between 0.1 and 0.6 μg mL⁻¹ with a detection limit of 62 ng mL⁻¹. In addition, the system provides an effective platform for detection of pyrene in its dimer form, even at very low concentrations (10 ng mL⁻¹), on the surface of the AgNPs. Therefore, it could be used as an effective alternative for the simultaneous detection of Hg(II) and pyrene.
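The reported linearity and detection limit come from a standard calibration-curve workflow, which can be sketched as follows (the intensity values are invented for illustration, not the paper's data; the 3.3·σ/slope rule is the common IUPAC-style estimate and may differ from the authors' exact procedure):

```python
import numpy as np

# Hypothetical calibration points within the reported linear range
# (0.1-0.6 ug/mL); fluorescence intensities are illustrative only.
conc = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])          # ug/mL
signal = np.array([12.1, 23.8, 36.2, 47.9, 60.1, 71.8])  # intensity (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)  # linear fit
residuals = signal - (slope * conc + intercept)
sd = residuals.std(ddof=2)       # residual standard deviation of the fit
lod = 3.3 * sd / slope           # detection limit, 3.3*sigma/slope
```

With real fluorescence readings, the same two lines of algebra yield a ng mL⁻¹-scale detection limit of the kind reported above.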

  2. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Full Text Available Abstract Background Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, these are usually limited and integrate functional terms from only a few databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, such gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendia of algal genomic data. Description The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii and will include additional genomes in the future. The site allows users to interpret large gene lists by identifying associated functional terms and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of
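Term-enrichment searches of the kind the tool offers are typically based on a hypergeometric tail probability; a minimal sketch follows (the counts are illustrative, not from the Chlamydomonas annotation database):

```python
from math import comb

def hypergeom_enrichment_p(genome_size, term_genes, list_size, overlap):
    """P(X >= overlap) when drawing `list_size` genes from a genome in
    which `term_genes` carry the annotation term (hypergeometric tail).
    This is the standard enrichment test behind tools like DAVID."""
    return sum(
        comb(term_genes, k) * comb(genome_size - term_genes, list_size - k)
        for k in range(overlap, min(term_genes, list_size) + 1)
    ) / comb(genome_size, list_size)

# Illustrative numbers: 500 of 15,000 genes carry a pathway term,
# and a user's list of 100 genes hits 12 of them (expected ~3.3).
p = hypergeom_enrichment_p(15000, 500, 100, 12)
```

A small `p` flags the term as over-represented in the gene list; real tools additionally correct for testing many terms at once.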

  3. Designing simulator-based training: An approach integrating cognitive task analysis and four-component instructional design

    NARCIS (Netherlands)

    Tjiam, I.M.; Schout, B.M.; Hendrikx, A.J.M.; Scherpbier, A.J.J.A.; Witjes, J.A.; Van Merrienboer, J.J.

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic

  4. Novel ring resonator-based integrated photonic beamformer for broadband phased array receive antennas - part 1: design and performance analysis

    NARCIS (Netherlands)

    Meijerink, Arjan; Roeloffzen, C.G.H.; Meijerink, Roland; Zhuang, L.; Marpaung, D.A.I.; Bentum, Marinus Jan; Burla, M.; Verpoorte, Jaco; Jorna, Pieter; Huizinga, Adriaan; van Etten, Wim

    2010-01-01

    A novel optical beamformer concept is introduced that can be used for seamless control of the reception angle in broadband wireless receivers employing a large phased array antenna (PAA). The core of this beamformer is an optical beamforming network (OBFN), using ring resonator-based broadband

  5. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  6. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T-test results showed that Prosopis africana seed trade is profitable and ...

  7. Universal integrals based on copulas

    Czech Academy of Sciences Publication Activity Database

    Klement, E.P.; Mesiar, Radko; Spizzichino, F.; Stupňanová, A.

    2014-01-01

    Roč. 13, č. 3 (2014), s. 273-286 ISSN 1568-4539 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : capacity * copula * universal integral Subject RIV: BA - General Mathematics Impact factor: 2.163, year: 2014 http://library.utia.cas.cz/separaty/2014/E/mesiar-0432228.pdf

  8. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration adapters are a fundamental part of an integration system, since they provide (business) applications with access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter modeling and the related quality-of-service modeling were left for further studi...

  9. Problems in mathematical analysis III integration

    CERN Document Server

    Kaczor, W J

    2003-01-01

    We learn by doing. We learn mathematics by doing problems. This is the third volume of Problems in Mathematical Analysis. The topic here is integration for real functions of one real variable. The first chapter is devoted to the Riemann and the Riemann-Stieltjes integrals. Chapter 2 deals with Lebesgue measure and integration. The authors include some famous, and some not so famous, integral inequalities related to Riemann integration. Many of the problems for Lebesgue integration concern convergence theorems and the interchange of limits and integrals. The book closes with a section on Fourier series, with a concentration on Fourier coefficients of functions from particular classes and on basic theorems for convergence of Fourier series. The book is primarily geared toward students in analysis, as a study aid, for problem-solving seminars, or for tutorials. It is also an excellent resource for instructors who wish to incorporate problems into their lectures. Solutions for the problems are provided in the boo...

  10. Heterogeneity of the North Atlantic oceanic lithosphere based on integrated analysis of GOCE satellite gravity and geological data

    Science.gov (United States)

    Barantseva, Olga; Artemieva, Irina; Thybo, Hans; Herceg, Matija

    2015-04-01

    We present the results from modelling the gravity and density structure of the upper mantle for the offshore area of the North Atlantic region. The crust and upper mantle of the region are expected to be anomalous: the part of the region affected by the Icelandic plume has an anomalously shallow bathymetry, whereas the northern part of the region is characterized by ultraslow spreading. In order to understand the links between the deep geodynamical processes that control the spreading rate, on one hand, and their manifestations such as ocean floor bathymetry and heat flow, on the other, we model the gravity and density structure of the upper mantle from satellite gravity data. The calculations are based on interpretation of GOCE satellite gravity data for the North Atlantic. To separate the gravity signal responsible for density anomalies within the crust and upper mantle, we subtract the lower harmonics caused by the deep density structure of the Earth (the core and the lower mantle). The gravity effect of the upper mantle is calculated by subtracting the gravity effect of the crust for two crustal models. We use a recent regional seismic model for the crustal structure (Artemieva and Thybo, 2013) based on seismic data together with borehole data for sediments. For comparison, similar results are presented for the global CRUST 1.0 model as well (Laske, 2013). The conversion of seismic velocity data for the crustal structure to crustal density structure is crucial for the final results. We use a combination of Vp-to-density conversion based on published laboratory measurements for the crystalline basement (Ludwig, Nafe, Drake, 1970; Christensen and Mooney, 1995) and, for oceanic sediments and oceanic crust, based on laboratory measurements for serpentinites and gabbros from the Mid-Atlantic Ridge (Kelemen et al., 2004). Also, to overcome the high degree of uncertainty in Vp-to-density conversion, we account for regional tectonic variations in the North Atlantic as

  11. Computerized integrated data base production system (COMPINDAS)

    Energy Technology Data Exchange (ETDEWEB)

    Marek, D; Buerk, K [Fachinformationszentrum Karlsruhe, Gesellschaft fuer Wissenschaftlich-Technische Information mbH, Eggenstein-Leopoldshafen (Germany)

    1990-05-01

    Based on many years of experience, and with the main objective of guaranteeing long-term database quality and efficient input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, and control and update procedures; and support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis from ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines, including masks. (author). 7 figs.

  12. Computerized integrated data base production system (COMPINDAS)

    International Nuclear Information System (INIS)

    Marek, D.; Buerk, K.

    1990-05-01

    Based on many years of experience, and with the main objective of guaranteeing long-term database quality and efficient input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, and control and update procedures; and support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis from ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines, including masks. (author). 7 figs

  13. Microcontroller based Integrated Circuit Tester

    OpenAIRE

    Yousif Taha Yousif Elamin; Abdelrasoul Jabar Alzubaidi

    2015-01-01

    The digital integrated circuit (IC) tester is implemented using the ATmega32 microcontroller. The microcontroller processes the inputs and outputs and displays the results on a Liquid Crystal Display (LCD). The basic function of the digital IC tester is to test a digital IC for correct logical functioning, as described in its truth table and/or function table. The designed model can test digital ICs having 14 pins. Since it is programmable, any number of ICs can be tested. Thi...

  14. Heterogeneity of the North Atlantic oceanic lithosphere based on integrated analysis of GOCE satellite gravity and geological data

    DEFF Research Database (Denmark)

    Barantseva, Olga; Artemieva, Irina; Thybo, Hans

    2015-01-01

    We present the results of modeling of the gravity and density structure of the upper mantle for the off-shore area of the North Atlantic region. The crust and upper mantle of the region is expected to be anomalous: a part of the region affected by the Icelandic plume has an anomalously shallow... ...the gravity and density structure of the upper mantle from satellite gravity data. The calculations are based on interpretation of GOCE gravity satellite data for the North Atlantic. To separate the gravity signal responsible for density anomalies within the crust and upper mantle, we subtract the lower... ...harmonics caused by the deep density structure of the Earth (the core and the lower mantle). The gravity effect of the upper mantle is calculated after subtracting the gravity effect of the crust for two crustal models, including seismic and borehole data on sediments. We use a recent regional seismic model...

  15. Mobile phone-based evaluation of latent tuberculosis infection: Proof of concept for an integrated image capture and analysis system.

    Science.gov (United States)

    Naraghi, Safa; Mutsvangwa, Tinashe; Goliath, René; Rangaka, Molebogeng X; Douglas, Tania S

    2018-05-08

    The tuberculin skin test is the most widely used method for detecting latent tuberculosis infection in adults and active tuberculosis in children. We present the development of a mobile phone-based screening tool for measuring the tuberculin skin test induration. The tool makes use of a mobile application developed on the Android platform to capture images of an induration, photogrammetric reconstruction using Agisoft PhotoScan to reconstruct the induration in 3D, and 3D measurement of the induration with the aid of functions from the Python programming language. The system enables capture of images by the person being screened for latent tuberculosis infection. Measurement precision was tested using a 3D-printed induration. Real-world use of the tool was simulated by application to a set of mock skin indurations created by a make-up artist, and the performance of the tool was evaluated. The usability of the application was assessed with the aid of a questionnaire completed by participants. The tool was found to measure the 3D-printed induration with greater precision than the current ruler-and-pen method, as indicated by the lower standard deviation produced (0.3 mm versus 1.1 mm in the literature). There was high correlation between manual and algorithmic measurement of mock skin indurations. The height of the skin induration and the definition of its margins were found to influence the accuracy of 3D reconstruction, and therefore the measurement error, under simulated real-world conditions. Based on assessment of the user experience in capturing images, a simplified user interface would benefit wide-spread implementation. The mobile application shows good agreement with direct measurement. It provides an alternative method for measuring tuberculin skin test indurations and may remove the need for an in-person follow-up visit after test administration, thus improving latent tuberculosis infection screening throughput. Copyright © 2018 Elsevier Ltd.

  16. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process that takes experienced people from various disciplines and focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  17. Integrated logistic support analysis system

    International Nuclear Information System (INIS)

    Carnicero Iniguez, E.J.; Garcia de la Sen, R.

    1993-01-01

    Integrating logistic support into a system results in a large volume of information that has to be managed, which can only be achieved with the help of computer applications. Both past experience and growing needs in such tasks have led Empresarios Agrupados to undertake an ambitious development project, which is described in this paper. (author)

  18. Analysis of integrated energy systems

    International Nuclear Information System (INIS)

    Matsuhashi, Takaharu; Kaya, Yoichi; Komiyama, Hiroshi; Hayashi, Taketo; Yasukawa, Shigeru.

    1988-01-01

    World attention is now attracted to the concept of the Novel Horizontally Integrated Energy System (NHIES). In NHIES, all fossil fuels are first converted into CO and H2. Potential environmental contaminants such as sulfur are removed during this process. CO turbines are mainly used to generate electric power. Combustion is performed in pure oxygen produced through air separation, making it possible to completely prevent the formation of thermal NOx. Thus, NHIES would release very small amounts of the substances that contribute to acid rain. In this system, the intermediate energy sources CO, H2 and O2 are integrated horizontally. They are combined appropriately to produce a specific form of final energy source. The integration of intermediate energy sources can provide a wide variety of final energy sources, allowing any type of fossil fuel to serve as an alternative to other types of fossil fuel. Another feature of NHIES is the positive use of nuclear fuel to reduce the formation of CO2. Studies are under way in Japan to develop a new concept of integrated energy system. These studies are especially aimed at decreased overall efficiency and the introduction of new liquid fuels that are high in conversion efficiency. Considerations are made on the final form of energy source, robust control, acid fallout, and CO2 reduction. (Nogami, K.)

  19. A meta-analysis of human embryonic stem cells transcriptome integrated into a web-based expression atlas.

    Science.gov (United States)

    Assou, Said; Le Carrour, Tanguy; Tondeur, Sylvie; Ström, Susanne; Gabelle, Audrey; Marty, Sophie; Nadal, Laure; Pantesco, Véronique; Réme, Thierry; Hugnot, Jean-Philippe; Gasca, Stéphan; Hovatta, Outi; Hamamah, Samir; Klein, Bernard; De Vos, John

    2007-04-01

    Microarray technology provides a unique opportunity to examine gene expression patterns in human embryonic stem cells (hESCs). We performed a meta-analysis of 38 original studies reporting on the transcriptome of hESCs. We determined that 1,076 genes were found to be overexpressed in hESCs by at least three studies when compared to differentiated cell types, thus composing a "consensus hESC gene list." Only one gene was reported by all studies: the homeodomain transcription factor POU5F1/OCT3/4. The list comprised other genes critical for pluripotency such as the transcription factors NANOG and SOX2, and the growth factors TDGF1/CRIPTO and Galanin. We show that CD24 and SEMA6A, two cell surface protein-coding genes from the top of the consensus hESC gene list, display strong and specific membrane protein expression on hESCs. Moreover, CD24 labeling permits the purification by flow cytometry of hESCs cocultured on human fibroblasts. The consensus hESC gene list also included the FZD7 WNT receptor, the G protein-coupled receptor GPR19, and the HELLS helicase, which could play an important role in hESC biology. Conversely, we identified 783 genes downregulated in hESCs and reported in at least three studies. This "consensus differentiation gene list" included the IL6ST/GP130 LIF receptor. We created an online hESC expression atlas, http://amazonia.montp.inserm.fr, to provide easy access to this public transcriptome dataset. Expression histograms comparing hESCs to a broad collection of fetal and adult tissues can be retrieved with this web tool for more than 15,000 genes.
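The "consensus list" criterion (reported by at least three studies) is a simple counting operation over per-study gene sets; a sketch with invented inputs (the real input is the 38 studies' overexpressed-gene lists):

```python
from collections import Counter

# Hypothetical per-study sets of genes reported as overexpressed in hESCs.
studies = [
    {"POU5F1", "NANOG", "SOX2", "CD24"},
    {"POU5F1", "NANOG", "TDGF1"},
    {"POU5F1", "SOX2", "CD24", "FZD7"},
    {"POU5F1", "NANOG", "SOX2", "CD24"},
]

# Count in how many studies each gene appears.
counts = Counter(g for study in studies for g in study)

# "Consensus" = reported by at least three studies (the paper's criterion).
consensus = sorted(g for g, n in counts.items() if n >= 3)
# Genes reported by every study (POU5F1/OCT3/4 in the actual meta-analysis).
unanimous = sorted(g for g, n in counts.items() if n == len(studies))
```

The same counting, run over the downregulated lists, yields the "consensus differentiation gene list" described above.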

  20. An Association Rule Based Method to Integrate Metro-Public Bicycle Smart Card Data for Trip Chain Analysis

    Directory of Open Access Journals (Sweden)

    De Zhao

    2018-01-01

    Full Text Available Smart card data provide valuable insights and massive samples for enhancing the understanding of transfer behavior between metro and public bicycle systems. However, smart cards for metro and public bicycle are often issued and managed by independent companies, and this results in the same commuter having different identity tags in the metro and public bicycle smart card systems. The primary objective of this study is to develop a data fusion methodology for matching metro and public bicycle smart cards belonging to the same commuter using historical smart card data. A novel method using association rules to match the data derived from the two systems is proposed and validated. The results showed that our proposed method successfully matched 573 pairs of smart cards with an accuracy of 100%. We also validated the association rules method through visualization of individual metro and public bicycle trips. Based on the matched cards, interesting findings on metro-bicycle transfer have been derived, including the spatial pattern of the public bicycle as a first/last mile solution as well as the duration of a metro trip chain.
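The card-matching idea can be sketched as mining frequent metro-exit-to-bicycle-rental co-occurrences: if the same bicycle card is repeatedly rented at the same station shortly after the same metro card exits, the pair likely belongs to one commuter. The events, stations, thresholds, and card IDs below are invented for illustration, and the paper's actual rule-mining procedure may differ:

```python
from collections import defaultdict

# Hypothetical event logs: (card_id, station, day, minute_of_day).
metro_exits = [
    ("M1", "S5", 1, 510), ("M1", "S5", 2, 512), ("M1", "S5", 3, 508),
    ("M2", "S9", 1, 540), ("M2", "S9", 2, 543),
]
bike_rentals = [
    ("B7", "S5", 1, 515), ("B7", "S5", 2, 518), ("B7", "S5", 3, 512),
    ("B3", "S9", 1, 548), ("B3", "S9", 2, 551),
    ("B9", "S5", 1, 700),                      # unrelated rental
]

def match_cards(metro, bike, window=10, min_support=3):
    """Count metro->bike transfer events (same station and day, rental
    within `window` minutes after the exit) and keep card pairs whose
    co-occurrence count reaches `min_support`."""
    support = defaultdict(int)
    for m_id, m_st, m_day, m_t in metro:
        for b_id, b_st, b_day, b_t in bike:
            if m_st == b_st and m_day == b_day and 0 <= b_t - m_t <= window:
                support[(m_id, b_id)] += 1
    return {pair: n for pair, n in support.items() if n >= min_support}

pairs = match_cards(metro_exits, bike_rentals)
```

Here only the (M1, B7) pair recurs often enough to be matched; the (M2, B3) pair co-occurs but falls below the support threshold.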

  1. Carbon carry capacity and carbon sequestration potential in China based on an integrated analysis of mature forest biomass.

    Science.gov (United States)

    Liu, YingChun; Yu, GuiRui; Wang, QiuFeng; Zhang, YangJian; Xu, ZeHong

    2014-12-01

    Forests play an important role as a carbon sink in terrestrial ecosystems. Although global forests have a huge carbon carrying capacity (CCC) and carbon sequestration potential (CSP), there have been few quantitative reports for Chinese forests. We collected and compiled a forest biomass dataset of China, totaling 5841 sites, based on forest inventory and literature search results. From the dataset we extracted 338 sites with forests aged over 80 years, a threshold for defining mature forest, to establish a mature forest biomass dataset. After analyzing the spatial pattern of the carbon density of Chinese mature forests and its controlling factors, we used the carbon density of mature forests as the reference level and conservatively estimated the CCC of the forests in China by the interpolation methods of Regression Kriging, Inverse Distance Weighting and Partial Thin Plate Smoothing Spline. Combining this with the sixth National Forest Resources Inventory, we also estimated the forest CSP. The results revealed positive relationships between the carbon density of mature forests and temperature, precipitation and stand age, and the horizontal and elevational patterns of carbon density of mature forests can be well predicted by temperature and precipitation. The total CCC and CSP of the existing forests are 19.87 and 13.86 Pg C, respectively. Subtropical forests would have more CCC and CSP than other biomes. Consequently, relying on forests to take up carbon by reducing disturbance would be an alternative approach for mitigating greenhouse gas concentrations, besides afforestation and reforestation.
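Of the three interpolation methods mentioned, Inverse Distance Weighting is the simplest to sketch: the value at an unsampled location is a weighted mean of nearby samples, with weights falling off as distance grows. The coordinates and carbon densities below are illustrative, not the study's site data:

```python
# Illustrative mature-forest sample sites: (x, y, carbon density in Mg C/ha).
sites = [
    (0.0, 0.0, 80.0),
    (1.0, 0.0, 120.0),
    (0.0, 1.0, 100.0),
    (1.0, 1.0, 140.0),
]

def idw(x, y, samples, power=2):
    """Inverse-distance-weighted estimate at (x, y): weight each sample
    by 1/d^power and return the weighted mean."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return value              # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

estimate = idw(0.5, 0.5, sites)       # equidistant from all four sites
```

Summing such per-cell estimates over a gridded map of China is, in outline, how a reference-level carbon density surface is turned into a total CCC figure.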

  2. Altered integrity of the right arcuate fasciculus as a trait marker of schizophrenia: a sibling study using tractography-based analysis of the whole brain.

    Science.gov (United States)

    Wu, Chen-Hao; Hwang, Tzung-Jeng; Chen, Yu-Jen; Hsu, Yun-Chin; Lo, Yu-Chun; Liu, Chih-Min; Hwu, Hai-Gwo; Liu, Chen-Chung; Hsieh, Ming H; Chien, Yi Ling; Chen, Chung-Ming; Tseng, Wen-Yih Isaac

    2015-03-01

    Trait markers of schizophrenia aid the dissection of heterogeneous phenotypes into distinct subtypes and facilitate study of the genetic underpinnings of the disease. The microstructural integrity of white matter tracts could serve as such a trait marker, and tractography-based analysis (TBA) is the current method of choice. Manual tractography is time-consuming and limits the analysis to preselected fiber tracts. Here, we sought to identify a trait marker of schizophrenia from among 74 fiber tracts across the whole brain using a novel automatic TBA method. Thirty-one patients with schizophrenia, 31 unaffected siblings and 31 healthy controls were recruited to undergo diffusion spectrum magnetic resonance imaging at 3T. Generalized fractional anisotropy (GFA), an index reflecting tract integrity, was computed for each tract and compared among the three groups. Ten tracts were found to exhibit significant differences between the groups with a linear, stepwise order from controls to siblings to patients; they included the right arcuate fasciculus, bilateral fornices, bilateral auditory tracts, left optic radiation, the genu of the corpus callosum, and the corpus callosum to the bilateral dorsolateral prefrontal cortices, bilateral temporal poles, and bilateral hippocampi. Post-hoc between-group analyses revealed that the GFA of the right arcuate fasciculus was significantly decreased in both the patients and the unaffected siblings compared to the controls. Furthermore, the GFA of the right arcuate fasciculus exhibited a trend-level association with positive symptom scores. In conclusion, the right arcuate fasciculus may be a candidate trait marker and deserves further study to verify any genetic association. © 2014 Wiley Periodicals, Inc.

  3. Analysis of Optimal Operation of an Energy Integrated Distillation Plant

    DEFF Research Database (Denmark)

    Li, Hong Wen; Hansen, C.A.; Gani, Rafiqul

    2003-01-01

    The efficiency of manufacturing systems can be significantly increased through diligent application of control based on mathematical models, thereby enabling tighter integration of decision making with systems operation. In the present paper, analysis of optimal operation of an energy integrated

  4. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across human society a decade ago. They pose a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of persons with matched information), which requires integrative analysis in biology and biomedicine and calls for the development of data integration to address the shift from population-guided to individual-guided investigations. Data integration is an effective concept for solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches to biological data have two modes: one is the "bottom-up integration" mode with follow-up manual integration, and the other is the "top-down integration" mode with follow-up in silico integration. This paper first summarizes the combinatory analysis approaches to give a candidate protocol for biological experiment design for effective integrative studies on genomics, and then surveys the data fusion approaches to give helpful instruction on computational model development for the detection of biological significance; these approaches have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, problems and future directions are highlighted for the integrative analysis of omics big data.

  5. Containment integrity analysis under accidents

    International Nuclear Information System (INIS)

    Lin Chengge; Zhao Ruichang; Liu Zhitao

    2010-01-01

    Containment integrity analyses for current nuclear power plants (NPPs) mainly focus on the internal pressure caused by design basis accidents (DBAs). For the AP1000 NPP, in addition to analyses of the containment pressure response caused by DBAs, the behavior of the containment during severe accidents (SAs) is also evaluated. Since conservatism remains in the assumptions, boundary conditions and codes, the results of containment integrity analyses may carry excessive margin. As knowledge of the phenomena and processes of the relevant accidents improves, this excess margin can be appropriately reduced by using best-estimate codes combined with uncertainty methods, which could benefit the containment design and construction of large passive plants (LPP) in China. (authors)

  6. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require representations of systems, analysis of fault trees, and evaluation of solutions to these, and the results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of a design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  7. Agent-based enterprise integration

    Energy Technology Data Exchange (ETDEWEB)

    N. M. Berry; C. M. Pancerella

    1998-12-01

    The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user; they are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  8. Development of an integrated data acquisition and handling system based on digital time series analysis for the measurement of plasma fluctuations

    International Nuclear Information System (INIS)

    Ghayspoor, R.; Roth, J.R.

    1986-01-01

    The nonlinear characteristics of data obtained by many plasma diagnostic systems require the power of modern computers for on-line data processing and reduction. The objective of this work is to develop an integrated data acquisition and handling system based on digital time series analysis techniques. These techniques make it possible to investigate the nature of plasma fluctuations and the physical processes which give rise to them. The approach is to digitize the data and to generate various spectra by means of Fast Fourier Transforms (FFT). Of particular interest are the computer-generated auto-power spectrum, cross-power spectrum, phase spectrum, and squared coherency spectrum. Software programs based on those developed by Jae Y. Hong at the University of Texas are utilized for these spectra. The LeCroy 3500-SA signal analyzer and VAX 11/780 are used as the data handling and reduction system in this work. In this report, the software required to link these two systems is described
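The four spectra mentioned can be sketched with segment-averaged FFTs in NumPy; this is a simplified stand-in for the cited software, and the signals and segment count are illustrative assumptions.

```python
import numpy as np

def spectra(x, y, nseg=8):
    """Segment-averaged auto-power, cross-power, phase and squared-coherency
    spectra (averaging over segments is what makes coherency meaningful)."""
    n = len(x) // nseg
    pxx = pyy = pxy = 0.0
    for k in range(nseg):
        fx = np.fft.rfft(x[k * n:(k + 1) * n])
        fy = np.fft.rfft(y[k * n:(k + 1) * n])
        pxx = pxx + np.abs(fx) ** 2
        pyy = pyy + np.abs(fy) ** 2
        pxy = pxy + fx * np.conj(fy)
    coh2 = np.abs(pxy) ** 2 / (pxx * pyy)  # squared coherency, in [0, 1]
    return pxx, pyy, np.angle(pxy), coh2

# Two noisy signals sharing a common oscillation (period 64 samples)
rng = np.random.default_rng(0)
t = np.arange(4096)
common = np.sin(2 * np.pi * t / 64)
x = common + 0.1 * rng.standard_normal(t.size)
y = common + 0.1 * rng.standard_normal(t.size)
pxx, pyy, phase, coh2 = spectra(x, y)
# The shared tone falls in bin 8 of a 512-sample segment (512/64 = 8),
# where the squared coherency approaches 1.
```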

  9. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer-platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex-terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  10. Spatial Integration Analysis of Provincial Historical and Cultural Heritage Resources Based on Geographic Information System (gis) — a Case Study of Spatial Integration Analysis of Historical and Cultural Heritage Resources in Zhejiang Province

    Science.gov (United States)

    Luo, W.; Zhang, J.; Wu, Q.; Chen, J.; Huo, X.; Zhang, J.; Zhang, Y.; Wang, T.

    2017-08-01

    In China, historical and cultural heritage resources include historically and culturally famous cities, towns, villages, blocks, immovable cultural relics and scenic spots with cultural connotations. The spatial distribution of these resources is closely connected to regional physical geography, historical development and historical traffic geography, and has high research value. Meanwhile, the exhibition and use of these resources are greatly influenced by traffic, tourism and other plans at the provincial level, so it is of great practical significance to offer proposals, on traffic and the like, that benefit the exhibition of heritage resources based on research into province-wide distribution laws. This paper takes the spatial analysis of Geographic Information System (GIS) as the basic technological means and all historical and cultural resources in China's Zhejiang Province as research objects, and identifies the spatial accumulation areas and accumulation belts of Zhejiang Province's historic cities and cultural resources through overlay analysis and density analysis. It then discusses the reasons for the formation of these accumulation areas and belts by combining analyses of physical geography and historical geography, and finally, linking the tourism planning and traffic planning at the provincial level, it provides suggestions on the exhibition and use of the accumulation areas and belts of historic cities and cultural resources.

  11. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; in this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  12. Direct analysis of δ2H and δ18O in natural and enriched human urine using laser-based, Off-Axis Integrated Cavity Output Spectroscopy

    Science.gov (United States)

    Berman, Elena S.F.; Fortson, Susan L.; Snaith, Steven P.; Gupta, Manish; Baer, Douglas S.; Chery, Isabelle; Blanc, Stephane; Melanson, Edward L.; Thomson, Peter J.; Speakman, John R.

    2012-01-01

    The stable isotopes of hydrogen (δ2H) and oxygen (δ18O) in human urine are measured during studies of total energy expenditure by the doubly labeled water method, measurement of total body water, and measurement of insulin resistance by glucose disposal among other applications. An ultrasensitive laser absorption spectrometer based on off-axis integrated cavity output spectroscopy was demonstrated for simple and inexpensive measurement of stable isotopes in natural isotopic abundance and isotopically enriched human urine. Preparation of urine for analysis was simple and rapid (approx. 25 samples per hour), requiring no decolorizing or distillation steps. Analysis schemes were demonstrated to address sample-to-sample memory while still allowing analysis of 45 natural or 30 enriched urine samples per day. The instrument was linear over a wide range of water isotopes (δ2H = −454 to +1702 ‰ and δ18O = −58.3 to +265 ‰). Measurements of human urine were precise to better than 0.65 ‰ 1σ for δ2H and 0.09 ‰ 1σ for δ18O for natural urines, 1.1 ‰ 1σ for δ2H and 0.13 ‰ 1σ for δ18O for low enriched urines, and 1.0 ‰ 1σ for δ2H and 0.08 ‰ 1σ for δ18O for high enriched urines. Furthermore, the accuracy of the isotope measurements of human urines was verified to better than ±0.81 ‰ in δ2H and ±0.13 ‰ in δ18O (average deviation) against three independent IRMS laboratories. The ability to immediately and inexpensively measure the stable isotopes of water in human urine is expected to increase the number and variety of experiments which can be undertaken. PMID:23075099

  13. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Part 7: Cloud-Based Support; Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realise the value of VE. This paper describes a cloud-based data integration framework that can be used for supporting VE to discover, explore and respond more emerging ...

  14. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
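A minimal sketch of the idea, under the simplifying assumption that each variable gets its own Box-Cox transform chosen by profile likelihood before an ordinary PCA; the paper's method integrates the two steps more tightly, and the data here are simulated.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox power transform of a positive array."""
    return np.log(x) if lam == 0 else (x ** lam - 1) / lam

def boxcox_mle(x, lams=np.round(np.linspace(-2, 2, 81), 2)):
    """Grid search for the lambda maximizing the Box-Cox profile
    log-likelihood of one variable."""
    n = len(x)
    loglik = [-n / 2 * np.log(boxcox(x, lam).var()) + (lam - 1) * np.log(x).sum()
              for lam in lams]
    return lams[int(np.argmax(loglik))]

# Skewed (log-normal) toy data: the MLE lambda should land near 0 (log)
rng = np.random.default_rng(1)
data = np.exp(rng.standard_normal((500, 3)))

# Transform each column, standardize, then do PCA via eigendecomposition
transformed = np.column_stack(
    [boxcox(data[:, j], boxcox_mle(data[:, j])) for j in range(data.shape[1])]
)
std = (transformed - transformed.mean(0)) / transformed.std(0)
eigvals = np.linalg.eigvalsh(np.cov(std, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()  # variance share per principal component
```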

  15. Integrating Mainframe Data Bases on a Microcomputer

    OpenAIRE

    Marciniak, Thomas A.

    1985-01-01

    Microcomputers support user-friendly software for interrogating their resident data bases. Many medical data bases currently consist of files on less accessible mainframe computers with more limited inquiry capabilities. We discuss the transferring and integrating of mainframe data into microcomputer data base systems in one medical environment.

  16. Integrative cluster analysis in bioinformatics

    CERN Document Server

    Abu-Jamous, Basel; Nandi, Asoke K

    2015-01-01

    Clustering techniques are increasingly being put to use in the analysis of high-throughput biological datasets. Novel computational techniques to analyse high throughput data in the form of sequences, gene and protein expressions, pathways, and images are becoming vital for understanding diseases and future drug discovery. This book details the complete pathway of cluster analysis, from the basics of molecular biology to the generation of biological knowledge. The book also presents the latest clustering methods and clustering validation, thereby offering the reader a comprehensive review o

  17. Integral data analysis for resonance parameters determination

    International Nuclear Information System (INIS)

    Larson, N.M.; Leal, L.C.; Derrien, H.

    1997-09-01

    Neutron time-of-flight experiments have long been used to determine resonance parameters. Those resonance parameters have then been used in calculations of integral quantities such as Maxwellian averages or resonance integrals, and the results of those calculations in turn have been used as a criterion for the acceptability of the resonance analysis. However, the calculations were inadequate because covariances on the parameter values were not included. In this report an effort to correct that deficiency is documented: the R-matrix analysis code SAMMY has been modified (1) to include integral quantities of importance directly within the resonance parameter analysis, and (2) to determine the best fit to both differential (microscopic) and integral (macroscopic) data simultaneously. This modification was implemented because it is expected to have an impact on the intermediate-energy range that is important for criticality safety applications.

  18. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin; Ulrich, Thomas; Groth, Katrina; Smith, Curtis

    2016-01-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:

    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response

    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  19. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises collection of drill core data, karst cave stochastic model generation, SLIDE simulation and bisection-method optimization. Borehole investigations are performed, and the statistical results show that the length of the karst caves fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and the acceptance-rejection method are used to reproduce the lengths of the karst caves and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst caves on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
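The inverse transform step for the negative exponential cave-length model can be sketched as follows; the 2.5 m mean length is an assumed illustration, not a statistic from the mine data.

```python
import math
import random

def sample_cave_lengths(mean_length, n, seed=42):
    """Inverse transform sampling from a negative exponential distribution:
    if U ~ Uniform(0, 1), then -mean * ln(1 - U) is exponential with that mean."""
    rng = random.Random(seed)
    return [-mean_length * math.log(1.0 - rng.random()) for _ in range(n)]

lengths = sample_cave_lengths(2.5, 10000)
print(sum(lengths) / len(lengths))  # sample mean, close to 2.5
```

The same uniform-to-target mapping works for any distribution with an invertible cumulative distribution function, which is why the (non-standard) carbonatite lengths need the acceptance-rejection method instead.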

  20. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rasmussen, Martin [Norwegian Univ. of Science and Technology, Trondheim (Norway). Social Research; Herberger, Sarah [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:

    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response

    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  1. Analysis and performance assessment of a new solar-based multigeneration system integrated with ammonia fuel cell and solid oxide fuel cell-gas turbine combined cycle

    Science.gov (United States)

    Siddiqui, Osamah; Dincer, Ibrahim

    2017-12-01

    In the present study, a new solar-based multigeneration system integrated with an ammonia fuel cell and a solid oxide fuel cell-gas turbine combined cycle to produce electricity, hydrogen, cooling and hot water is developed for analysis and performance assessment. In this regard, thermodynamic analyses and modeling through both energy and exergy approaches are employed to assess and evaluate the overall system performance. Various parametric studies are conducted to study the effects of varying system parameters and operating conditions on the energy and exergy efficiencies. The results of this study show that the overall multigeneration system energy efficiency is 39.1%, while the overall system exergy efficiency is 38.7%. The multigeneration system yields an increase of 19.3% in energy efficiency compared to the single generation system. Furthermore, the exergy efficiency of the multigeneration system is 17.8% higher than that of the single generation system. Moreover, the energy and exergy efficiencies of the solid oxide fuel cell-gas turbine combined cycle are determined to be 68.5% and 55.9%, respectively.

  2. Learning Behavior and Achievement Analysis of a Digital Game-Based Learning Approach Integrating Mastery Learning Theory and Different Feedback Models

    Science.gov (United States)

    Yang, Kai-Hsiang

    2017-01-01

    It is widely accepted that the digital game-based learning approach has the advantage of stimulating students' learning motivation, but simply using digital games in the classroom does not guarantee satisfactory learning achievement, especially in the absence of a teacher. Integrating appropriate learning strategies into a game can…

  3. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    Science.gov (United States)

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, with the weights of indicators determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where the Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have high risk values in terms of industrial discharge; comparatively, Danjiangkou City and Yunxian County would have high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir were higher than those in the south. The risk levels indicated that five sources were at the lower risk level (level II), two at the moderate risk level (level III), one at the higher risk level (level IV) and three at the highest risk level (level V). Risks of industrial discharge were also higher than those of the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
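The entropy weight step can be sketched as follows: indicators that vary little across the pollution sources carry little information and receive low weight. The 3×2 risk matrix below is hypothetical, not the study's data.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: indicators that vary more across the rows
    (alternatives) receive larger weights. matrix[i][j] > 0 is the value
    of indicator j for alternative i."""
    n, m = len(matrix), len(matrix[0])
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        entropy = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergence.append(1.0 - entropy)  # degree of divergence of indicator j
    d_sum = sum(divergence)
    return [d / d_sum for d in divergence]

# Hypothetical 3 sources x 2 indicators: the constant first indicator
# carries no information, so all weight goes to the second
w = entropy_weights([[1.0, 5.0], [1.0, 1.0], [1.0, 3.0]])
print(w)
```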

  4. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond the genomic analysis, and to develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10⁻¹¹ M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.

  5. Integrated application of river water quality modelling and cost-benefit analysis to optimize the environmental economical value based on various aquatic waste load reduction strategies

    Science.gov (United States)

    Wu, Chen-Yu; Fan, Chihhao

    2017-04-01

    To assure river water quality, the Taiwan government has established many pollution control strategies and made substantial monetary investments. Despite all these efforts, many rivers still suffer from severe pollution because of massive discharges of domestic and industrial wastewater without proper treatment. A comprehensive evaluation tool is therefore required to assess the suitability of water pollution control strategies. The purpose of this study is to quantify the potential strategic benefits by applying water quality modelling integrated with cost-benefit analysis to simulate scenarios based on regional development planning. The Erhjen Creek is selected as the study example because it is a major river in southern Taiwan, and its riverine environment strongly affects the neighboring population. For strategy assessment, we established a QUAL2K model of Erhjen Creek and conducted cost-benefit analyses according to the proposed strategies. In the water quality simulation, HEC-RAS was employed to calculate the hydraulic parameters and the dilution impact of the tidal effect in the downstream section. Daily pollution loadings were obtained from the Water Pollution Control Information System maintained by the Taiwan EPA, and the wastewater delivery ratios were calculated by comparing the occurrence of pollution loadings with the monitoring data. In the cost-benefit analysis, we adopted the market valuation method, setting an analysis period of 65 years and a discount rate of 2.59%. Capital investments were the costs of design, construction, operation and maintenance for each project in the Erhjen Creek catchment. In model calibration and verification, the mean absolute percentage errors (MAPEs) were calculated to be 21.4% and 25.5%, respectively, which met the prescribed acceptable criterion of 50%. This model was applied to simulate water quality under various pollution control policies and engineering projects in the Erhjen Creek. The overall
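
    The two quantitative yardsticks in this abstract, MAPE for calibration/verification and a discounted cost-benefit comparison, are simple to state explicitly. A hedged Python sketch (the concentrations and cash flows are invented; only the 50% MAPE criterion, 65-year horizon and 2.59% discount rate come from the study):

```python
def mape(observed, simulated):
    """Mean absolute percentage error between monitoring data
    and model output (calibration/verification screening)."""
    return 100.0 / len(observed) * sum(
        abs((o - s) / o) for o, s in zip(observed, simulated))

def npv(annual_benefit, annual_cost, rate=0.0259, years=65):
    """Net present value over the study's 65-year horizon at its
    2.59% discount rate."""
    return sum((annual_benefit - annual_cost) / (1.0 + rate) ** t
               for t in range(1, years + 1))

# Invented BOD concentrations (mg/L): monitored vs. simulated.
obs = [4.2, 5.1, 3.8, 6.0]
sim = [4.6, 4.4, 4.1, 5.2]
error = mape(obs, sim)
value = npv(annual_benefit=1.2e6, annual_cost=0.8e6)
print(round(error, 1), round(value))
```

    A strategy passes the calibration screen when MAPE stays under the 50% criterion, and is economically attractive when the discounted benefit stream exceeds the capital and operating costs.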

  6. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001.

  7. Analysis of gene expression profiles of soft tissue sarcoma using a combination of knowledge-based filtering with integration of multiple statistics.

    Directory of Open Access Journals (Sweden)

    Anna Takahashi

    Full Text Available The diagnosis and treatment of soft tissue sarcomas (STS) have been difficult. Of the diverse histological subtypes, undifferentiated pleomorphic sarcoma (UPS) is particularly difficult to diagnose accurately, and its classification per se is still controversial. Recent advances in genomic technologies provide an excellent way to address such problems. However, it is often difficult, if not impossible, to identify definitive disease-associated genes using genome-wide analysis alone, primarily because of multiple testing problems. In the present study, we analyzed microarray data from 88 STS patients using a combination method that used knowledge-based filtering and a simulation based on the integration of multiple statistics to reduce multiple testing problems. We identified 25 genes, including hypoxia-related genes (e.g., MIF, SCD1, P4HA1, ENO1, and STAT1) and cell cycle- and DNA repair-related genes (e.g., TACC3, PRDX1, PRKDC, and H2AFY). These genes showed significant differential expression among histological subtypes, including UPS, and showed associations with overall survival. STAT1 showed a strong association with overall survival in UPS patients (log-rank p = 1.84 × 10^-6 and adjusted p-value 2.99 × 10^-3 after the permutation test). According to the literature, the 25 genes selected are useful not only as markers of differential diagnosis but also as prognostic/predictive markers and/or therapeutic targets for STS. Our combination method can identify genes that are potential prognostic/predictive factors and/or therapeutic targets in STS and possibly in other cancers. These disease-associated genes deserve further preclinical and clinical validation.
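
    The "adjusted p-value after the permutation test" can be sketched generically: shuffle the group labels, recompute the statistic, and report the fraction of shuffles at least as extreme as the observed value. The statistic below is a toy mean difference, not the paper's integrated multi-statistic, and all data are invented:

```python
import random

def mean_diff(labels, values):
    """Toy statistic: absolute difference of group means."""
    g0 = [v for l, v in zip(labels, values) if l == 0]
    g1 = [v for l, v in zip(labels, values) if l == 1]
    return abs(sum(g0) / len(g0) - sum(g1) / len(g1))

def permutation_p(stat_fn, labels, values, n_perm=2000, seed=0):
    """Fraction of label shufflings whose statistic is at least as
    extreme as the observed one (with the +1 small-sample correction)."""
    rng = random.Random(seed)
    observed = stat_fn(labels, values)
    hits = 0
    for _ in range(n_perm):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        if stat_fn(shuffled, values) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

labels = [0] * 8 + [1] * 8                         # two invented subtypes
values = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1,  # "expression" group 0
          2.0, 2.2, 1.9, 2.1, 2.0, 2.4, 1.8, 2.2]  # "expression" group 1
p = permutation_p(mean_diff, labels, values)
print(f"{p:.4f}")
```

    Because the permutation distribution is built from the data themselves, this adjustment controls for the dependence structure that plain multiple-testing corrections ignore.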

  8. Integrability of dynamical systems algebra and analysis

    CERN Document Server

    Zhang, Xiang

    2017-01-01

    This is the first book to systematically state the fundamental theory of integrability and its development for ordinary differential equations, with emphasis on the Darboux theory of integrability and local integrability together with their applications. It summarizes the classical results of Darboux integrability and its modern development together with the related Darboux polynomials and their applications in the reduction of Liouville and elementary integrability and in the center-focus problem, the weakened Hilbert 16th problem on algebraic limit cycles and the global dynamical analysis of some realistic models in fields such as physics, mechanics and biology. Although it can be used as a textbook for graduate students in dynamical systems, it is intended as supplementary reading for graduate students from mathematics, physics, mechanics and engineering in courses related to the qualitative theory, bifurcation theory and the theory of integrability of dynamical systems.

  9. Strategic Analysis of Technology Integration at Allstream

    OpenAIRE

    Brown, Jeff

    2011-01-01

    Innovation has been defined as the combination of invention and commercialization. Invention without commercialization is rarely, if ever, profitable. For the purposes of this paper the definition of innovation will be further expanded into the concept of technology integration. Successful technology integration not only includes new technology introduction, but also the operationalization of the new technology within each business unit of the enterprise. This paper conducts an analysis of Al...

  10. IMP: Integrated method for power analysis

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    An integrated, easy to use, economical package of microcomputer programs has been developed which can be used by small hydro developers to evaluate potential sites for small scale hydroelectric plants in British Columbia. The programs enable evaluation of sites located far from the nearest stream gauging station, for which streamflow data are not available. For each of the province's 6 hydrologic regions, a streamflow record for one small watershed is provided in the data base. The program can then be used to generate synthetic streamflow records and to compare results obtained by the modelling procedure with the actual data. The program can also be used to explore the significance of modelling parameters and to develop a detailed appreciation for the accuracy which can be obtained under various circumstances. The components of the program are an atmospheric model of precipitation; a watershed model that will generate a continuous series of streamflow data, based on information from the atmospheric model; a flood frequency analysis system that uses site-specific topographic data plus information from the atmospheric model to generate a flood frequency curve; a hydroelectric power simulation program which determines daily energy output for a run-of-river or reservoir storage site based on selected generation facilities and the time series generated in the watershed model; and a graphic analysis package that provides direct visualization of data and modelling results. This report contains a description of the programs, a user guide, the theory behind the model, the modelling methodology, and results from a workshop that reviewed the program package. 32 refs., 16 figs., 18 tabs.
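
    The hydroelectric power simulation component reduces, at its core, to the hydropower equation P = η·ρ·g·Q·H capped at installed capacity. A minimal sketch (the head, efficiency, capacity and streamflow series below are invented; the real package derives flows from its watershed model):

```python
RHO, G = 1000.0, 9.81  # water density (kg/m^3), gravitational accel. (m/s^2)

def daily_energy_kwh(flow_m3s, head_m, efficiency=0.85, capacity_kw=5000.0):
    """One day's energy (kWh): P = eta*rho*g*Q*H, capped at capacity."""
    power_kw = efficiency * RHO * G * flow_m3s * head_m / 1000.0
    return min(power_kw, capacity_kw) * 24.0

# Invented week of mean daily streamflows (m^3/s) standing in for the
# synthetic series the watershed model would generate:
flows = [3.2, 4.0, 5.5, 12.0, 9.1, 6.3, 4.8]
week_kwh = sum(daily_energy_kwh(q, head_m=60.0) for q in flows)
print(round(week_kwh))
```

    The capacity cap is what makes high-flow days contribute no extra energy for a run-of-river site, which is why the choice of generation facilities matters alongside the synthetic flow series.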

  11. An integrated acquisition, display, and analysis system

    International Nuclear Information System (INIS)

    Ahmad, T.; Huckins, R.J.

    1987-01-01

    The design goal of the ND9900/Genuie was to integrate a high performance data acquisition and display subsystem with a state-of-the-art 32-bit supermicrocomputer. This was achieved by integrating a Digital Equipment Corporation MicroVAX II CPU board with acquisition and display controllers via the Q-bus. The result is a tightly coupled processing and analysis system for Pulse Height Analysis and other applications. The system architecture supports distributed processing, so that acquisition and display functions are semi-autonomous, making the VAX concurrently available for applications programs

  12. Results of an Integrative Analysis: A Call for Contextualizing HIV and AIDS Clinical Practice Guidelines to Support Evidence-Based Practice.

    Science.gov (United States)

    Edwards, Nancy; Kahwa, Eulalia; Hoogeveen, Katie

    2017-12-01

    Practice guidelines aim to improve the standard of care for people living with HIV/AIDS. Successfully implementing guidelines requires tailoring them to populations served and to social and organizational influences on care. To examine dimensions of context, which nurses and midwives described as having a significant impact on their care of patients living with HIV/AIDS in Kenya, Uganda, South Africa, and Jamaica and to determine whether HIV/AIDS guidelines include adaptations congruent with these dimensions of context. Two sets of data were used. The first came from a qualitative study. In-depth interviews were conducted with purposively selected nurses, midwives, and nurse managers from 21 districts in four study countries. A coding framework was iteratively developed and themes inductively identified. Context dimensions were derived from these themes. A second data set of published guidelines for HIV/AIDS care was then assembled. Guidelines were identified through Google and PubMed searches. Using a deductive integrative analysis approach, text related to context dimensions was extracted from guidelines and categorized into problem and strategy statements. Ninety-six individuals participated in qualitative interviews. Four discrete dimensions of context were identified: health workforce adequacy, workplace exposure risk, workplace consequences for nurses living with HIV/AIDS, and the intersection of work and family life. Guidelines most often acknowledged health human resource constraints and presented mitigation strategies to offset them, and least often discussed workplace consequences and the intersections of family and work life. Guidelines should more consistently acknowledge diverse implementation contexts, propose how recommendations can be adapted to these realities, and suggest what role frontline healthcare providers have in realizing the structural changes necessary for healthier work environments and better patient care. Guideline recommendations

  13. Continuous integration congestion cost allocation based on sensitivity

    International Nuclear Information System (INIS)

    Wu, Z.Q.; Wang, Y.N.

    2004-01-01

    Congestion cost allocation is a very important topic in congestion management. Allocation methods based on the Aumann-Shapley value use the discrete numerical integration method, which needs to solve the incremental OPF solution many times, and as such is not suitable for practical application to large-scale systems. The optimal solution and the change tendency of its sensitivity during congestion removal using a DC optimal power flow (OPF) process are analysed. A simple continuous integration method based on the sensitivity is proposed for the congestion cost allocation. The proposed sensitivity analysis method requires less computation time than methods based on quadratic programming and interior-point iteration. The proposed congestion cost allocation method uses a continuous integration method rather than discrete numerical integration. The method does not need to solve incremental OPF solutions, which allows its use in large-scale systems. The method can also be used for AC OPF congestion management. (author)
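
    The continuous-integration idea can be illustrated with the Aumann-Shapley charge: each user pays its demand times the integral of its cost sensitivity along the scaled-demand path t·d for t from 0 to 1. The quadratic cost below is an invented stand-in for the OPF-derived congestion cost, not the paper's model:

```python
def aumann_shapley(demands, grad_cost, steps=1000):
    """Charge each user d_i * integral_0^1 dC/dd_i(t*d) dt, evaluated
    with a midpoint rule along the scaled-demand path."""
    charges = [0.0] * len(demands)
    for k in range(steps):
        t = (k + 0.5) / steps
        grads = grad_cost([t * d for d in demands])
        for i, d in enumerate(demands):
            charges[i] += d * grads[i] / steps
    return charges

# Illustrative quadratic congestion cost C(d) = 0.5*(sum d)^2, a
# stand-in for the OPF-derived cost; dC/dd_i = sum(d) for every i.
def grad_cost(d):
    s = sum(d)
    return [s] * len(d)

demands = [10.0, 20.0, 30.0]
charges = aumann_shapley(demands, grad_cost)
total_cost = 0.5 * sum(demands) ** 2
print([round(c, 1) for c in charges], total_cost)
```

    For this cost the charges sum exactly to the total congestion cost, the defining property of the Aumann-Shapley allocation; only the sensitivity (gradient) is needed along the path, not repeated incremental OPF solutions.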

  14. Efficacy and Cost-Effectiveness Analysis of Evidence-Based Nursing Interventions to Maintain Tissue Integrity to Prevent Pressure Ulcers and Incontinence-Associated Dermatitis.

    Science.gov (United States)

    Avşar, Pınar; Karadağ, Ayişe

    2018-02-01

    A reduction in tissue tolerance promotes the development of pressure ulcers (PUs) and incontinence-associated dermatitis (IAD). To determine the cost-effectiveness and efficacy of evidence-based (EB) nursing interventions on increasing tissue tolerance by maintaining tissue integrity. The study involved 154 patients in two intensive care units (77 patients, control group; 77 patients, intervention group). Data were collected using the following: patient characteristics form, Braden PU risk assessment scale, tissue integrity monitoring form, PU identification form, IAD and severity scale, and a cost table of the interventions. Patients in the intervention group were cared for by nurses trained in the use of the data collection tools and in EB practices to improve tissue tolerance. Routine nursing care was given to the patients in the control group. The researcher observed all patients in terms of tissue integrity and recorded the care-related costs. Deterioration of tissue integrity was observed in 18.2% of patients in the intervention group compared to 54.5% in the control group (p …). The mean cost to increase tissue tolerance in the intervention and control groups was x̄ = $204.34 ± $41.07 and x̄ = $138.90 ± $1.70, respectively. It is recommended that EB policies and procedures be developed to improve tissue tolerance by maintaining tissue integrity. Although the cost of EB preventive initiatives is relatively high compared to those that are not EB, the former provide a significant reduction in the prevalence of tissue integrity deterioration. © 2017 Sigma Theta Tau International.

  15. Abel integral equations analysis and applications

    CERN Document Server

    Gorenflo, Rudolf

    1991-01-01

    In many fields of application of mathematics, progress is crucially dependent on the good flow of information between (i) theoretical mathematicians looking for applications, (ii) mathematicians working in applications in need of theory, and (iii) scientists and engineers applying mathematical models and methods. The intention of this book is to stimulate this flow of information. In the first three chapters (accessible to third year students of mathematics and physics and to mathematically interested engineers) applications of Abel integral equations are surveyed broadly, including determination of potentials, stereology, seismic travel times, spectroscopy, and optical fibres. In subsequent chapters (requiring some background in functional analysis) mapping properties of Abel integral operators and their relation to other integral transforms in various function spaces are investigated, questions of existence and uniqueness of solutions of linear and nonlinear Abel integral equations are treated, and for equatio...

  16. The economics of renewable electricity market integration. An empirical and model-based analysis of regulatory frameworks and their impacts on the power market

    Energy Technology Data Exchange (ETDEWEB)

    Nicolosi, Marco

    2012-07-01

    As power systems increase in complexity due to higher shares of intermittent RES-E, so do the requirements for power system modeling. This thesis shows empirically, with examples from Germany and Texas, that the increasing RES-E share strongly affects current power market operation. The markets further create price signals, which lead to system adaptations in the long run. To estimate these adaptation effects, 'The High Temporal Resolution Electricity Market Analysis Model' (THEA) has been developed. In a first application for the ERCOT market in Texas, particular model attributes are tested and compared to some complexity-reducing approaches, i.e. the reduction of temporal resolution and the reduction of operational constraints. In both cases, the results show significant differences compared to the results when the full spectrum of THEA's capabilities is utilized. The ERCOT case study additionally shows that the adaptation to RES-E in an isolated, mainly thermal-based power system is quite severe. Market signals which underline this conclusion are the severely reduced value of wind energy, the increasing curtailment and the strong shift towards peak-oriented generating capacities. The second application of THEA models the German power market with its interconnected markets. This analysis increases the complexity significantly by modeling a well interconnected system, increasing the number of different RES-E technologies and adding CAES investment options. In order to assess the impact on the different system components (supply, demand and grid infrastructure), specific measures are applied to compare several scenarios. Each scenario represents a policy option, which either reduces or increases the flexibility of the power system. The scenario comparisons capture the effects of a lower RES-E share, a larger baseload capacity fleet, higher interconnector capacities, various RES-E support scheme designs and the capability of RES-E to

  17. Integrating an infectious disease programme into the primary health care service: a retrospective analysis of Chagas disease community-based surveillance in Honduras.

    Science.gov (United States)

    Hashimoto, Ken; Zúniga, Concepción; Nakamura, Jiro; Hanada, Kyo

    2015-03-24

    Integration of disease-specific programmes into the primary health care (PHC) service has been attempted mostly in clinically oriented disease control such as HIV/AIDS and tuberculosis but rarely in vector control. Chagas disease is controlled principally by interventions against the triatomine vector. In Honduras, after successful reduction of household infestation by a vertical approach, the Ministry of Health implemented community-based vector surveillance at the PHC services (health centres) to prevent the resurgence of infection. This paper retrospectively analyses the effects and process of integrating a Chagas disease vector surveillance system into health centres. We evaluated the effects of integration at six pilot sites in western Honduras during 2008-2011 on: surveillance performance; knowledge, attitude and practice in schoolchildren; reports of triatomine bug infestation and institutional response; and seroprevalence among children under 15 years of age. The process of integration of the surveillance system was analysed using the PRECEDE-PROCEED model for health programme planning. The model was employed to systematically determine influential and interactive factors which facilitated the integration process at different levels of the Ministry of Health and the community. Overall surveillance performance improved from 46 to 84 on a 100-point scale. Schoolchildren's attitude (risk awareness) score significantly increased from 77 to 83 points. Seroprevalence declined from 3.4% to 0.4%. Health centres responded to the community bug reports by insecticide spraying. As key factors, the health centres had potential management capacity and influence over the inhabitants' behaviours and living environment directly and through community health volunteers. The National Chagas Programme played an essential role in facilitating changes with adequate distribution of responsibilities, participatory modelling, training, evaluation and advocacy. We found that Chagas

  18. Game analysis of product-service integration

    Directory of Open Access Journals (Sweden)

    Heping Zhong

    2014-10-01

    Full Text Available Purpose: This paper aims at defining the value creation mechanism and income distribution strategies of product-service integration in order to promote product-service integration within a firm. Design/methodology/approach: This paper quantitatively investigates the coordination mechanism of product-service integration by using game theory, and uses the Shapley value and equal growth rate methods to further discuss income distribution strategies of product-service integration. Findings: Product-service integration increases the total income of a firm, and the added value of the income decreases as the unit price demand variation coefficient of products and services increases, decreases as the marginal cost of products increases, and decreases as the marginal cost of services increases. Moreover, the findings suggest that income distribution strategies of product-service integration based on both the Shapley value method and the equal growth rate method can make the product department and service department of a firm achieve a win-win outcome and realize a Pareto improvement. The choice of distribution strategy to coordinate the actions between departments depends on which department plays the dominant role in the firm. Generally speaking, for a firm at the center of the market, when the product department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the Shapley value method; when the service department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the equal growth rate method. Research limitations/implications: This paper makes some strict assumptions, such as complete information, risk neutrality and linear cost functions, and the discussion is limited to the simple relationship between product department and service department. Practical implications: Product
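
    For the two-department case the two distribution rules compared in the paper have simple closed forms: the Shapley value splits the integration synergy equally, while the equal-growth-rate rule scales both standalone incomes by the same factor. A sketch with invented incomes:

```python
def shapley_two(v_p, v_s, v_joint):
    """Two-player Shapley value: each department keeps its standalone
    income plus half of the synergy created by integration."""
    synergy = v_joint - v_p - v_s
    return v_p + synergy / 2.0, v_s + synergy / 2.0

def equal_growth(v_p, v_s, v_joint):
    """Equal-growth-rate rule: both incomes scale by the same factor
    v_joint / (v_p + v_s)."""
    g = v_joint / (v_p + v_s)
    return v_p * g, v_s * g

# Invented standalone incomes and the larger integrated income:
p_income, s_income, joint = 60.0, 40.0, 120.0
print(shapley_two(p_income, s_income, joint))   # product, service shares
print(equal_growth(p_income, s_income, joint))
```

    With these numbers the larger standalone earner gets more under equal growth (72 vs. 70) and less under Shapley, illustrating why the preferred rule depends on which department is the main contributor to firm income.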

  19. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik … nated "Structure (n, m)" where (n, m) are the force and displacement degrees of … Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  20. Integrated Temperature Sensors based on Heat Diffusion

    NARCIS (Netherlands)

    Van Vroonhoven, C.P.L.

    2015-01-01

    This thesis describes the theory, design and implementation of a new class of integrated temperature sensors, based on heat diffusion. In such sensors, temperature is sensed by measuring the time it takes for heat to diffuse through silicon. An on-chip thermal delay can be determined by geometry and

  1. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
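
    Enrichment analyses of the kind IPAD performs typically rest on the hypergeometric test: given a gene list, how surprising is its overlap with a pathway? A generic sketch (the counts are invented, and IPAD's actual scoring may combine additional measures):

```python
from math import comb

def enrichment_p(hits, list_size, pathway_size, universe):
    """Hypergeometric upper-tail p-value: chance of >= `hits` pathway
    genes in a random `list_size`-gene list drawn from `universe` genes
    of which `pathway_size` belong to the pathway."""
    return sum(
        comb(pathway_size, k) * comb(universe - pathway_size, list_size - k)
        for k in range(hits, min(list_size, pathway_size) + 1)
    ) / comb(universe, list_size)

# Invented counts: 8 hits in a 40-gene pathway from a 300-gene list
# against a 20,000-gene universe (expected hits ~0.6).
p = enrichment_p(8, 300, 40, 20000)
print(f"{p:.3g}")
```

    The same overlap statistic extends naturally to the disease-pathway, drug-pathway, and organ-pathway associations the database defines, with the "pathway" column replaced by the relevant annotation set.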

  2. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations

  3. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel extraction based integral imaging with a controllable viewing direction. The proposed integral imaging can provide viewers three-dimensional (3D) images in a very small viewing angle. The viewing angle and the viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. Theoretical analysis and a 3D display experiment of the viewing direction controllable integral imaging are carried out. The experimental results verify the correctness of the theory. A 3D display based on the integral imaging can protect the viewer’s privacy and has huge potential for a television to show multiple 3D programs at the same time. (paper)

  4. Efficacy of an integrated hospital-primary care program for heart failure: a population-based analysis of 56,742 patients.

    Science.gov (United States)

    Comín-Colet, Josep; Verdú-Rotellar, José María; Vela, Emili; Clèries, Montse; Bustins, Montserrat; Mendoza, Lola; Badosa, Neus; Cladellas, Mercè; Ferré, Sofía; Bruguera, Jordi

    2014-04-01

    The efficacy of heart failure programs has been demonstrated in clinical trials but their applicability in the real world practice setting is more controversial. This study evaluates the feasibility and efficacy of an integrated hospital-primary care program for the management of patients with heart failure in an integrated health area covering a population of 309,345. For the analysis, we included all patients consecutively admitted with heart failure as the principal diagnosis who had been discharged alive from all of the hospitals in Catalonia, Spain, from 2005 to 2011, the period when the program was implemented, and compared mortality and readmissions among patients exposed to the program with the rates in the patients of all the remaining integrated health areas of the Servei Català de la Salut (Catalan Health Service). We included 56,742 patients in the study. There were 181,204 hospital admissions and 30,712 deaths during the study period. In the adjusted analyses, when compared to the 54,659 patients from the other health areas, the 2083 patients exposed to the program had a lower risk of death (hazard ratio=0.92 [95% confidence interval, 0.86-0.97]; P=.005), a lower risk of clinically-related readmission (hazard ratio=0.71 [95% confidence interval, 0.66-0.76]; P<.001), and a lower risk of readmission for heart failure (hazard ratio=0.86 [95% confidence interval, 0.80-0.94]; P<.001). The positive impact on the morbidity and mortality rates was more marked once the program had become well established. The implementation of multidisciplinary heart failure management programs that integrate the hospital and the community is feasible and is associated with a significant reduction in patient morbidity and mortality. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  5. Microprocessor-based integrated LMFBR core surveillance. Pt. 2

    International Nuclear Information System (INIS)

    Elies, V.

    1985-12-01

    This report is the result of the KfK part of a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. After a description of the experimental results gained with the different surveillance techniques so far, it is shown which kinds of correlation can be done using the evaluation results obtained from the single surveillance systems. The main part of this report contains the systems analysis of a microcomputer-based system integrating different surveillance methods. After an analysis of the hardware requirements a hardware structure for the integrated system is proposed. The software structure is then described for the subsystem performing the different surveillance algorithms as well as for the system which does the correlation thus deriving additional information from the single results. (orig.) [de

  6. Decision-Based Design Integrating Consumer Preferences into Engineering Design

    CERN Document Server

    Chen, Wei; Wassenaar, Henk Jan

    2013-01-01

    Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design.  Once the related fundamentals of decision theory, economic analysis, and econometrics modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: •A rigorous framework of integrating the interests from both producer and consumers in engineering design, •Analytical techniques of consumer choice model...

  7. Integrating neural network technology and noise analysis

    International Nuclear Information System (INIS)

    Uhrig, R.E.; Oak Ridge National Lab., TN

    1995-01-01

    The integrated use of neural network and noise analysis technologies offers advantages not available through the use of either technology alone. The application of neural network technology to noise analysis offers an opportunity to expand the scope of problems where noise analysis is useful, and there are unique ways in which the integration of these technologies can be used productively. The two-sensor technique, in which the responses of two sensors to an unknown driving source are related, is used to demonstrate such integration. The relationship between the power spectral densities (PSDs) of accelerometer signals is derived theoretically using noise analysis to demonstrate its uniqueness. This relationship is modeled from experimental data using a neural network while the system is working properly, and the actual PSD of one sensor is compared with the PSD of that sensor predicted by the neural network using the PSD of the other sensor as an input. A significant deviation between the actual and predicted PSDs indicates that the system is changing (i.e., failing). Experiments carried out on check valves and bearings illustrate the usefulness of the methodology developed. (Author)
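    The two-sensor monitoring idea above can be sketched numerically. In this toy version (filter coefficients, signal lengths, and thresholds are all invented for illustration), a per-frequency ratio serves as a simple linear stand-in for the neural network: the map from one sensor's PSD to the other's is learned on healthy data, and a simulated change in the second sensor's transfer path shows up as a large deviation between predicted and actual PSDs.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def psd(x, nseg=256):
        # Averaged periodogram over non-overlapping segments (Welch-style).
        segs = x[: len(x) // nseg * nseg].reshape(-1, nseg)
        return np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0)

    # Two accelerometers driven by a common unknown source, as in the
    # two-sensor technique (transfer paths modeled by short FIR filters).
    source = rng.standard_normal(200_000)
    sensor1 = np.convolve(source, [1.0, 0.5, 0.25], mode="same")
    sensor2 = np.convolve(source, [0.8, -0.3, 0.1], mode="same")

    # "Train" on healthy data: learn the per-frequency map PSD1 -> PSD2.
    half = len(source) // 2
    ratio = psd(sensor2[:half]) / psd(sensor1[:half])

    def mean_deviation(s1, s2):
        # Predict sensor 2's PSD from sensor 1's; report mean relative error.
        pred, actual = ratio * psd(s1), psd(s2)
        return float(np.mean(np.abs(pred - actual) / actual))

    # Healthy system: prediction matches the measured PSD closely.
    dev_healthy = mean_deviation(sensor1[half:], sensor2[half:])

    # Simulated degradation changes sensor 2's transfer path, so the learned
    # relationship no longer holds and the deviation becomes large.
    sensor2_faulty = np.convolve(source, [0.8, 0.6, 0.4], mode="same")
    dev_faulty = mean_deviation(sensor1[half:], sensor2_faulty[half:])
    ```

    The same structure carries over when the per-bin ratio is replaced by a trained network, which can capture nonlinear relationships between the two PSDs.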

  8. An integrated reliability-based design optimization of offshore towers

    International Nuclear Information System (INIS)

    Karadeniz, Halil; Togan, Vedat; Vrouwenvelder, Ton

    2009-01-01

    After recognizing the uncertainty in parameters such as material, loading, and geometry, in contrast with conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful for achieving an economical design, as it combines a reliability analysis with an optimization algorithm. RBDO procedures include structural analysis, reliability analysis, and sensitivity analysis for both optimization and reliability. The efficiency of the RBDO system depends on these numerical algorithms. In this work, an integrated system of algorithms is proposed to implement the RBDO of offshore towers subjected to extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of the towers are: (a) a structural analysis program, SAPOS; (b) an optimization program, SQP; and (c) a reliability analysis program based on FORM. A demonstration on an example tripod tower under reliability constraints based on the limit states of critical stress, buckling, and natural frequency is presented.

  9. An integrated reliability-based design optimization of offshore towers

    Energy Technology Data Exchange (ETDEWEB)

    Karadeniz, Halil [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)], E-mail: h.karadeniz@tudelft.nl; Togan, Vedat [Department of Civil Engineering, Karadeniz Technical University, Trabzon (Turkey); Vrouwenvelder, Ton [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)

    2009-10-15

    After recognizing the uncertainty in parameters such as material, loading, and geometry, in contrast with conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful for achieving an economical design, as it combines a reliability analysis with an optimization algorithm. RBDO procedures include structural analysis, reliability analysis, and sensitivity analysis for both optimization and reliability. The efficiency of the RBDO system depends on these numerical algorithms. In this work, an integrated system of algorithms is proposed to implement the RBDO of offshore towers subjected to extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of the towers are: (a) a structural analysis program, SAPOS; (b) an optimization program, SQP; and (c) a reliability analysis program based on FORM. A demonstration on an example tripod tower under reliability constraints based on the limit states of critical stress, buckling, and natural frequency is presented.
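    The reliability-analysis step of such an RBDO loop (the FORM component named in the abstract) can be sketched for the simplest case, a linear limit state in standard normal space, where the Hasofer-Lind reliability index is the distance from the origin to the limit-state surface. The numbers below are hypothetical; this is a sketch of the FORM idea, not the SAPOS/SQP/FORM system itself.

    ```python
    import math

    import numpy as np

    # Linear limit state g(u) = b - a.u in standard normal space; failure
    # occurs when g(u) < 0. Sensitivities and margin are illustrative.
    a = np.array([1.0, 2.0, 2.0])  # limit-state sensitivities
    b = 9.0                        # safety margin at the mean point

    # FORM: reliability index = distance from origin to the surface g(u) = 0,
    # and the failure probability is Pf = Phi(-beta).
    beta = b / np.linalg.norm(a)
    pf_form = 0.5 * math.erfc(beta / math.sqrt(2.0))

    # Crude Monte Carlo cross-check (FORM is exact for a linear limit state).
    rng = np.random.default_rng(1)
    u = rng.standard_normal((1_000_000, 3))
    pf_mc = float(np.mean(u @ a > b))
    ```

    In a full RBDO loop, an optimizer such as SQP would adjust the design variables subject to a constraint like `beta >= beta_target`, re-running the structural and reliability analyses at each iterate.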

  10. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within the user-friendly software package MATRIXx are used to provide a high-level systems approach to the analysis of linear control systems. Various control-system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular-value robustness test, singular-value loop-transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time-response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  11. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Full Text Available. Abstract. Background: Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description: We have developed an Integrated Genotyping System (IGS) to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion: IGS can model complex phenotypes and contain genotypes from whole-genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  12. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  13. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

    Inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. The performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal-hydraulic separate-effect tests and integral-effect tests was developed for thermal-hydraulic model development and system code validation. The safety characteristics as well as the safety issues of the integral reactor have been identified during the study, and these will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back to the SMART conceptual design. The performance and safety analysis code systems as well as the preliminary safety analysis methodology developed in this study will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  14. Integrative Analysis of Genetic, Genomic, and Phenotypic Data for Ethanol Behaviors: A Network-Based Pipeline for Identifying Mechanisms and Potential Drug Targets.

    Science.gov (United States)

    Bogenpohl, James W; Mignogna, Kristin M; Smith, Maren L; Miles, Michael F

    2017-01-01

    Complex behavioral traits, such as alcohol abuse, are caused by an interplay of genetic and environmental factors, producing deleterious functional adaptations in the central nervous system. The long-term behavioral consequences of such changes are of substantial cost to both the individual and society. Substantial progress has been made in the last two decades in understanding elements of the brain mechanisms underlying responses to ethanol in animal models and risk factors for alcohol use disorder (AUD) in humans. However, treatments for AUD remain largely ineffective and few medications for this disease state have been licensed. Genome-wide association studies (GWAS) in humans, behavioral genetic studies in animal models, and brain gene expression studies produced by microarrays or RNA-seq have the potential to produce unbiased and novel insight into the underlying neurobiology of AUD. However, the complexity of such information, both statistical and informational, has slowed progress toward identifying new targets for intervention in AUD. This chapter describes one approach for integrating behavioral, genetic, and genomic information across animal model and human studies. The goal of this approach is to identify networks of genes functioning in the brain that are most relevant to the underlying mechanisms of a complex disease such as AUD. We illustrate an example of how genomic studies in animal models can be used to produce robust gene networks that have functional implications, and to integrate such animal model genomic data with human genetic studies such as GWAS for AUD. We describe several useful analysis tools for such studies: ComBAT, WGCNA, and EW_dmGWAS. The end result of this analysis is a ranking of gene networks and identification of their cognate hub genes, which might provide eventual targets for future therapeutic development. Furthermore, this combined approach may also improve our understanding of basic mechanisms underlying gene x
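    The network-construction step of such a pipeline can be sketched in the WGCNA style: a soft-thresholded co-expression adjacency is built from pairwise correlations, and the hub gene is the most connected node. The toy expression matrix, module structure, and soft-threshold power below are invented for illustration, not data from the chapter.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy expression matrix: 40 samples x 12 genes. Genes 0-5 share a latent
    # driver (one co-expression module); genes 6-11 are independent noise.
    latent = rng.standard_normal((40, 1))
    module = latent + 0.5 * rng.standard_normal((40, 6))
    noise = rng.standard_normal((40, 6))
    expr = np.hstack([module, noise])

    # WGCNA-style soft-thresholded adjacency: a_ij = |cor(x_i, x_j)|^beta.
    # The power beta suppresses weak, noise-level correlations.
    beta = 6
    cor = np.corrcoef(expr, rowvar=False)
    adj = np.abs(cor) ** beta
    np.fill_diagonal(adj, 0.0)

    # Connectivity = row sums of the adjacency; the "hub gene" of the module
    # is the most connected node, a candidate target for follow-up.
    connectivity = adj.sum(axis=0)
    hub = int(np.argmax(connectivity))
    ```

    Real WGCNA additionally computes topological overlap and cuts a dendrogram into modules, but the soft-thresholding and connectivity ranking above are the core of the hub-gene idea.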

  15. The integrated microbial genome resource of analysis.

    Science.gov (United States)

    Checcucci, Alice; Mengoni, Alessio

    2015-01-01

    Integrated Microbial Genomes and Metagenomes (IMG) is a biocomputational system that provides information and support for the annotation and comparative analysis of microbial genomes and metagenomes. IMG has been developed by the US Department of Energy (DOE) Joint Genome Institute (JGI). The IMG platform contains both draft and complete genomes, sequenced by the Joint Genome Institute, along with other publicly available genomes. Genomes of strains belonging to the Archaea, Bacteria, and Eukarya domains are present, as well as those of viruses and plasmids. Here, we describe some essential features of the IMG system and a case study for pangenome analysis.
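    In its most basic form, the pangenome analysis mentioned as the case study reduces to set operations on gene presence/absence across strains: the core genome is the intersection, the pangenome is the union, and the accessory genome is their difference. A minimal sketch with made-up strain and gene names:

    ```python
    # Gene content per strain (illustrative presence/absence sets, not IMG data).
    genomes = {
        "strain_A": {"dnaA", "gyrB", "recA", "nodC"},
        "strain_B": {"dnaA", "gyrB", "recA", "nifH"},
        "strain_C": {"dnaA", "gyrB", "recA", "nodC", "nifH"},
    }

    core = set.intersection(*genomes.values())  # genes shared by all strains
    pan = set.union(*genomes.values())          # total gene repertoire
    accessory = pan - core                      # variably present genes
    ```

    Platforms such as IMG operate on gene families (clusters of orthologs) rather than raw gene names, but the core/accessory arithmetic is the same.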

  16. Elements for successful sensor-based process control {Integrated Metrology}

    International Nuclear Information System (INIS)

    Butler, Stephanie Watts

    1998-01-01

    Current productivity needs have stimulated the development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates that a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. The paper defines and explores the four perceived elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable, industry-wide business needs whose solution is profitable and feasible. The paper examines why the key aspect of integration is the decision-making process. A detailed discussion is provided of the components of most importance to sensor-based control: decision-making methods, the 3 R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. Finally, a form for these integrated components, which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve the desired benefits, is recommended.

  17. Elements for successful sensor-based process control {Integrated Metrology}

    Science.gov (United States)

    Butler, Stephanie Watts

    1998-11-01

    Current productivity needs have stimulated the development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates that a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. The paper defines and explores the four perceived elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable, industry-wide business needs whose solution is profitable and feasible. The paper examines why the key aspect of integration is the decision-making process. A detailed discussion is provided of the components of most importance to sensor-based control: decision-making methods, the 3 R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. Finally, a form for these integrated components, which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve the desired benefits, is recommended.

  18. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Full Text Available. Abstract. Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for the statistical analysis of genetic data compared with other fields of statistics. It is often a tremendous task for end-users to tailor such systems to particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R (http://www.r-project.org), a free, flexible and platform-independent environment for statistical modelling and graphics, is explored as an integrated system for genetic data analysis. An overview of some packages currently available for the analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available code, it is a feasible, although challenging, task to develop it into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.

  19. Integrated Data Base: Status and waste projections

    International Nuclear Information System (INIS)

    Klein, J.A.

    1990-01-01

    The Integrated Data Base (IDB) is the official US Department of Energy (DOE) data base for spent fuel and radioactive waste inventories and projections. DOE low-level waste (LLW) is just one of the many waste types that are documented with the IDB. Summary-level tables and figures are presented illustrating historical and projected volume changes of DOE LLW. This information is readily available through the annual IDB publication. Other presentation formats are also available to the DOE community through a request to the IDB Program. 4 refs., 6 figs., 5 tabs

  20. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate......, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect......) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts....

  1. Surface-based vertexwise analysis of morphometry and microstructural integrity for white matter tracts in diffusion tensor imaging: With application to the corpus callosum in Alzheimer's disease.

    Science.gov (United States)

    Tang, Xiaoying; Qin, Yuanyuan; Zhu, Wenzhen; Miller, Michael I

    2017-04-01

    In this article, we present a unified statistical pipeline for analyzing the white matter (WM) tracts morphometry and microstructural integrity, both globally and locally within the same WM tract, from diffusion tensor imaging. Morphometry is quantified globally by the volumetric measurement and locally by the vertexwise surface areas. Meanwhile, microstructural integrity is quantified globally by the mean fractional anisotropy (FA) and trace values within the specific WM tract and locally by the FA and trace values defined at each vertex of its bounding surface. The proposed pipeline consists of four steps: (1) fully automated segmentation of WM tracts in a multi-contrast multi-atlas framework; (2) generation of the smooth surface representations for the WM tracts of interest; (3) common template surface generation on which the localized morphometric and microstructural statistics are defined and a variety of statistical analyses can be conducted; (4) multiple comparison correction to determine the significance of the statistical analysis results. Detailed herein, this pipeline has been applied to the corpus callosum in Alzheimer's disease (AD) with significantly decreased FA values and increased trace values, both globally and locally, being detected in patients with AD when compared to normal aging populations. A subdivision of the corpus callosum in both hemispheres revealed that the AD pathology primarily affects the body and splenium of the corpus callosum. Validation analyses and two multiple comparison correction strategies are provided. Hum Brain Mapp 38:1875-1893, 2017. © 2017 Wiley Periodicals, Inc.
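    Step (4), multiple comparison correction across vertices, can be sketched with a permutation-based maxT procedure, one common familywise-error approach for surface maps: each vertex statistic is compared against the null distribution of the maximum statistic over all vertices. The vertex counts, group sizes, and FA values below are simulated for illustration, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simulated vertexwise FA: 200 vertices, 30 controls vs 30 AD patients,
    # with an FA decrease imposed at the first 20 vertices.
    n_vtx, n_per_group = 200, 30
    controls = rng.normal(0.5, 0.05, size=(n_per_group, n_vtx))
    patients = rng.normal(0.5, 0.05, size=(n_per_group, n_vtx))
    patients[:, :20] -= 0.10  # simulated AD-related FA reduction

    def group_diff(a, b):
        # Per-vertex difference of group means (a simple vertexwise statistic).
        return a.mean(axis=0) - b.mean(axis=0)

    observed = group_diff(controls, patients)

    # maxT correction: permute group labels, record the maximum absolute
    # statistic over all vertices, and threshold at its 95th percentile.
    pooled = np.vstack([controls, patients])
    max_null = []
    for _ in range(500):
        perm = rng.permutation(pooled)
        diff = group_diff(perm[:n_per_group], perm[n_per_group:])
        max_null.append(np.abs(diff).max())
    threshold = float(np.quantile(max_null, 0.95))

    # Vertices exceeding the threshold survive familywise-error correction.
    significant = np.flatnonzero(np.abs(observed) > threshold)
    ```

    A false-discovery-rate procedure (the other common strategy) would threshold per-vertex p-values instead; maxT is stricter but controls the chance of any false positive across the surface.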

  2. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements to perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  3. The QoS Indicators Analysis of Integrated EUHT Wireless Communication System Based on Urban Rail Transit in High-Speed Scenario

    Directory of Open Access Journals (Sweden)

    Xiaoxuan Wang

    2018-01-01

    Full Text Available. Nowadays, in urban rail transit systems, the train-wayside communication system uses Wireless Local Area Network (WLAN) technology to exchange safety-related information between trains and wayside equipment. However, given the high-speed mobility of trains and the limitations of the frequency band, WLAN is unable to meet the demands of future intracity and intercity rail transit. And although Time Division-Long Term Evolution (TD-LTE) technology performs better than WLAN, at most 20 MHz of bandwidth can be used. Moreover, in high-speed scenarios over 300 km/h, TD-LTE can hardly meet future requirements either. Equipment based on Enhanced Ultra High Throughput (EUHT) technology achieves better performance in high-speed scenarios than WLAN and TD-LTE. Furthermore, it allows flexible use of frequency resources in the 5.8 GHz band, such as 20 MHz, 40 MHz, and 80 MHz channels. In this paper, we set up an EUHT wireless communication system for urban rail transit in a high-speed scenario, integrating all of its traffic. An outdoor testing environment on the Beijing-Tianjin High-Speed Railway was set up to measure the performance of the integrated EUHT wireless communication system. The communication delay, handoff latency, and throughput of this system are analyzed. Extensive testing results show that the Quality of Service (QoS) of the designed integrated EUHT wireless communication system satisfies the requirements of urban rail transit in a high-speed scenario. Moreover, compared with the TD-LTE testing results we obtained previously, the maximum handoff latency of safety-critical traffic decreased from 225 ms to 150 ms, and the throughput-critical traffic achieved 2-way 2 Mbps CCTV and 1-way 8 Mbps PIS, much better than the 2-way 1 Mbps CCTV and 1-way 2 Mbps PIS under TD-LTE.

  4. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal-hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among the process, safety systems, and operator actions. As independent TH codes do not model operator actions and the full set of safety systems, they cannot literally simulate the integrated, dynamic interactions of the process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response, since all the accident elements are in the same model and communicate directly. The advantages of this framework also include realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis).

  5. Presentation planning using an integrated knowledge base

    Science.gov (United States)

    Arens, Yigal; Miller, Lawrence; Sondheimer, Norman

    1988-01-01

    A description is given of user interface research aimed at bringing together multiple input and output modes in a way that handles mixed mode input (commands, menus, forms, natural language), interacts with a diverse collection of underlying software utilities in a uniform way, and presents the results through a combination of output modes including natural language text, maps, charts and graphs. The system, Integrated Interfaces, derives much of its ability to interact uniformly with the user and the underlying services and to build its presentations, from the information present in a central knowledge base. This knowledge base integrates models of the application domain (Navy ships in the Pacific region, in the current demonstration version); the structure of visual displays and their graphical features; the underlying services (data bases and expert systems); and interface functions. The emphasis is on a presentation planner that uses the knowledge base to produce multi-modal output. There has been a flurry of recent work in user interface management systems. (Several recent examples are listed in the references). Existing work is characterized by an attempt to relieve the software designer of the burden of handcrafting an interface for each application. The work has generally focused on intelligently handling input. This paper deals with the other end of the pipeline - presentations.

  6. Integrating health and environmental impact analysis

    DEFF Research Database (Denmark)

    Reis, S; Morris, G.; Fleming, L. E.

    2015-01-01

    which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose...... while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding...... the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession...

  7. Advancing Alternative Analysis: Integration of Decision Science.

    Science.gov (United States)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: (a) engaging the systematic development and evaluation of decision approaches and tools; (b) using case studies to advance the integration of decision analysis into alternatives analysis; (c) supporting transdisciplinary research; and (d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  8. Structural integrity analysis of a steam turbine

    International Nuclear Information System (INIS)

    Villagarcia, Maria P.

    1997-01-01

    One of the most critical components of a power plant is the rotor of the steam turbine. Catastrophic failures in recent decades have promoted the development of life assessment procedures for rotors. Such an assessment requires knowledge of the operating conditions, component geometry, material properties, component history, and the size, location and nature of existing flaws. The aim of the present work is to obtain a structural integrity analysis procedure for a steam turbine rotor that takes the above-mentioned parameters into account. In this procedure, a thermal-stress analysis by finite elements is performed first, in order to obtain the temperature and stress distributions for a subsequent fracture mechanics analysis. The risk of fast fracture due to flaws in the central zone of the rotor is analyzed. The procedure is applied to an operating turbine: the main steam turbine of the Atucha I nuclear power plant. (author)

  9. A taxonomy of integral reaction path analysis

    Energy Technology Data Exchange (ETDEWEB)

    Grcar, Joseph F.; Day, Marcus S.; Bell, John B.

    2004-12-23

    W. C. Gardiner observed that achieving understanding through combustion modeling is limited by the ability to recognize the implications of what has been computed and to draw conclusions about the elementary steps underlying the reaction mechanism. This difficulty can be overcome in part by making better use of reaction path analysis in the context of multidimensional flame simulations. Following a survey of current practice, an integral reaction flux is formulated in terms of conserved scalars that can be calculated in a fully automated way. Conditional analyses are then introduced, and a taxonomy for bidirectional path analysis is explored. Many examples illustrate the resulting path analysis and uncover some new results about nonpremixed methane-air laminar jets.

  10. The ASDEX integrated data analysis system AIDA

    International Nuclear Information System (INIS)

    Grassie, K.; Gruber, O.; Kardaun, O.; Kaufmann, M.; Lackner, K.; Martin, P.; Mast, K.F.; McCarthy, P.J.; Mertens, V.; Pohl, D.; Rang, U.; Wunderlich, R.

    1989-11-01

For about two years, the ASDEX integrated data analysis system (AIDA), which combines the database (DABA) and the statistical analysis system (SAS), has been successfully in operation. Besides a considerable, but meaningful, reduction of the 'raw' shot data, it offers the advantage of carefully selected and precisely defined datasets, which are easily accessible for informative tabular data overviews (DABA) and multi-shot analysis (SAS). Even rather complicated statistical analyses can be performed efficiently within this system. In this report, we summarise AIDA's main features and give some details on its set-up and on the physical models which have been used for the derivation of the processed data. We also give a short introduction to the use of DABA and SAS. (orig.)

  11. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis with this technique was performed on the HSC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, were loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system with forms designed to comply with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
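The RPN scoring described in this record multiplies severity, occurrence, and detectability on quantitative scales and ranks hazards by the product; a minimal sketch follows, with invented scores (not the blood center's actual data):

```python
def rpn(severity, occurrence, detectability):
    """Risk Priority Number: product of three 1-10 scores; higher = more critical."""
    for s in (severity, occurrence, detectability):
        if not 1 <= s <= 10:
            raise ValueError("scores must be on a 1-10 scale")
    return severity * occurrence * detectability

# Hypothetical hazards for an HSC handling process (scores are invented)
hazards = {
    "loss of dose":             rpn(9, 3, 4),
    "tracking error":           rpn(8, 4, 3),
    "label transcription slip": rpn(6, 5, 2),
}
ranked = sorted(hazards.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # the hazard to mitigate first
```

Mitigation effort (training programs, continuous control systems) is then directed at the hazards at the top of the ranking.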

  12. An Integrative Analysis of Preeclampsia Based on the Construction of an Extended Composite Network Featuring Protein-Protein Physical Interactions and Transcriptional Relationships.

    Directory of Open Access Journals (Sweden)

    Daniel Vaiman

Full Text Available Preeclampsia (PE) is a pregnancy disorder defined by hypertension and proteinuria. This disease remains a major cause of maternal and fetal morbidity and mortality. Defective placentation is generally described as being at the root of the disease. The characterization of the transcriptome signature of the preeclamptic placenta has allowed the identification of differentially expressed genes (DEGs). However, we still lack detailed knowledge of how these DEGs impact the function of the placenta. The tools of network biology offer a methodology to explore complex diseases at a systems level. In this study we performed a cross-platform meta-analysis of seven publicly available gene expression datasets comparing non-pathological and preeclamptic placentas. Using the rank product algorithm we identified a total of 369 DEGs consistently modified in PE. The DEGs were used as seeds to build both an extended physical protein-protein interaction network and a transcription factor regulatory network. Topological and clustering analysis was conducted to analyze the connectivity properties of the networks. Finally, both networks were merged into a composite network which presents an integrated view of the regulatory pathways involved in preeclampsia and the crosstalk between them. This network is a useful tool to explore the relationship between the DEGs and enables hypothesis generation for functional experimentation.
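The rank product algorithm mentioned in this record can be sketched as follows: genes are ranked within each dataset, and the geometric mean of a gene's ranks across datasets is its rank product, so consistently top-ranked genes score lowest. The gene names and fold changes below are toy values, not the study's data.

```python
def rank_product(fold_changes_per_dataset):
    """Rank-product meta-analysis: rank genes within each dataset
    (rank 1 = most up-regulated), then take the geometric mean of each
    gene's ranks across datasets (lower = more consistently up-regulated)."""
    genes = list(fold_changes_per_dataset[0])
    k = len(fold_changes_per_dataset)
    products = {g: 1.0 for g in genes}
    for dataset in fold_changes_per_dataset:
        ordered = sorted(genes, key=lambda g: dataset[g], reverse=True)
        for rank, g in enumerate(ordered, start=1):
            products[g] *= rank
    return {g: products[g] ** (1.0 / k) for g in genes}

# Toy example: invented fold changes for three genes in three "datasets"
datasets = [
    {"FLT1": 3.1, "LEP": 2.4, "ACTB": 1.0},
    {"FLT1": 2.8, "LEP": 2.9, "ACTB": 0.9},
    {"FLT1": 3.5, "LEP": 2.2, "ACTB": 1.1},
]
rp = rank_product(datasets)
print(min(rp, key=rp.get))  # most consistently up-regulated gene
```

In the full algorithm, significance is then assessed by permuting the ranks; this sketch shows only the scoring step.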

  13. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  14. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  15. Gait recognition based on integral outline

    Science.gov (United States)

    Ming, Guan; Fang, Lv

    2017-02-01

Biometric identification technology is replacing traditional security technology, and gait recognition has become a hot spot of research because gait is difficult to imitate or steal. This paper presents a gait recognition system based on the integral outline of the human body. The system has three important aspects: preprocessing of the gait images, feature extraction, and classification. Finally, a polling method is used to evaluate the performance of the system, and the problems existing in gait recognition and the directions of future development are summarized.
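The polling evaluation mentioned in this record is, in essence, a majority vote over per-frame predictions for a walking sequence; a minimal sketch (with hypothetical subject labels) is:

```python
from collections import Counter

def poll(frame_predictions):
    """Majority vote ('polling') over per-frame subject predictions:
    the identity predicted most often across a gait sequence wins."""
    return Counter(frame_predictions).most_common(1)[0][0]

# Hypothetical classifier outputs for one walking sequence
print(poll(["subject_3", "subject_3", "subject_7", "subject_3", "subject_1"]))
```

Aggregating over many frames makes the final decision robust to individual misclassified frames.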

  16. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice which contains octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.

  17. Integrated Curriculum and Subject-based Curriculum: Achievement and Attitudes

    Science.gov (United States)

    Casady, Victoria

The research conducted for this mixed-method study, qualitative and quantitative, analyzed the results of an academic year-long study to determine whether the use of an integrated fourth-grade curriculum would benefit student achievement in the areas of English language arts, social studies, and science more than a subject-based traditional curriculum. The research was motivated by international, national, and state test scores, which show slowing or stagnant growth. Through pre- and post-assessments, student questionnaires, and administrative interviews, the researcher analyzed the phenomenological experiences of the students to determine whether the integrated curriculum was a beneficial restructuring of the curriculum. The research questions for this study focused on the achievement and attitudes of the students and whether the curriculum they were taught impacted their achievement and attitudes over the course of one school year. The curricula for the study were organized to cover the current standards, with the integrated curriculum focusing on connections between subject areas to help students relate what they are learning to the world beyond the classroom. The findings of this study indicated that utilizing the integrated curriculum could increase achievement as well as students' attitudes toward specific content areas. The ANOVA for English language arts was not significant, although greater growth was recorded for the students in the integrated curriculum setting. The ANOVA for social studies (0.05) and the paired t-tests (0.001) for science both showed significant positive differences. The qualitative analysis led to the discovery that the experiences of the students in the integrated curriculum setting were more positive. The evaluation of the data from this study led the researcher to determine that the integrated curriculum was a worthwhile endeavor to increase achievement and attitudes.

  18. The Integral A Crux for Analysis

    CERN Document Server

    Krantz, Steven G

    2011-01-01

This book treats all of the most commonly used theories of the integral. After motivating the idea of the integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had calculus.

  19. Integrated Ecological River Health Assessments, Based on Water Chemistry, Physical Habitat Quality and Biological Integrity

    Directory of Open Access Journals (Sweden)

    Ji Yoon Kim

    2015-11-01

Full Text Available This study evaluated integrative river ecosystem health using stressor-based models of physical habitat health, chemical water health, and the biological health of fish, and identified multiple-stressor indicators influencing ecosystem health. Integrated health responses (IHRs), based on a star-plot approach, were calculated from the qualitative habitat evaluation index (QHEI), nutrient pollution index (NPI), and index of biological integrity (IBI) in four different longitudinal regions (Groups I–IV). For the calculation of IHR values, multi-metric QHEI, NPI, and IBI models were developed and their criteria for the diagnosis of health were determined. The longitudinal patterns of the river were analyzed by a self-organizing map (SOM) model and the key stressors in the river were identified by principal component analysis (PCA). Our model scores of integrated health responses (IHRs) suggested that the mid-stream and downstream regions were impaired, and the key stressors were closely associated with nutrient enrichment (N and P) and organic matter pollution from domestic wastewater disposal plants and urban sewage. This modeling approach of IHRs may be used as an effective tool for evaluations of integrative ecological river health.
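One simple way to combine QHEI, NPI, and IBI into a single integrated health response, in the spirit of the star-plot approach this record describes, is to rescale each index to [0, 1] and average; the index ranges below are illustrative assumptions, not the study's actual criteria.

```python
def normalize(value, worst, best):
    """Linearly rescale an index so 0 = worst observed health, 1 = best."""
    return (value - worst) / (best - worst)

def integrated_health_response(qhei, npi, ibi):
    """Combine the three index scores into a single IHR: rescale each to
    [0, 1] (NPI is inverted, since high nutrient pollution means poor
    health) and average. The ranges used here are illustrative only."""
    parts = [
        normalize(qhei, worst=0, best=100),   # habitat quality
        normalize(npi, worst=10, best=0),     # nutrient pollution (inverted)
        normalize(ibi, worst=12, best=60),    # biological integrity
    ]
    return sum(parts) / len(parts)

print(round(integrated_health_response(qhei=75, npi=4, ibi=42), 3))
```

Note the inversion for NPI: a stressor index must be flipped before averaging so that larger IHR always means healthier.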

  20. Towards risk-based structural integrity methods for PWRs

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Lloyd, R.B.

    1992-01-01

    This paper describes the development of risk-based structural integrity assurance methods and their application to Pressurized Water Reactor (PWR) plant. In-service inspection is introduced as a way of reducing the failure probability of high risk sites and the latter are identified using reliability analysis; the extent and interval of inspection can also be optimized. The methodology is illustrated by reference to the aspect of reliability of weldments in PWR systems. (author)
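The core of the risk-based methods this record describes is ranking candidate inspection sites by risk, the product of failure probability (from reliability analysis) and consequence, so that inspection effort concentrates on the highest-risk sites. A minimal sketch with invented weldment data:

```python
def rank_sites_by_risk(sites):
    """Risk-based prioritization: risk = failure probability x consequence.
    Inspection effort is then concentrated on the highest-risk sites."""
    return sorted(sites, key=lambda s: s["pf"] * s["consequence"], reverse=True)

# Invented weldment data: pf from reliability analysis, consequence in relative units
sites = [
    {"name": "weld A", "pf": 1e-4, "consequence": 10},
    {"name": "weld B", "pf": 5e-6, "consequence": 1000},
    {"name": "weld C", "pf": 2e-3, "consequence": 1},
]
print([s["name"] for s in rank_sites_by_risk(sites)])
```

Note that the low-probability, high-consequence site can outrank sites that fail more often, which is exactly the behavior that distinguishes risk-based from frequency-based inspection planning.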

  1. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  2. Integrated circuits based on conjugated polymer monolayer.

    Science.gov (United States)

    Li, Mengmeng; Mangalore, Deepthi Kamath; Zhao, Jingbo; Carpenter, Joshua H; Yan, Hongping; Ade, Harald; Yan, He; Müllen, Klaus; Blom, Paul W M; Pisula, Wojciech; de Leeuw, Dago M; Asadi, Kamal

    2018-01-31

It is still a great challenge to fabricate conjugated polymer monolayer field-effect transistors (PoM-FETs) due to intricate crystallization and film formation of conjugated polymers. Here we demonstrate PoM-FETs based on a single monolayer of a conjugated polymer. The resulting PoM-FETs are highly reproducible and exhibit charge carrier mobilities reaching 3 cm² V⁻¹ s⁻¹. The high performance is attributed to the strong interactions of the polymer chains present already in solution, leading to pronounced edge-on packing and a well-defined microstructure in the monolayer. The high reproducibility enables the integration of discrete unipolar PoM-FETs into inverters and ring oscillators. Real logic functionality has been demonstrated by constructing a 15-bit code generator in which hundreds of self-assembled PoM-FETs are addressed simultaneously. Our results provide the state-of-the-art example of integrated circuits based on a conjugated polymer monolayer, opening prospective pathways for bottom-up organic electronics.

  3. Integration of video and radiation analysis data

    International Nuclear Information System (INIS)

    Menlove, H.O.; Howell, J.A.; Rodriguez, C.A.; Eccleston, G.W.; Beddingfield, D.; Smith, J.E.; Baumgart, C.W.

    1995-01-01

For the past several years, the integration of containment and surveillance (C/S) with nondestructive assay (NDA) sensors for monitoring the movement of nuclear material has focused on the hardware and communications protocols in the transmission network. Little progress has been made in methods to utilize the combined C/S and NDA data for safeguards and to reduce the inspector time spent in nuclear facilities. One of the fundamental problems in the integration of the combined data is that the two methods operate in different dimensions: the C/S video data is spatial in nature, whereas the NDA sensors provide radiation levels as a function of time. The authors have introduced a new method to integrate spatial (digital video) information with time-based (radiation monitoring) information. This technology is based on pattern recognition by neural networks, provides significant capability to analyze complex data, and has the ability to learn and adapt to changing situations. This technique has the potential of significantly reducing the frequency of inspection visits to key facilities without a loss of safeguards effectiveness.

  4. Analysis of integrated video and radiation data

    International Nuclear Information System (INIS)

    Howell, J.A.; Menlove, H.O.; Rodriguez, C.A.; Beddingfield, D.; Vasil, A.

    1995-01-01

We have developed prototype software for a facility-monitoring application that will detect anomalous activity in a nuclear facility. The software, which forms the basis of a simple model, automatically reviews and analyzes integrated safeguards data from continuous unattended monitoring systems. This technology, based on pattern recognition by neural networks, provides significant capability to analyze complex data and has the ability to learn and adapt to changing situations. It is well suited for large automated facilities, reactors, spent-fuel storage facilities, reprocessing plants, and nuclear material storage vaults.

  5. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
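The simplest of the integration approaches surveyed in this record is statistical correlation between metabolite and taxon abundances; since abundance data are rarely normally distributed, Spearman (rank) correlation is a common choice. A self-contained sketch with toy data:

```python
def rankdata(xs):
    """Ranks (1-based); ties receive their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical abundances: one taxon vs. one metabolite across five samples
taxon      = [0.01, 0.05, 0.10, 0.20, 0.40]
metabolite = [2.0, 3.1, 2.9, 5.5, 8.0]
print(round(spearman(taxon, metabolite), 2))
```

In a real study this would be computed for every taxon-metabolite pair with multiple-testing correction; the network-based approaches the record mentions go further by constraining such associations with metabolic models.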

  6. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

Full Text Available As a new formulation in structural analysis, the Integrated Force Method (IFM) has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces. It is now being further extended to the probabilistic domain. To assess the effect of uncertainty in system optimization and identification, the probabilistic sensitivity analysis of IFM was investigated in this study. A set of stochastic sensitivity analysis formulations for the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite element and stochastic design sensitivity are almost identical.
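The perturbation idea behind this record can be illustrated on a toy response: propagate input variances through first-order sensitivities and compare with a direct Monte Carlo estimate. The bar-stress response and input statistics below are invented for illustration and are not the IFM formulation itself.

```python
import random

def member_force(load, area):
    """Toy structural response: axial stress in a bar."""
    return load / area

def perturbation_std(load, area, sd_load, sd_area, h=1e-6):
    """First-order perturbation estimate of the response's standard deviation:
    input variances propagated through finite-difference sensitivities."""
    d_load = (member_force(load + h, area) - member_force(load - h, area)) / (2 * h)
    d_area = (member_force(load, area + h) - member_force(load, area - h)) / (2 * h)
    return ((d_load * sd_load) ** 2 + (d_area * sd_area) ** 2) ** 0.5

def monte_carlo_std(load, area, sd_load, sd_area, n=50_000, seed=1):
    """Direct Monte Carlo estimate, used to check the perturbation result."""
    rng = random.Random(seed)
    samples = [member_force(rng.gauss(load, sd_load), rng.gauss(area, sd_area))
               for _ in range(n)]
    mean = sum(samples) / n
    return (sum((s - mean) ** 2 for s in samples) / n) ** 0.5

print(round(perturbation_std(100.0, 2.0, 5.0, 0.05), 3))
```

For small input uncertainties the two estimates agree closely, and the perturbation result is essentially free to compute, which is why it suits reliability-based sensitivity studies.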

  7. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Afterwards, based on the statistical results of the simulation of the CPN models, key event paths are identified using a sensitivity analysis approach. This approach focuses on the logical structures of CPN models, which is reliable and can serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are trustworthy.

  8. Integrative Analysis of Metabolic Models – from Structure to Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de [Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Gatersleben (Germany); Schreiber, Falk [Monash University, Melbourne, VIC (Australia); Martin-Luther-University Halle-Wittenberg, Halle (Germany)

    2015-01-26

The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge, and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. The determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied to the integrative analysis of the crop plant potato.

  9. A new species of Tometes Valenciennes 1850 (Characiformes: Serrasalmidae from Tocantins-Araguaia River Basin based on integrative analysis of molecular and morphological data.

    Directory of Open Access Journals (Sweden)

    Marcelo C Andrade

Full Text Available A new large serrasalmid species of Tometes is described from the Tocantins-Araguaia River Basin. Tometes siderocarajensis sp. nov. is currently found in the rapids of the Itacaiúnas River Basin, and formerly inhabited the lower Tocantins River. The new species can be distinguished from all congeners, except T. ancylorhynchus, by the presence of a lateral space between the 1st and 2nd premaxillary teeth, and by the absence of lateral cusps on these two teeth. However, T. siderocarajensis sp. nov. can be differentiated from its syntopic congener T. ancylorhynchus by an entirely black body with mottled red in live specimens, densely pigmented pelvic fins with a high concentration of dark chromatophores, and the presence of 39 to 41 rows of circumpeduncular scales (vs. silvery body coloration with slightly reddish overtones on the middle flank during the breeding period in live specimens, hyaline to slightly pale coloration on the distalmost region of the pelvic fins, and 30 to 36 rows of circumpeduncular scales). Additionally, molecular sequence data show that T. siderocarajensis sp. nov. is reciprocally monophyletic, and diagnosable from all congeners by two autapomorphic molecular characters in the mitochondrial gene COI. The phylogenetic reconstruction further shows that T. siderocarajensis sp. nov. is closely related to T. trilobatus. This is the first molecular study using an integrative taxonomic approach based on morphological and molecular sequence data for all described species of Tometes. These findings increase the number of formally described species of Tometes to seven. A key to the Tometes species is provided.

  10. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
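The heat cascade calculation at the core of pinch analysis can be sketched in a few lines: net heat deficits per temperature interval are cascaded from hot to cold, and the deepest cumulative deficit fixes the minimum hot utility. The interval values below are invented, not the SMR plant's data.

```python
def heat_cascade(interval_deficits):
    """Pinch-analysis heat cascade: cascade surplus heat downward through the
    temperature intervals; the largest cumulative deficit is the minimum hot
    utility, and the heat leaving the bottom is the minimum cold utility."""
    cumulative, worst = 0.0, 0.0
    for deficit in interval_deficits:   # hottest interval first; kW, + = needs heat
        cumulative -= deficit
        worst = min(worst, cumulative)
    q_hot_min = -worst
    q_cold_min = cumulative + q_hot_min
    return q_hot_min, q_cold_min

# Invented interval net deficits (positive = interval needs heat)
print(heat_cascade([-3.0, 4.0, -2.0, 1.0]))
```

The interval where the cascaded heat flow reaches zero is the pinch; in a study like the one above, satisfying the SMR reaction's high-temperature demand while minimizing natural gas use amounts to driving the hot utility toward this thermodynamic minimum.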

  11. Methodological improvements in voxel-based analysis of diffusion tensor images: applications to study the impact of apolipoprotein E on white matter integrity.

    Science.gov (United States)

    Newlander, Shawn M; Chu, Alan; Sinha, Usha S; Lu, Po H; Bartzokis, George

    2014-02-01

To identify regional differences in the apparent diffusion coefficient (ADC) and fractional anisotropy (FA) using customized preprocessing before voxel-based analysis (VBA) in 14 normal subjects with the specific genes that decrease (apolipoprotein [APO] E ε2) and increase (APOE ε4) the risk of Alzheimer's disease. Diffusion tensor images (DTI) acquired at 1.5 Tesla were denoised with a total variation tensor regularization algorithm before affine and nonlinear registration to generate a common reference frame for the image volumes of all subjects. Anisotropic and isotropic smoothing with varying kernel sizes was applied to the aligned data before VBA to determine regional differences between cohorts segregated by allele status. VBA on the denoised tensor data identified regions of reduced FA in APOE ε4 carriers compared with the APOE ε2 healthy older carriers. The most consistent results were obtained using the denoised tensor and anisotropic smoothing before statistical testing. In contrast, isotropic smoothing identified regional differences for small filter sizes alone, emphasizing that this method introduces bias in FA values at larger kernel sizes. Voxel-based DTI analysis can be performed on low signal-to-noise-ratio images to detect subtle regional differences in cohorts using the proposed preprocessing techniques. Copyright © 2013 Wiley Periodicals, Inc.

  12. Results of an Integrative Analysis: A Call for Contextualizing HIV and AIDS Clinical Practice Guidelines to Support Evidence‐Based Practice

    Science.gov (United States)

    Kahwa, Eulalia; Hoogeveen, Katie

    2017-01-01

    ABSTRACT Background Practice guidelines aim to improve the standard of care for people living with HIV/AIDS. Successfully implementing guidelines requires tailoring them to populations served and to social and organizational influences on care. Aims To examine dimensions of context, which nurses and midwives described as having a significant impact on their care of patients living with HIV/AIDS in Kenya, Uganda, South Africa, and Jamaica and to determine whether HIV/AIDS guidelines include adaptations congruent with these dimensions of context. Methods Two sets of data were used. The first came from a qualitative study. In‐depth interviews were conducted with purposively selected nurses, midwives, and nurse managers from 21 districts in four study countries. A coding framework was iteratively developed and themes inductively identified. Context dimensions were derived from these themes. A second data set of published guidelines for HIV/AIDS care was then assembled. Guidelines were identified through Google and PubMed searches. Using a deductive integrative analysis approach, text related to context dimensions was extracted from guidelines and categorized into problem and strategy statements. Results Ninety‐six individuals participated in qualitative interviews. Four discrete dimensions of context were identified: health workforce adequacy, workplace exposure risk, workplace consequences for nurses living with HIV/AIDS, and the intersection of work and family life. Guidelines most often acknowledged health human resource constraints and presented mitigation strategies to offset them, and least often discussed workplace consequences and the intersections of family and work life. Linking Evidence to Action Guidelines should more consistently acknowledge diverse implementation contexts, propose how recommendations can be adapted to these realities, and suggest what role frontline healthcare providers have in realizing the structural changes necessary for healthier

  13. STINGRAY: system for integrated genomic resources and analysis.

    Science.gov (United States)

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system can be faster with Sanger data, since large NGS datasets could potentially slow down the MySQL database. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  14. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  15. Integrated Data Base Program: a status report

    International Nuclear Information System (INIS)

    Notz, K.J.; Klein, J.A.

    1984-06-01

    The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creating a summary data file in user-friendly format for use on a personal computer and enhancing user access to program data; and (3) optimizing and documenting the data handling methodology used by the IDB Program and providing direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures.

  16. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    Parzyck, D.C.

    1980-01-01

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated to evaluate impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies, a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impacts. It is planned that an integrated assessment team will examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environmental control costs/benefits, institutional structures, and facility siting. This examination will assess the methodologies used, determine the general applicability of such studies, and present in logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  17. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during his OLAP analysis by proposing to him the forthcoming analysis step. We present a context-aware preference model that matches decision-makers intuition, and we discuss a preference-based approach for generating personalized recommendations.

  18. The impact of climate change mitigation on water demand for energy and food: An integrated analysis based on the Shared Socioeconomic Pathways

    NARCIS (Netherlands)

    Mouratiadou, Ioanna; Biewald, Anne; Pehl, Michaja; Bonsch, Markus; Baumstark, Lavinia; Klein, David; Popp, Alexander; Luderer, Gunnar; Kriegler, Elmar

    2016-01-01

    Climate change mitigation, in the context of growing population and ever increasing economic activity, will require a transformation of energy and agricultural systems, posing significant challenges to global water resources. We use an integrated modelling framework of the

  19. A multilayered integrated sensor for three-dimensional, micro total analysis systems

    International Nuclear Information System (INIS)

    Xiao, Jing; Song, Fuchuan; Seo, Sang-Woo

    2013-01-01

    This paper presents a layer-by-layer integration approach of different functional devices and demonstrates a heterogeneously integrated optical sensor featuring a micro-ring resonator and a high-speed thin-film InGaAs-based photodetector co-integrated with a microfluidic droplet generation device. A thin optical device structure allows a seamless integration with other polymer-based devices on a silicon platform. The integrated sensor successfully demonstrates its transient measurement capability of two-phase liquid flow in a microfluidic droplet generation device. The proposed approach represents an important step toward fully integrated micro total analysis systems. (paper)

  20. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  1. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. Summaries of conclusions, analytical processes, and analytical results. Analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico in support of the WIPP Technical Assessment Team (TAT) activities to determine, to the extent feasible, the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, described below, and draws conclusions based on those results: (1) Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003; Pacific Northwest National Laboratory; December 2014. (2) Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory; December 2014. (3) Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory; January 2015. This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory; March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  2. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail; the application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated, complex systems with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by their application to the Vacuum Vessel (VV), considering the maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
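The statistical analysis and risk assessment step can be illustrated with a Monte Carlo tolerance stack-up sketch. All tolerance values and the clearance requirement below are hypothetical, not ITER figures, and the model is far simpler than a 3DCS study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy variation study (invented numbers): a clearance between two assembled
# parts equals the nominal gap minus the stack of manufacturing and assembly
# deviations, each drawn from its tolerance distribution.
N = 100_000
nominal_gap = 10.0                    # mm
part_a = rng.normal(0.0, 0.5, N)      # manufacturing tolerance, part A
part_b = rng.normal(0.0, 0.3, N)      # manufacturing tolerance, part B
assembly = rng.uniform(-0.4, 0.4, N)  # assembly positioning error

gap = nominal_gap - (part_a + part_b + assembly)

# Risk assessment: probability the clearance violates a (hypothetical)
# 8.5 mm requirement.
prob = (gap < 8.5).mean()
print("P(gap < 8.5 mm) =", prob)
```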

  3. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail; the application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated, complex systems with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by their application to the Vacuum Vessel (VV), considering the maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  4. PHIDIAS- Pathogen Host Interaction Data Integration and Analysis

    Indian Academy of Sciences (India)

    PHIDIAS- Pathogen Host Interaction Data Integration and Analysis- allows searching of integrated genome sequences, conserved domains and gene expressions data related to pathogen host interactions in high priority agents for public health and security ...

  5. Integrated data base for spent fuel and radwaste: inventories

    International Nuclear Information System (INIS)

    Notz, K.J.; Carter, W.L.; Kibbey, A.H.

    1982-01-01

    The Integrated Data Base (IDB) program provides and maintains current, integrated data on spent reactor fuel and radwaste, including historical data, current inventories, projected inventories, and material characteristics. The IDB program collects, organizes, integrates, and - where necessary - reconciles inventory and projection (I/P) and characteristics information to provide a coherent, self-consistent data base on spent fuel and radwaste

  6. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
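The inverse problem of inferring facility conditions from observations can be sketched, in a deliberately minimal form, as a Bayesian update. The modes, priors and likelihoods below are invented for illustration:

```python
# Toy illustration (hypothetical numbers throughout): infer a facility's
# operating mode from one noisy observation via Bayes' rule.
modes = ["declared", "off-normal"]
prior = {"declared": 0.95, "off-normal": 0.05}

# Likelihood of observing an elevated sensor reading under each mode.
likelihood = {"declared": 0.1, "off-normal": 0.8}

# Posterior after seeing the elevated reading.
evidence = sum(prior[m] * likelihood[m] for m in modes)
posterior = {m: prior[m] * likelihood[m] / evidence for m in modes}
print(posterior)
```

A real assessment would fuse many observation streams (local sensors, remote sensing, open-source business data) against a full facility model, but the update logic is the same.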

  7. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  8. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  9. Mapping Fish Community Variables by Integrating Field and Satellite Data, Object-Based Image Analysis and Modeling in a Traditional Fijian Fisheries Management Area

    Directory of Open Access Journals (Sweden)

    Stacy Jupiter

    2011-03-01

    The use of marine spatial planning for zoning multi-use areas is growing in both developed and developing countries. Comprehensive maps of marine resources, including those important for local fisheries management and biodiversity conservation, provide a crucial foundation of information for the planning process. Using a combination of field and high spatial resolution satellite data, we use an empirical procedure to create a bathymetric map (RMSE 1.76 m) and object-based image analysis to produce accurate maps of geomorphic and benthic coral reef classes (Kappa values of 0.80 and 0.63; 9 and 33 classes, respectively) covering a large (>260 km2) traditional fisheries management area in Fiji. From these maps, we derive per-pixel information on habitat richness, structural complexity, coral cover and the distance from land, and use these variables as input in models to predict fish species richness, diversity and biomass. We show that random forest models outperform five other model types, and that all three fish community variables can be satisfactorily predicted from the high spatial resolution satellite data. We also show geomorphic zone to be the most important predictor on average, with secondary contributions from a range of other variables including benthic class, depth, distance from land, and live coral cover mapped at coarse spatial scales, suggesting that data with lower spatial resolution and lower cost may be sufficient for spatial predictions of the three fish community variables.
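The prediction step can be illustrated with a toy bagged ensemble in the spirit of a random forest. The predictor names and the synthetic relationship below are stand-ins, not the study's data, and a real analysis would use a full random forest implementation rather than depth-one stumps:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-pixel predictors (hypothetical stand-ins for the abstract's
# variables): depth, live coral cover, distance from land -> species richness.
n = 200
X = np.column_stack([
    rng.uniform(0, 30, n),     # depth (m)
    rng.uniform(0, 1, n),      # live coral cover (fraction)
    rng.uniform(0, 5000, n),   # distance from land (m)
])
y = 40 - 0.8 * X[:, 0] + 25 * X[:, 1] + rng.normal(0, 2, n)

def fit_stump(X, y):
    """Best single-feature threshold split minimizing squared error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            pred = np.where(left, y[left].mean(), y[~left].mean())
            err = ((y - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, y[left].mean(), y[~left].mean())
    return best[1:]

def forest_predict(stumps, X):
    preds = [np.where(X[:, j] <= t, lo, hi) for j, t, lo, hi in stumps]
    return np.mean(preds, axis=0)

# Bagging: each stump sees a bootstrap resample, as in a random forest.
stumps = [fit_stump(X[idx], y[idx])
          for idx in (rng.integers(0, n, n) for _ in range(50))]
yhat = forest_predict(stumps, X)
r2 = 1 - ((y - yhat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print("R^2:", round(r2, 2))
```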

  10. Migration in Deltas: An Integrated Analysis

    Science.gov (United States)

    Nicholls, Robert J.; Hutton, Craig W.; Lazar, Attila; Adger, W. Neil; Allan, Andrew; Arto, Inaki; Vincent, Katharine; Rahman, Munsur; Salehin, Mashfiqus; Sugata, Hazra; Ghosh, Tuhin; Codjoe, Sam; Appeaning-Addo, Kwasi

    2017-04-01

    Deltas and low-lying coastal regions have long been perceived as vulnerable to global sea-level rise, with the potential for mass displacement of exposed populations. The assumption of mass displacement of populations in deltas requires a comprehensive reassessment in the light of present and future migration in deltas, including the potential role of adaptation to influence these decisions. At present, deltas are subject to multiple drivers of environmental change and often have high population densities as they are accessible and productive ecosystems. Climate change, catchment management, subsidence and land cover change drive environmental change across all deltas. Populations in deltas are also highly mobile, with significant urbanization trends and the growth of large cities and mega-cities within or adjacent to deltas across Asia and Africa. Such migration is driven primarily by economic opportunity, yet environmental change in general, and climate change in particular, are likely to play an increasing direct and indirect role in future migration trends. The policy challenges centre on the role of migration within regional adaptation strategies to climate change; the protection of vulnerable populations; and the future of urban settlements within deltas. This paper reviews current knowledge on migration and adaptation to environmental change to discern specific issues pertinent to delta regions. It develops a new integrated methodology to assess present and future migration in deltas using the Volta delta in Ghana, Mahanadi delta in India and Ganges-Brahmaputra-Meghna delta across India and Bangladesh. The integrated method focuses on: biophysical changes and spatial distribution of vulnerability; demographic changes and migration decision-making using multiple methods and data; macro-economic trends and scenarios in the deltas; and the policies and governance structures that constrain and enable adaptation. The analysis is facilitated by a range of

  11. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  12. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  13. Utilizing Integrated Prediction Error Filter Analysis (INPEFA) to divide base-level cycle of fan-deltas: A case study of the Triassic Baikouquan Formation in Mabei Slope Area, Mahu Depression, Junggar Basin, China

    Science.gov (United States)

    Yuan, Rui; Zhu, Rui; Qu, Jianhua; Wu, Jun; You, Xincai; Sun, Yuqiu; Zhou, Yuanquan (Nancy)

    2018-05-01

    The Mahu Depression is an important hydrocarbon-bearing foreland sag located at the northwestern margin of the Junggar Basin, China. On the northern slope of the depression, large coarse-grained proximal fan-delta depositional systems developed in the Lower Triassic Baikouquan Formation (T1b). Some lithologic hydrocarbon reservoirs have been found in the conglomerates of the formation in recent years. However, rapid vertical and horizontal lithology variations make it difficult to divide the base-level cycle of the formation using conventional methods. Spectral analysis technologies, such as Integrated Prediction Error Filter Analysis (INPEFA), provide another effective way to overcome this difficulty. In this paper, conventional resistivity logs processed by INPEFA are utilized to study the base-level cycle of the fan-delta depositional systems. A negative trend of the INPEFA curve indicates base-level fall semi-cycles; conversely, a positive trend suggests rise semi-cycles. Base-level cycles of the Baikouquan Formation are divided in single and correlation wells. One long-term base-level rise semi-cycle, including three medium-term base-level cycles, is identified over the whole Baikouquan Formation. The medium-term base-level cycles are characterized as rise semi-cycles mainly in the fan-delta plain, symmetric cycles in the fan-delta front and fall semi-cycles mainly in the pro-fan-delta. The short-term base-level rise semi-cycles are developed mostly in the braided channels, sub-aqueous distributary channels and sheet sands, while the interdistributary bays and pro-fan-delta mud indicate short-term base-level fall semi-cycles. Finally, based on the INPEFA method, a sequence filling model of the Baikouquan Formation is established.
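A minimal sketch of the INPEFA idea, assuming the common formulation (fit a linear prediction filter to the log, then integrate the prediction errors); the filter order, normalization and synthetic curve below are illustrative choices, not the authors' processing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a conventional resistivity log.
depth = np.arange(500)
log = np.sin(depth / 40.0) + 0.1 * rng.normal(size=depth.size)

def inpefa(curve, order=10):
    """Toy integrated prediction error curve: fit a linear (AR-style)
    predictor on lagged samples, then cumulatively sum the prediction
    errors. Sustained positive trends suggest base-level rise semi-cycles;
    negative trends suggest fall semi-cycles."""
    n = curve.size
    # Lagged design matrix: predict curve[t] from the previous `order` samples.
    A = np.column_stack([curve[i:n - order + i] for i in range(order)])
    b = curve[order:]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    err = b - A @ coeffs
    pefa = np.concatenate([np.zeros(order), err])
    out = np.cumsum(pefa)
    return (out - out.min()) / (out.max() - out.min() + 1e-12)  # 0..1 scale

curve = inpefa(log)
```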

  14. Integration of risk analysis, land use planning, and cost analysis

    International Nuclear Information System (INIS)

    Rajen, G.; Sanchez, G.

    1994-01-01

    The Department of Energy (DOE) and the Pueblo of San Ildefonso (Pueblo), which is a sovereign Indian tribe, have often been involved in adversarial situations regarding the Los Alamos National Laboratory (LANL). The Pueblo shares a common boundary with the LANL. This paper describes an on-going project that could alter the DOE and the Pueblo's relationship to one of cooperation; and unite the DOE and the Pueblo in a Pollution Prevention/Waste Minimization, and Integrated Risk Analysis and Land Use Planning effort

  15. Lectures on functional analysis and the Lebesgue integral

    CERN Document Server

    Komornik, Vilmos

    2016-01-01

    This textbook, based on three series of lectures held by the author at the University of Strasbourg, presents functional analysis in a non-traditional way by generalizing elementary theorems of plane geometry to spaces of arbitrary dimension. This approach leads naturally to the basic notions and theorems. Most results are illustrated by the small ℓp spaces. The Lebesgue integral, meanwhile, is treated via the direct approach of Frigyes Riesz, whose constructive definition of measurable functions leads to optimal, clear-cut versions of the classical theorems of Fubini-Tonelli and Radon-Nikodým. Lectures on Functional Analysis and the Lebesgue Integral presents the most important topics for students, with short, elegant proofs. The exposition style follows the Hungarian mathematical tradition of Paul Erdős and others. The order of the first two parts, functional analysis and the Lebesgue integral, may be reversed. In the third and final part they are combined to study various spaces of continuous and integ...

  16. Integration of Financial Markets in Post Global Financial Crises and Implications for British Financial Sector: Analysis Based on A Panel VAR Model

    OpenAIRE

    Nasir, M; Du, M

    2017-01-01

    This study analyses the dynamics of integration among global financial markets in the context of the Global Financial Crisis (2008) by employing a Panel Vector Autoregressive (VAR) model on the monthly data of nine countries and three markets from Jan 2003 to Oct 2015. It was found that there has been a shift in the association among the global financial markets since the Global Financial Crisis (GFC). Moreover, the British financial sector in the post-GFC world clearly showed a change in the association...
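The core of a VAR estimation can be sketched with least squares on a simulated bivariate system. The coefficient matrix and noise level are invented, and a panel VAR additionally pools coefficients across cross-sectional units, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate two linked "market" series (toy stand-ins for monthly returns)
# from a known VAR(1): Y_t = A Y_{t-1} + e_t.
T = 300
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
Y = np.zeros((T, 2))
for t in range(1, T):
    Y[t] = A_true @ Y[t - 1] + rng.normal(0, 0.1, 2)

# Estimate A by least squares: regress Y_t on Y_{t-1}.
X, Z = Y[1:], Y[:-1]
A_hat = np.linalg.lstsq(Z, X, rcond=None)[0].T
print(np.round(A_hat, 2))
```

The off-diagonal entries of the estimated matrix measure the cross-market spillovers whose change around the GFC the study tests for.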

  17. Applying Groebner bases to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.; Smirnov, Vladimir A.

    2006-01-01

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential
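The reduction problem itself, stripped of the shift-operator algebra, can be pictured as rewriting integrals through linear relations until only master integrals remain. The relations below are invented toy identities, not actual IBP relations:

```python
from fractions import Fraction

# Hypothetical toy family: each rule rewrites an integral as a linear
# combination of lower ones; I1 and I2 play the role of master integrals.
rules = {
    "I3": {"I1": Fraction(2), "I2": Fraction(-1)},
    "I4": {"I3": Fraction(1), "I1": Fraction(3)},
}

def reduce_integral(name):
    """Recursively rewrite `name` until only master integrals remain."""
    if name not in rules:
        return {name: Fraction(1)}
    combo = {}
    for sub, coeff in rules[name].items():
        for master, c in reduce_integral(sub).items():
            combo[master] = combo.get(master, Fraction(0)) + coeff * c
    return combo

# I4 = I3 + 3*I1 = (2*I1 - I2) + 3*I1 = 5*I1 - I2
print(reduce_integral("I4"))
```

The Groebner-type basis of the paper plays the role of a systematically constructed, terminating rule set for the full shift-operator algebra, which this toy recursion only hints at.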

  18. Applying Groebner bases to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexander V. [Mechanical and Mathematical Department and Scientific Research Computer Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, Vladimir A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-01-15

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  19. PROSPECTS OF THE REGIONAL INTEGRATION POLICY BASED ON CLUSTER FORMATION

    Directory of Open Access Journals (Sweden)

    Elena Tsepilova

    2018-01-01

    The purpose of this article is to develop the theoretical foundations of regional integration policy and to determine its prospects on the basis of cluster formation. The authors use such research methods as systematization, comparative and complex analysis, synthesis, and the statistical method. Within the framework of the research, the concept of regional integration policy is specified, and its integration core – the cluster – is identified. The authors work out an algorithm of regional clustering, which will ensure the growth of economy and tax income. Measures are proposed to optimize the organizational mechanism of interaction between the participants of a territorial cluster and the authorities, allowing clusters, including taxation clusters, to function effectively. Based on the results of studying the existing methods for assessing the effectiveness of cluster policy, the authors propose their own approach to evaluating the consequences of implementing the regional integration policy, for which a list of quantitative and qualitative indicators is defined. The article systematizes the experience and results of the cluster policy of certain European countries, which made it possible to determine the prospects and synergetic effect of developing clusters as an integration foundation of regional policy in the Russian Federation. The authors analyze the activity of cluster formations using the example of the Rostov region – a leader in creating conditions for cluster policy development in the Southern Federal District – where 11 clusters and cluster initiatives are developing. As a result, the authors propose measures to support the existing clusters and to create new ones.

  20. K West integrated water treatment system subproject safety analysis document

    International Nuclear Information System (INIS)

    SEMMENS, L.S.

    1999-01-01

This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  1. K West integrated water treatment system subproject safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    SEMMENS, L.S.

    1999-02-24

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  2. Control Synthesis for the Flow-Based Microfluidic Large-Scale Integration Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2013-01-01

In this paper we are interested in flow-based microfluidic biochips, which are able to integrate the necessary functions for biochemical analysis on-chip. In these chips, the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units can be built…

  3. Integrating health and environmental impact analysis.

    Science.gov (United States)

    Reis, S; Morris, G; Fleming, L E; Beck, S; Taylor, T; White, M; Depledge, M H; Steinle, S; Sabel, C E; Cowie, H; Hurley, F; Dick, J McP; Smith, R I; Austen, M

    2015-10-01

    Scientific investigations have progressively refined our understanding of the influence of the environment on human health, and the many adverse impacts that human activities exert on the environment, from the local to the planetary level. Nonetheless, throughout the modern public health era, health has been pursued as though our lives and lifestyles are disconnected from ecosystems and their component organisms. The inadequacy of the societal and public health response to obesity, health inequities, and especially global environmental and climate change now calls for an ecological approach which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose a new conceptual model, the ecosystems-enriched Drivers, Pressures, State, Exposure, Effects, Actions or 'eDPSEEA' model, to address this shortcoming. The model recognizes convergence between the concept of ecosystems services which provides a human health and well-being slant to the value of ecosystems while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession. It will require outreach to political and other stakeholders including a currently largely disengaged general public. 
The need for an effective and robust science-policy interface has

  4. Integrated optical 3D digital imaging based on DSP scheme

    Science.gov (United States)

    Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.

    2008-03-01

We present a scheme of integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently, without PC support. The scheme uses a parallel hardware structure built around the DSP and a field programmable gate array (FPGA) to realize 3-D imaging, and adopts phase measurement profilometry. To realize pipeline processing of fringe projection, image acquisition and fringe pattern analysis, we present a multi-threaded application program developed under the DSP/BIOS RTOS (real-time operating system). Since the RTOS provides a preemptive kernel and a powerful configuration tool, we are able to achieve real-time scheduling and synchronization. To accelerate automatic fringe analysis and phase unwrapping, we make use of software optimization techniques. The proposed scheme reaches a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and can implement fast 3-D imaging. Experimental results are presented to show the validity of the proposed scheme.
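
In phase measurement profilometry, the per-pixel fringe analysis reduces to an arctangent computation. A minimal sketch of the standard four-step phase-shifting formula on synthetic single-pixel data (the paper's DSP pipeline applies this per pixel; the intensity values here are invented for illustration):

```python
import math

def wrapped_phase(I1, I2, I3, I4):
    """Four-step phase shifting: intensities captured at phase shifts of
    0, 90, 180, and 270 degrees yield the wrapped phase via atan2."""
    return math.atan2(I4 - I2, I1 - I3)

# Synthetic fringe signal: I_k = A + B*cos(phi + k*pi/2)
A, B, phi = 1.0, 0.5, 0.7
I = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
rec = wrapped_phase(*I)   # recovers phi = 0.7
```

The result is wrapped into (-pi, pi], which is why a separate phase-unwrapping stage, as mentioned in the abstract, is still needed.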

  5. An integrated platform for biomolecule interaction analysis

    Science.gov (United States)

    Jan, Chia-Ming; Tsai, Pei-I.; Chou, Shin-Ting; Lee, Shu-Sheng; Lee, Chih-Kung

    2013-02-01

We developed a new metrology platform which can detect real-time changes in both a phase-interrogation mode and an intensity mode of SPR (surface plasmon resonance). We integrated an SPR module and an ellipsometer into a biosensor chip platform to create a new biomolecular interaction measurement mechanism. We adopted a conductive ITO (indium tin oxide) film on the biosensor chip to expand the dynamic range and improve measurement accuracy. The thickness of the conductive film and suitable voltage constants were found to enhance performance. A circularly polarized ellipsometry configuration was incorporated into the newly developed platform to measure the label-free interactions of recombinant human C-reactive protein (CRP) with an immobilized target biomolecule, monoclonal human CRP antibody, at various concentrations. CRP was chosen because it is a cardiovascular risk biomarker and an acute phase reactant, as well as a specific prognostic indicator for inflammation. We found that the sensitivity of phase-interrogation SPR depends predominantly on the optimization of the sample incidence angle. The effect of the ITO layer's effective index under DC and AC driving, as well as the optimal modulation, was experimentally investigated and discussed. Our experimental results showed that the modulated dynamic range for phase detection was 10^-2 RIU based on the current effect and 10^-4 RIU based on the potential effect, and a sensitivity of 0.55 °/RIU was found by angular interrogation. The newly developed metrology platform was characterized as having a higher sensitivity and a smaller dynamic range compared to a traditional full-field measurement system.

  6. Noise analysis of switched integrator preamplifiers

    International Nuclear Information System (INIS)

    Sun Hongbo; Li Yulan; Zhu Weibin

    2004-01-01

The main noise sources of switched integrator preamplifiers are discussed, and their noise performance is evaluated by combining PSpice simulations with experiments. Practical methods are then provided for reducing preamplifier noise in the two different integrator modes. (authors)

  7. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  8. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care.

    Science.gov (United States)

    Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A

    2013-01-01

    Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.

  9. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on application of the symplectic integrator to numerical fluid analysis. For the purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and the particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example of HPD applications, namely the behavior of incompressible inviscid fluid, is solved. In order to improve accuracy of HPD with respect to space, CIVA, which is a highly accurate interpolation method, is combined, but the combined method is subject to problems in that the invariants of the system are not conserved in a long-time computation. For solving the problems, symplectic time integrators are introduced and the effectiveness is confirmed by numerical analyses. (author)
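
The property exploited above, that a symplectic integrator keeps the invariants of a Hamiltonian system bounded over long runs, can be sketched with the leapfrog (Stoermer-Verlet) scheme for a separable Hamiltonian H = p²/2 + V(q). This is a generic illustration, not the HPD/CIVA combination of the paper:

```python
def leapfrog(q, p, dt, steps, dVdq):
    """Stoermer-Verlet (leapfrog): a symplectic scheme, so the energy error
    oscillates within a bounded band instead of drifting secularly."""
    for _ in range(steps):
        p -= 0.5 * dt * dVdq(q)   # half kick
        q += dt * p               # drift
        p -= 0.5 * dt * dVdq(q)   # half kick
    return q, p

# Harmonic oscillator V(q) = q^2/2: exact energy is 0.5 for q0=1, p0=0.
q, p = leapfrog(1.0, 0.0, 0.05, 2000, lambda q: q)
energy = 0.5 * p * p + 0.5 * q * q   # stays near 0.5 after 100 time units
```

A non-symplectic scheme such as explicit Euler would show monotonic energy growth over the same run, which is exactly the long-time conservation problem the paper addresses.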

  10. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
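
The inverse problem of inferring facility conditions from observations can be illustrated with a discrete Bayesian update. Everything below (the two states, the prior, and the likelihoods) is hypothetical, invented only to show the mechanics, not taken from the paper:

```python
# Hypothetical two-state facility model: "declared" vs "undeclared" operation.
priors = {"declared": 0.9, "undeclared": 0.1}
# Assumed P(elevated signature observed | state):
likelihood = {"declared": 0.2, "undeclared": 0.8}

def bayes_update(priors, likelihood):
    """Posterior over facility states after one observation."""
    post = {s: priors[s] * likelihood[s] for s in priors}
    z = sum(post.values())              # normalizing constant
    return {s: v / z for s, v in post.items()}

posterior = bayes_update(priors, likelihood)
# One elevated-signature observation raises P(undeclared) from 0.10 to ~0.31.
```

In a real framework each sensor or open-source stream would contribute its own likelihood term, and the update would be repeated as observations accumulate.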

  11. Predictors of Traditional Medical Practices in Illness Behavior in Northwestern Ethiopia: An Integrated Model of Behavioral Prediction Based Logistic Regression Analysis

    Directory of Open Access Journals (Sweden)

    Abenezer Yared

    2017-01-01

Full Text Available This study aimed at investigating traditional medical beliefs and practices in illness behavior, as well as predictors of those practices, in Gondar city, northwestern Ethiopia, using the integrated model of behavioral prediction. A cross-sectional quantitative survey was conducted to collect data through interviewer-administered structured questionnaires from 496 individuals selected by probability proportional to size sampling. Unadjusted bivariate and adjusted multivariate logistic regression analyses were performed, and the results indicated that the sociocultural predictors of normative response and attitude, as well as the psychosocial individual difference variables of traditional understanding of illness causation and perceived efficacy, had statistically significant associations with traditional medical practices. Due to the influence of these factors, the majority of the study population (85%) relied on both herbal and spiritual varieties of traditional medicine to respond to their perceived illnesses, supporting the conclusion that the illness behavior of the people mainly involves traditional medical practices. The results imply that two-way medicine needs to be developed with ongoing research, and that health education must take traditional customs into consideration, in order to integrate interventions into the health care system in ways that the general public accepts, yielding better health outcomes.
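
A logistic regression of the kind used in the study relates a predictor to the log-odds of a binary outcome; exponentiating a coefficient gives the odds ratio reported in such analyses. A minimal gradient-descent sketch on invented toy data (one predictor plus intercept; the variable and values are illustrative, not from the survey):

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain batch gradient descent for logistic regression with one feature."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        gw = gb = 0.0
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(w * xi + b)))  # predicted probability
            gw += (p - yi) * xi
            gb += (p - yi)
        w -= lr * gw / len(X)
        b -= lr * gb / len(X)
    return w, b

# Toy data: a hypothetical belief score vs. binary traditional-medicine use.
X = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
y = [0,   0,   0,   1,   1,   1]
w, b = fit_logistic(X, y)
odds_ratio = math.exp(w)   # odds multiplier per one-unit increase in the score
```

A positive coefficient (odds ratio above 1) is the analogue of the "statistically significant association" language in the abstract, though significance itself requires standard errors that this sketch omits.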

  12. Case for integral core-disruptive accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.; Bell, C.R.

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included

  13. CFD Analysis for Advanced Integrated Head Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Won Ho; Kang, Tae Kyo; Cho, Yeon Ho; Kim, Hyun Min [KEPCO Engineering and Construction Co., Daejeon (Korea, Republic of)

    2016-10-15

The Integrated Head Assembly (IHA) is permanently installed on the reactor vessel closure head during normal plant operation and refueling operation. It consists of a number of systems and components, such as the head lifting system, seismic support system, Control Element Drive Mechanism (CEDM) cooling system, cable support system, and cooling shroud assemblies. With the operating experience of the IHA, the need arose for design changes to the current APR1400 IHA to improve the seismic resistance and to allow convenient maintenance. In this paper, the effects of the design changes were rigorously studied for various sizes of the inlet openings to assure proper cooling of the CEDMs, and the system pressure differentials and required flow rate for the CEDM cooling fan were analyzed for various operating conditions in order to determine the capacity of the fan. As part of the design process of the AIHA, the number of air inlets and baffle regions is reduced by simplifying the design of the APR1400 IHA. The design change of the baffle regions has been made such that the maximum possible space is occupied inside the IHA cooling shroud shell while avoiding interference with the CEDMs. Therefore, only the air inlet opening was studied for the design change to supply sufficient cooling air flow for each CEDM. The size and location of the air inlets in the middle cooling shroud assembly were determined by CFD analyses of the AIHA, and case CFD analyses were performed depending on the ambient air temperature and fan operating conditions. The size of the air inlet openings is increased in comparison with the initial AIHA design, and it is confirmed that the cooling air flow rate for each CEDM meets the design requirement of 800 SCFM ± 10% with the increased air inlets.
In the initial analysis, the fan outlet flow rate was assumed to be 48.3 lbm/s, but the results revealed that a lower outlet flow rate at the fan is sufficient to meet the design requirement

  14. Diffusion tensor imaging study of early white matter integrity in HIV-infected patients: A tract-based spatial statistics analysis

    Directory of Open Access Journals (Sweden)

    Ruili Li

    2015-12-01

    Conclusion: Multiple cerebral white matter fiber tracts are damaged in HIV-infected patients without cognitive impairment. Quantitative analysis of DTI using TBSS is valuable in evaluating changes of HIV-associated white matter microstructures.

  15. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
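
Of the three feature selection algorithms named above, recursive feature elimination is the easiest to sketch: fit a linear model, drop the feature with the smallest absolute weight, and repeat until the desired panel size remains. The toy analogue below substitutes least squares for the SVM of SVM-RFE and uses synthetic data, so it illustrates only the elimination loop, not the paper's pipeline:

```python
import numpy as np

def rfe(X, y, keep=1):
    """Toy recursive feature elimination with a least-squares linear model."""
    active = list(range(X.shape[1]))
    while len(active) > keep:
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        active.pop(int(np.argmin(np.abs(w))))  # drop weakest feature
    return active

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = 2.0 * X[:, 1] + 0.1 * rng.normal(size=40)  # only feature 1 is informative
selected = rfe(X, y, keep=1)                   # the informative feature survives
```

In the biomarker setting, the surviving "features" are the candidate proteins carried forward to immunoblotting and targeted MS validation.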

  16. A 40 GHz fully integrated circuit with a vector network analyzer and a coplanar-line-based detection area for circulating tumor cell analysis using 65 nm CMOS technology

    Science.gov (United States)

    Nakanishi, Taiki; Matsunaga, Maya; Kobayashi, Atsuki; Nakazato, Kazuo; Niitsu, Kiichi

    2018-03-01

A 40-GHz fully integrated CMOS-based circuit for circulating tumor cell (CTC) analysis, consisting of an on-chip vector network analyzer (VNA) and a highly sensitive coplanar-line-based detection area, is presented in this paper. In this work, we introduce a fully integrated architecture that eliminates unwanted parasitic effects. The proposed analyzer was designed using 65 nm CMOS technology, and SPICE and MWS simulations were used to validate its operation. The simulations confirmed that the proposed circuit can measure the S-parameter shifts resulting from the addition of various types of tumor cells to the detection area, the data for which are provided in a previous study: the |S21| values for HepG2, A549, and HEC-1-A cells are -0.683, -0.580, and -0.623 dB, respectively. Additionally, measurement demonstrated a 25.7% reduction in the S-parameters when a silicone resin was put on the circuit. Hence, the proposed system is expected to contribute to cancer diagnosis.

  17. Measure and integral an introduction to real analysis

    CERN Document Server

    Wheeden, Richard L

    2015-01-01

Now considered a classic text on the topic, Measure and Integral: An Introduction to Real Analysis provides an introduction to real analysis by first developing the theory of measure and integration in the simple setting of Euclidean space, and then presenting a more general treatment based on abstract notions characterized by axioms and with less geometric content. Published nearly forty years after the first edition, this long-awaited Second Edition also: studies the Fourier transform of functions in the spaces L1, L2, and Lp, 1 ≤ p ≤ 2; shows the Hilbert transform to be a bounded operator on L2, as an application of the L2 theory of the Fourier transform in the one-dimensional case; covers fractional integration and some topics related to mean oscillation properties of functions, such as the classes of Hölder continuous functions and the space of functions of bounded mean oscillation; and derives a subrepresentation formula, which in higher dimensions plays a role roughly similar to the one played by the fundamental theor...

  18. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
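
The minimax concave penalty used as the outer penalty behaves like the Lasso near zero but flattens out, so large coefficients are not over-shrunk. A minimal sketch of the standard MCP definition (the concavity parameter γ = 3 below is a common default, not a value from the paper):

```python
def mcp(t, lam, gamma=3.0):
    """Minimax concave penalty: P(t) = lam*|t| - t^2/(2*gamma) for
    |t| <= gamma*lam, and the constant gamma*lam^2/2 beyond that."""
    t = abs(t)
    if t <= gamma * lam:
        return lam * t - t * t / (2.0 * gamma)
    return 0.5 * gamma * lam * lam

# Near zero, MCP tracks the Lasso penalty lam*|t|; past gamma*lam it is flat,
# so large effects incur no additional shrinkage.
```

The coordinate descent algorithms mentioned in the abstract exploit the fact that this penalty admits a closed-form univariate thresholding update.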

  19. Performance Criteria of Spatial Development Projects Based on Interregional Integration

    Directory of Open Access Journals (Sweden)

    Elena Viktorovna Kurushina

    2018-03-01

Full Text Available The search for efficient ways to develop regional socio-economic space is a relevant problem. The authors consider the models of spatial organization according to the Spatial Development Strategy of the Russian Federation until 2030, and conduct a comparative analysis of scenarios for polarized and diversified spatial growth. Many investigations consider the concepts of polarized and endogenous growth. This study proposes a methodology to assess the development of macroregions and to increase the viability of interregional integration projects. To develop this methodology, we formulate scientific principles and indirect criteria of project performance conforming to the theory of regional integration. In addition to territorial community and the complementarity of development potentials, regional integration in the country should be based on the principles of security, networking, limited quantity and awareness of the potential project participants. Integration should ensure synergetic effects and take into account the cultural and historical closeness that manifests itself in the common mentality and existing economic relations among regions. The calculation results for the indirect criteria are obtained using the methods of classification and spatial correlation. This study confirms the hypothesis that the formation of the Western Siberian and Ural macro-regions is appropriate. We have concluded this on the basis of the criteria of economic development, economic integration, the similarity of regional spaces as habitats, and the number of participants for the subjects of the Ural Federal District. The projection of the patterns of international economic integration onto the interregional level allows predicting the highest probability of successful cooperation among the Western Siberian regions with a high level of economic development. The authors' method has revealed a high synchronization between the economies of

  20. SODIM: Service Oriented Data Integration based on MapReduce

    Directory of Open Access Journals (Sweden)

    Ghada ElSheikh

    2013-09-01

Data integration systems can benefit from innovative dynamic infrastructure solutions such as Clouds, with their greater agility, lower cost, device independence, location independence, and scalability. This study consolidates data integration, Service Orientation, and distributed processing to develop a new data integration system called Service Oriented Data Integration based on MapReduce (SODIM), which improves system performance, especially with a large number of data sources, and which can efficiently be hosted on modern dynamic infrastructures such as Clouds.
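
The map/shuffle/reduce pattern underlying SODIM can be sketched in a few lines: map emits (key, value) pairs from each source, the shuffle groups them by key, and reduce merges each group into one integrated record. The sources, keys, and fields below are hypothetical, chosen only to show the data flow:

```python
from collections import defaultdict
from itertools import chain

# Two hypothetical sources exposing records about the same entities.
source_a = [("alice", {"email": "a@x.org"}), ("bob", {"email": "b@x.org"})]
source_b = [("alice", {"dept": "bio"}), ("carol", {"dept": "chem"})]

def map_phase(records):
    """Map: emit (entity key, attribute dict) pairs from one source."""
    for key, value in records:
        yield key, value

def reduce_phase(pairs):
    """Shuffle + reduce: group by key, then merge attributes per entity."""
    grouped = defaultdict(dict)
    for key, value in pairs:
        grouped[key].update(value)
    return dict(grouped)

integrated = reduce_phase(chain(map_phase(source_a), map_phase(source_b)))
# integrated["alice"] now combines attributes from both sources.
```

In a real MapReduce deployment the map and reduce phases run on separate workers, which is what lets the integration scale with the number of sources.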

  1. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  2. Integrated Genome-Based Studies of Shewanella Ecophysiology

    Energy Technology Data Exchange (ETDEWEB)

    Andrei L. Osterman, Ph.D.

    2012-12-17

Integration of bioinformatics and experimental techniques was applied to the mapping and characterization of the key components (pathways, enzymes, transporters, regulators) of the core metabolic machinery in Shewanella oneidensis and related species, with the main focus on metabolic and regulatory pathways involved in the utilization of various carbon and energy sources. Among the main accomplishments, reflected in ten joint publications with other participants of the Shewanella Federation, are: (i) a systems-level reconstruction of carbohydrate utilization pathways in the genus Shewanella (19 species). This analysis yielded the reconstruction of 18 sugar utilization pathways, including 10 novel pathway variants, and the prediction of > 60 novel protein families of enzymes, transporters and regulators involved in these pathways. Selected functional predictions were verified by focused biochemical and genetic experiments, and the observed growth phenotypes were consistent with the bioinformatic predictions, providing strong validation of the technology; and (ii) a global genomic reconstruction of transcriptional regulons in 16 Shewanella genomes. The inferred regulatory network includes 82 transcription factors, 8 riboswitches and 6 translational attenuators. Of those, 45 regulons were inferred directly from genome context analysis, whereas others were propagated from previously characterized regulons in other species. Selected regulatory predictions were experimentally tested. Integration of this analysis with microarray data revealed overall consistency and provided an additional layer of interactions between regulons. All the results were captured in the new database RegPrecise, which is a joint development with the LBNL team. A more detailed analysis of the individual subsystems, pathways and regulons in Shewanella spp. included bioinformatics-based prediction and experimental characterization of: (i) the N-acetylglucosamine catabolic pathway; (ii) the lactate utilization machinery; (iii) novel Nrt

  3. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  4. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Research report, 2016-10-01 to 2016-10-30, submitted to the National Institute of Aerospace (NIA) under award W911NF-16-2-0229. Cedric Justin, Youngjun

  5. Integrating fire management analysis into land management planning

    Science.gov (United States)

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...

  6. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

Full Text Available The purpose of the article is to provide a research tool for an initial assessment of whether a company's integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, together with its empirical verification. In particular, the research addresses whether the reports meet the goal of improving the quality of available information and covering all factors that influence the organization's ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out for reports prepared by selected companies operating in the utilities sector. Answering the questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. As a result of the analysis of the selected integrated reports, it was found that the various elements of the report are presented with different levels of accuracy in different reports. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without links to other parts of the report, e.g. risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  7. Accurate fluid force measurement based on control surface integration

    Science.gov (United States)

    Lentink, David

    2018-01-01

    Nonintrusive 3D fluid force measurements are still challenging to conduct accurately for freely moving animals, vehicles, and deforming objects. Two techniques address this: 3D particle image velocimetry (PIV) and a new technique, the aerodynamic force platform (AFP). Both rely on the control volume integral for momentum; whereas PIV requires numerical integration of flow fields, the AFP performs the integration mechanically based on rigid walls that form the control surface. The accuracy of both PIV and AFP measurements based on control surface integration is thought to hinge on determining the unsteady body force associated with the acceleration of the volume of displaced fluid. Here, I introduce a set of non-dimensional error ratios to show which fluid and body parameters make the error negligible. The unsteady body force is insignificant in all conditions where the average density of the body is much greater than the density of the fluid, e.g., in gas. Whenever a strongly deforming body experiences significant buoyancy and acceleration, the error is significant. Remarkably, this error can be entirely corrected for with an exact factor provided that the body has a sufficiently homogeneous density or acceleration distribution, which is common in liquids. The correction factor for omitting the unsteady body force depends only on the fluid density, ρf, and the body density, ρb. Whereas these straightforward solutions work even at the liquid-gas interface in a significant number of cases, they do not work for generalized bodies undergoing buoyancy in combination with appreciable body density inhomogeneity, volume change (PIV), or volume rate-of-change (PIV and AFP). In these less common cases, the 3D body shape needs to be measured and resolved in time and space to estimate the unsteady body force. The analysis shows that accounting for the unsteady body force is straightforward to non

  8. Integrated electrochemical gluconic acid biosensor based on self-assembled monolayer-modified gold electrodes. Application to the analysis of gluconic acid in musts and wines.

    Science.gov (United States)

    Campuzano, S; Gamella, M; Serra, B; Reviejo, A J; Pingarrón, J M

    2007-03-21

    An integrated amperometric gluconic acid biosensor constructed using a gold electrode (AuE) modified with a self-assembled monolayer (SAM) of 3-mercaptopropionic acid (MPA), on which gluconate dehydrogenase (GADH, 0.84 U) and the mediator tetrathiafulvalene (TTF, 1.5 μmol) were coimmobilized by covering the electrode surface with a dialysis membrane, is reported. The working conditions selected were Eapp = +0.15 V and 25 ± 1 °C. The useful lifetime of a single TTF-GADH-MPA-AuE was surprisingly long: after 53 days of continuous use, the biosensor exhibited 86% of its original sensitivity. A linear calibration plot was obtained for gluconic acid over the 6.0×10⁻⁷ to 2.0×10⁻⁵ M concentration range, with a limit of detection of 1.9×10⁻⁷ M. The effect of potential interferents (glucose, fructose, galactose, arabinose, and tartaric, citric, malic, ascorbic, gallic, and caffeic acids) on the biosensor response was evaluated. The behavior of the biosensor in a flow-injection system with amperometric detection was tested. The analytical usefulness of the biosensor was evaluated by determining gluconic acid in wine and must samples, and the results were validated by comparison with those provided by a commercial enzyme test kit.

  9. Ontology-based geographic data set integration

    NARCIS (Netherlands)

    Uitermark, H.T.J.A.; Uitermark, Harry T.; Oosterom, Peter J.M.; Mars, Nicolaas; Molenaar, Martien; Molenaar, M.

    1999-01-01

    In order to develop a system to propagate updates we investigate the semantic and spatial relationships between independently produced geographic data sets of the same region (data set integration). The goal of this system is to reduce operator intervention in update operations between corresponding

  10. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  11. Semantic web for integrated network analysis in biomedicine.

    Science.gov (United States)

    Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y

    2009-03-01

    The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy, and herb-drug interactions.

  12. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment that includes both database and functional capabilities. Consideration and analysis of the data types, the required data manipulation capabilities, and the operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as the development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility for different data types and the potential for extensibility. A detailed data model was produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages.

  13. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool for electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behaviour of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for dynamic operational characteristics. This allows the integration of an existing drive system with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  14. Integrated risk analysis of global climate change

    International Nuclear Information System (INIS)

    Shlyakhter, Alexander; Wilson, Richard; Valverde A, L.J. Jr.

    1995-01-01

    This paper discusses several factors that should be considered in integrated risk analyses of global climate change. We begin by describing how the problem of global climate change can be subdivided into largely independent parts that can be linked together in an analytically tractable fashion. Uncertainty plays a central role in integrated risk analyses of global climate change. Accordingly, we consider various aspects of uncertainty as they relate to the climate change problem. We also consider the impacts of these uncertainties on various risk management issues, such as sequential decision strategies, value of information, and problems of interregional and intergenerational equity. (author)

  15. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

    Decisions to invest in oil- and gasfield acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters for ultimate identification of the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and schedules on timing in the framework of various deal structures
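
The approach described above — varying reservoir, cost, and schedule parameters inside one Monte Carlo loop rather than fixing them deterministically — can be sketched in a few lines. All distributions, parameter names, and figures below are hypothetical illustrations, not values from the paper:

```python
import random

def simulate_npv(n_trials=10_000, seed=42):
    """Illustrative Monte Carlo deal valuation: reserves (technical),
    price and costs (commercial) are all drawn per trial (made-up figures)."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_trials):
        reserves = rng.triangular(20, 60, 35)   # MMbbl: low, high, mode
        price = rng.lognormvariate(4.0, 0.2)    # $/bbl, median ~ e^4
        capex = rng.gauss(400, 50)              # $MM capital cost
        opex_per_bbl = rng.uniform(8, 14)       # $/bbl operating cost
        npvs.append(reserves * (price - opex_per_bbl) - capex)
    npvs.sort()
    return {
        "mean": sum(npvs) / n_trials,
        "p10": npvs[int(0.10 * n_trials)],      # downside percentile
        "p90": npvs[int(0.90 * n_trials)],      # upside percentile
    }

result = simulate_npv()
```

The p10/p90 spread is what the abstract calls the economic volatility of the deal concept; a deterministic analysis would collapse it to a single number.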

  16. Integrated watershed analysis: adapting to changing times

    Science.gov (United States)

    Gordon H. Reeves

    2013-01-01

    Resource managers are increasingly required to conduct integrated analyses of aquatic and terrestrial ecosystems before undertaking any activities. There are a number of research studies on the impacts of management actions on these ecosystems, as well as a growing body of knowledge about ecological processes that affect them, particularly aquatic ecosystems, which...

  17. Relay Feedback Analysis for Double Integral Plants

    Directory of Open Access Journals (Sweden)

    Zhen Ye

    2011-01-01

    Full Text Available Double integral plants under relay feedback are studied. Complete results on the uniqueness of solutions, existence, and stability of the limit cycles are established using the point transformation method. Analytical expressions are also given for determining the amplitude and period of a limit cycle from the plant parameters.
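
The limit-cycle quantities in the abstract can be illustrated numerically. The sketch below simulates a double integrator y'' = u under an ideal relay u = -d·sign(y); in this conservative case the oscillation amplitude A is set by the initial condition, and a quarter period is √(2A/d), so the period is T = 4√(2A/d). The parameter values are illustrative assumptions, not taken from the paper:

```python
import math

def simulate_ideal_relay_double_integrator(A=0.5, d=1.0, dt=1e-4, t_end=20.0):
    """Semi-implicit Euler simulation of y'' = -d*sign(y).
    Returns the zero-crossing times of y."""
    y, v = A, 0.0
    crossings, t = [], 0.0
    while t < t_end:
        u = -d if y >= 0 else d          # ideal relay feedback
        v += u * dt
        y_new = y + v * dt
        if y_new == 0 or (y_new > 0) != (y > 0):
            crossings.append(t)          # y changed sign this step
        y = y_new
        t += dt
    return crossings

crossings = simulate_ideal_relay_double_integrator()
# successive same-direction crossings are one full period apart
periods = [b - a for a, b in zip(crossings, crossings[2:])]
mean_period = sum(periods) / len(periods)
predicted = 4 * math.sqrt(2 * 0.5 / 1.0)   # T = 4*sqrt(2A/d) for A=0.5, d=1
```

The point-transformation method in the paper establishes when such a limit cycle is unique and stable; the simulation merely reproduces the amplitude-period relation for one trajectory.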

  18. Integrated analysis of oxide nuclear fuel sintering

    International Nuclear Information System (INIS)

    Baranov, V.; Kuzmin, R.; Tenishev, A.; Timoshin, I.; Khlunov, A.; Ivanov, A.; Petrov, I.

    2011-01-01

    Dilatometric and thermal-gravimetric investigations have been carried out for the sintering process of oxide nuclear fuel in a gaseous Ar–8% H2 atmosphere at temperatures up to 1600 °C. The pressed compacts were fabricated under real production conditions of the OAO MSZ with application of two different technologies, the so-called 'dry' and 'wet' technologies. Effects of grain size growth after heating to different temperatures were observed. In order to investigate the effects produced by the rate of heating on the properties of sintered fuel pellets, the heating rates were varied from 1 to 8 °C per minute. The time of isothermal hold at the maximal temperature (1600 °C) was about 8 hours. Real production conditions were imitated. The results showed that the sintering processes of the fuel pellets produced by the two technologies differ. The samples sintered under different heating rates were studied with scanning electron microscopy to determine the mean grain size. A simulation of the heating profile for industrial furnaces was performed to reduce the beam cycles and estimate the effects of variation of the isothermal hold temperatures. Based on these data, an optimization of the sintering conditions was performed for the production operations of OAO MSZ. (authors)

  19. Integration Processes of Delay Differential Equation Based on Modified Laguerre Functions

    Directory of Open Access Journals (Sweden)

    Yeguo Sun

    2012-01-01

    Full Text Available We propose long-time convergent numerical integration processes for delay differential equations. We first construct an integration process based on modified Laguerre functions and then establish its global convergence in certain weighted Sobolev spaces. The proposed numerical integration processes can also be used for systems of delay differential equations. We also develop a technique for the refinement of modified Laguerre-Radau interpolations. Lastly, numerical results demonstrate the spectral accuracy of the proposed method and coincide well with the analysis.

  20. Life-cycle analysis of product integrated polymer solar cells

    DEFF Research Database (Denmark)

    Espinosa Martinez, Nieves; García-Valverde, Rafael; Krebs, Frederik C

    2011-01-01

    A life cycle analysis (LCA) on a product integrated polymer solar module is carried out in this study. These assessments are well-known to be useful in developmental stages of a product in order to identify the bottlenecks for the up-scaling in its production phase for several aspects spanning from...... economics through design to functionality. An LCA study was performed to quantify the energy use and greenhouse gas (GHG) emissions from electricity use in the manufacture of a light-weight lamp based on a plastic foil, a lithium-polymer battery, a polymer solar cell, printed circuitry, blocking diode......, switch and a white light emitting semiconductor diode. The polymer solar cell employed in this prototype presents a power conversion efficiency in the range of 2 to 3% yielding energy payback times (EPBT) in the range of 1.3–2 years. Based on this it is worthwhile to undertake a life-cycle study...

  1. Integrated omics analysis of specialized metabolism in medicinal plants.

    Science.gov (United States)

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

    Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important to identify or develop reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  2. Technology integrated teaching in Malaysian schools: GIS, a SWOT analysis

    Directory of Open Access Journals (Sweden)

    Habibah Lateh, vasugiammai muniandy

    2011-08-01

    Full Text Available Geographical Information Systems (GIS) have been introduced and widely used in schools in various countries. From 1990 onwards, the implementation of GIS in schools has increased, owing to drastic changes and reforms in the education system. Even though GIS suits the Geography subject well, it is also widely integrated into various subjects such as History, Chemistry, Physics and Science. In Malaysia, GIS is common in fields such as risk management, architecture, town planning and municipal departments, yet it is still unknown in the school education system; even upper secondary students are not familiar with GIS. The Ministry of Education in Malaysia has been continuously reforming education towards the aim of creating a society based on economic fundamentals and knowledge. The Master Plan for Educational Development, with the aim of developing individual potential through well-integrated and balanced education, is already in the field. Recently, Malaysia invested 18% of the annual national budget in upgrading its education system. The computers-in-education program started in 1999, and three hundred and twenty-two schools were chosen to 'break away' from conventional teaching methods towards technology-integrated teaching. Projects such as the New Primary School Curriculum (KBSR), the Integrated Secondary School Curriculum (KBSM), the Smart School Project and the School Access Centre were introduced constantly. Teachers, as the cogwheel of innovation in schools, were given courses with the aim of developing their ICT knowledge and skills. To date, technology integration is not equal across subjects. Geography is one of the 'dry' subjects in schools, with little technology, and is not preferred among students. GIS is foremost the best Geographical Information Technology (GIT) to be applied in the geography subject. In the Malaysian education system, GIS is still exposed just in papers

  3. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  4. Integration of thermodynamic insights and MINLP optimisation for the synthesis, design and analysis of process flowsheets

    DEFF Research Database (Denmark)

    Hostrup, Martin; Gani, Rafiqul; Kravanja, Zdravko

    1999-01-01

    This paper presents an integrated approach to the solution of process synthesis, design and analysis problems. Integration is achieved by combining two different techniques, synthesis based on thermodynamic insights and structural optimization together with a simulation engine and a properties pr...

  5. Cross-Border Trade: An Analysis of Trade and Market Integration ...

    African Journals Online (AJOL)

    An assessment of cross-border trade and market integration reveal that inhabitants of the border areas have become economically, socially and politically integrated in spite of the conflict over the Bakassi Peninsula. Based on empirical analysis, bilateral agreements between Nigeria and Cameroon have made negligible ...

  6. A Numerical Study of Quantization-Based Integrators

    Directory of Open Access Journals (Sweden)

    Barros Fernando

    2014-01-01

    Full Text Available Adaptive step size solvers are nowadays considered fundamental to achieve efficient ODE integration. While, traditionally, ODE solvers have been designed based on discrete time machines, new approaches based on discrete event systems have been proposed. Quantization provides an efficient integration technique based on signal threshold crossing, leading to independent and modular solvers communicating through discrete events. These solvers can benefit from the large body of knowledge on discrete event simulation techniques, like parallelization, to obtain efficient numerical integration. In this paper we introduce new solvers based on quantization and adaptive sampling techniques. Preliminary numerical results comparing these solvers are presented.
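
A minimal sketch of the quantization idea, assuming the simplest first-order scheme (QSS1) applied to the test equation x' = -ax: instead of fixed time steps, the solver advances by events, each fired when the state has drifted one quantum dq away from its last quantized value:

```python
import math

def qss1_decay(x0=1.0, a=1.0, dq=0.01, t_end=5.0):
    """QSS1 sketch for x' = -a*x. Between events the derivative is held
    at -a*q, where q is the quantized state; an event fires when x has
    moved one quantum dq, at which point x is requantized."""
    t, x, q = 0.0, x0, x0
    traj = [(0.0, x0)]
    while t < t_end:
        dx = -a * q                 # derivative is constant between events
        if abs(dx) < 1e-12:
            break                   # state has settled; no more events
        step = dq / abs(dx)         # time until |x - q| reaches dq
        t += step
        x += dx * step              # x moves exactly one quantum
        q = x                       # requantize
        traj.append((t, x))
    return traj

traj = qss1_decay()
# compare the event trajectory against the exact solution exp(-t)
max_err = max(abs(x - math.exp(-t)) for t, x in traj)
```

Note how the event rate adapts automatically: steps are short while x is large and changing fast, and lengthen as the state decays — the threshold-crossing behaviour the abstract describes.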

  7. Remote sensing and GIS-based integrated analysis of land cover change in Duzce plain and its surroundings (north western Turkey).

    Science.gov (United States)

    Ikiel, Cercis; Ustaoglu, Beyza; Dutucu, Ayse Atalay; Kilic, Derya Evrim

    2013-02-01

    The aim of this study is to investigate natural land cover change caused by the permanent effects of human activities in the Duzce plain and its surroundings, and to determine the current status of the land cover. For this purpose, two Landsat TM images, for the years 1987 and 2010, were used in the study. These images were analysed using digital image processing techniques in ERDAS Imagine©10.0 and ArcGIS©10.0 software. The land cover change nomenclature follows the Coordination of Information on the Environment (CORINE) Level 2 classification (1--urban fabric, 2--industrial, commercial and transport units, 3--heterogeneous agricultural areas, 4--forests, and 5--inland wetlands). Furthermore, the image analysis results were confirmed by field research. According to the results, a decrease of 33.5% was recorded in forest areas, from 24,840.7 to 16,529.0 ha, and an increase of 11.2% was recorded in heterogeneous agricultural areas, from 47,702.7 to 53,051.7 ha. Natural vegetation, which covers a large part of the research area, has been changing rapidly because of rapid urbanisation and agricultural activities. As a result, it is concluded that significant changes occurred in the natural land cover between 1987 and 2010 in the Duzce plain and its surroundings.
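
The reported percentage changes follow directly from the hectare figures in the abstract and can be checked with a one-line relative-change computation:

```python
def pct_change(before_ha, after_ha):
    """Relative change between two land-cover areas, in percent."""
    return (after_ha - before_ha) / before_ha * 100

forest = pct_change(24840.7, 16529.0)   # forest areas, 1987 -> 2010
agri = pct_change(47702.7, 53051.7)     # heterogeneous agricultural areas
```

These reproduce the abstract's figures: roughly a 33.5% loss of forest and an 11.2% gain in agricultural area.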

  8. Radiogenomics of hepatocellular carcinoma: multiregion analysis-based identification of prognostic imaging biomarkers by integrating gene data—a preliminary study

    Science.gov (United States)

    Xia, Wei; Chen, Ying; Zhang, Rui; Yan, Zhuangzhi; Zhou, Xiaobo; Zhang, Bo; Gao, Xin

    2018-02-01

    Our objective was to identify prognostic imaging biomarkers for hepatocellular carcinoma in contrast-enhanced computed tomography (CECT) with biological interpretations by associating imaging features and gene modules. We retrospectively analyzed 371 patients who had gene expression profiles. For the 38 patients with CECT imaging data, automatic intra-tumor partitioning was performed, resulting in three spatially distinct subregions. We extracted a total of 37 quantitative imaging features describing intensity, geometry, and texture from each subregion. Imaging features were selected after robustness and redundancy analysis. Gene modules acquired from clustering were chosen for their prognostic significance. By constructing an association map between imaging features and gene modules with Spearman rank correlations, the imaging features that significantly correlated with gene modules were obtained. These features were evaluated with Cox proportional hazards models and Kaplan-Meier estimates to determine their prognostic capabilities for overall survival (OS). Eight imaging features were significantly correlated with prognostic gene modules, and two of them were associated with OS. Among these, the geometry feature volume fraction of the subregion, which was significantly correlated with all prognostic gene modules representing cancer-related interpretation, was predictive of OS (Cox p = 0.022, hazard ratio = 0.24). The texture feature cluster prominence in the subregion, which was correlated with the prognostic gene module representing lipid metabolism and complement activation, also had the ability to predict OS (Cox p = 0.021, hazard ratio = 0.17). Imaging features depicting the volume fraction and textural heterogeneity in subregions have the potential to be predictors of OS with interpretable biological meaning.
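
The feature-module association map rests on Spearman rank correlations. A self-contained implementation with tie-aware average ranks (an illustrative sketch, not the authors' code) looks like this:

```python
def rank(values):
    """1-based ranks; tied values share the average rank, as Spearman requires."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                          # extend the tie group
        avg = (i + j) / 2 + 1               # average 1-based rank of the group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it works on ranks, the statistic captures any monotone association between an imaging feature and a gene-module score, not just linear ones.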

  9. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques were applied, particularly through the Pinch Analysis method, to sugarcane industry. Research was performed upon harvest data from an agroindustrial complex which processes sugarcane plant in excess of 3.5 million metric tons per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest but economically achievable targets for entropy increase. Efficiency on the use of energy was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while raw material supply of the base case is kept; both the cases did not prove worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. Inexpensive, minor modifications considered in case 5 were found unable to produce reasonable outcome gain. Nevertheless, proper changes in cane juice evaporation section (case 6) could allow sugar and ethanol combined production to rise up to 9.1% relative to the base case, without dropping cogenerated power. (author)
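
The targeting step of Pinch Analysis used in the study is commonly implemented as a problem-table heat cascade. The sketch below applies it to a classic four-stream textbook example; the stream data are not from the sugarcane plant:

```python
def pinch_targets(streams, dt_min=10.0):
    """Problem-table heat cascade. Each stream is (supply_T, target_T, CP),
    with CP the heat capacity flow rate. Hot streams are shifted down by
    dt_min/2 and cold streams up; the cascade yields the minimum hot and
    cold utility targets and the (shifted) pinch temperature."""
    shifted = []
    for ts, tt, cp in streams:
        shift = -dt_min / 2 if ts > tt else dt_min / 2   # hot if cooling down
        shifted.append((ts + shift, tt + shift, cp))
    bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp in shifted:
            if min(ts, tt) <= lo and max(ts, tt) >= hi:  # stream spans interval
                net += cp * (hi - lo) * (1 if ts > tt else -1)
        heat += net
        cascade.append(heat)
    q_hot = max(0.0, -min(cascade))          # minimum hot utility target
    q_cold = cascade[-1] + q_hot             # minimum cold utility target
    pinch_shifted = bounds[cascade.index(min(cascade))]
    return q_hot, q_cold, pinch_shifted

# classic four-stream example (illustrative, not plant data)
q_hot, q_cold, pinch = pinch_targets(
    [(250, 40, 0.15), (200, 80, 0.25), (20, 180, 0.2), (140, 230, 0.3)])
```

The cascade's most negative entry fixes the pinch; targets computed this way are exactly the "lowest but economically achievable" utility demands the abstract refers to.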

  10. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables ... (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  11. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed to meet these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultraviolet, Fourier transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization detection (FID) configuration that provides validated screening analysis for total extractable hydrocarbons within ca. 5--10 min, as well as a full qualitative/quantitative analysis in 25--30 min. Data analysis using optional expert system and neural network software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments.

  12. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212

  13. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high-throughput experiments and high-throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become urgent and a growing topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two well-known first-principles computational databases, OQMD and Materials Project, are used as the integration targets, which shows the availability and effectiveness of our method.
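
The semantic-query step can be illustrated with a toy in-memory triple store and a SPARQL-style basic graph pattern matcher. The OQMD/Materials Project identifiers and predicates below are hypothetical, chosen only to mimic cross-database linking:

```python
def query(triples, pattern):
    """Match a list of triple patterns against a set of triples;
    strings starting with '?' are variables (a sketch of a SPARQL
    basic graph pattern, not a real SPARQL engine)."""
    def match_one(triple, pat, binding):
        b = dict(binding)
        for t, p in zip(triple, pat):
            if p.startswith("?"):
                if p in b and b[p] != t:
                    return None          # variable already bound differently
                b[p] = t
            elif p != t:
                return None              # constant term does not match
        return b

    bindings = [{}]
    for pat in pattern:                  # join patterns left to right
        bindings = [b2 for b in bindings for t in triples
                    if (b2 := match_one(t, pat, b)) is not None]
    return bindings

# hypothetical mini-ontology linking entries of two materials databases
triples = [
    ("oqmd:Fe2O3", "rdf:type", "mat:Oxide"),
    ("mp:mp-19770", "rdf:type", "mat:Oxide"),
    ("oqmd:Fe2O3", "mat:sameAs", "mp:mp-19770"),
    ("mp:mp-19770", "mat:bandGap", "2.0"),
]
# which OQMD entries have a counterpart with a recorded band gap?
result = query(triples, [
    ("?o", "mat:sameAs", "?m"),
    ("?m", "mat:bandGap", "?gap"),
])
```

Shared variables across patterns (here `?m`) are what let a query traverse the ontology from one database's entry to another's — the semantic-level join that plain federated schemas cannot express.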

  14. Legacy effects of wildfire on stream thermal regimes and rainbow trout ecology: an integrated analysis of observation and individual-based models

    Science.gov (United States)

    Rosenberger, Amanda E.; Dunham, Jason B.; Neuswanger, Jason R.; Railsback, Steven F.

    2015-01-01

    Management of aquatic resources in fire-prone areas requires understanding of fish species’ responses to wildfire and of the intermediate- and long-term consequences of these disturbances. We examined Rainbow Trout populations in 9 headwater streams 10 y after a major wildfire: 3 with no history of severe wildfire in the watershed (unburned), 3 in severely burned watersheds (burned), and 3 in severely burned watersheds subjected to immediate events that scoured the stream channel and eliminated streamside vegetation (burned and reorganized). Results of a previous study of this system suggested the primary lasting effects of this wildfire history on headwater stream habitat were differences in canopy cover and solar radiation, which led to higher summer stream temperatures. Nevertheless, trout were present throughout streams in burned watersheds. Older age classes were least abundant in streams draining watersheds with a burned and reorganized history, and individuals >1 y old were most abundant in streams draining watersheds with an unburned history. Burned history corresponded with fast growth, low lipid content, and early maturity of Rainbow Trout. We used an individual-based model of Rainbow Trout growth and demographic patterns to determine if temperature interactions with bioenergetics and competition among individuals could lead to observed phenotypic and ecological differences among populations in the absence of other plausible mechanisms. Modeling suggested that moderate warming associated with wildfire and channel disturbance history leads to faster individual growth, which exacerbates competition for limited food, leading to decreases in population densities. The inferred mechanisms from this modeling exercise suggest the transferability of ecological patterns to a variety of temperature-warming scenarios.

  15. Accumulated state assessment of the Yukon River watershed: part II quantitative effects-based analysis integrating Western science and traditional ecological knowledge.

    Science.gov (United States)

    Dubé, Monique G; Wilson, Julie E; Waterhouse, Jon

    2013-07-01

    This article is the second in a 2-part series assessing the accumulated state of the transboundary Yukon River (YR) basin in northern Canada and the United States. The determination of accumulated state based on available long-term (LT) discharge and water quality data is the first step in watershed cumulative effect assessment in the absence of sufficient biological monitoring data. Long-term trends in water quantity and quality were determined and a benchmark against which to measure change was defined for 5 major reaches along the YR for nitrate, total and dissolved organic carbon (TOC and DOC, respectively), total phosphate (TP), orthophosphate, pH, and specific conductivity. Deviations from the reference condition were identified as "hot moments" in time, nested within a reach. Significant increasing LT trends in discharge were found on the Canadian portion of the YR. There were significant LT decreases in nitrate, TOC, and TP at the Headwater reach, and significant increases in nitrate and specific conductivity at the Lower reach. Deviations from reference condition were found in all water quality variables but most notably during the ice-free period of the YR (May-Sept) and in the Lower reach. The greatest magnitudes of outliers were found during the spring freshet. This study also incorporated traditional ecological knowledge (TEK) into its assessment of accumulated state. In the summer of 2007 the YR Inter Tribal Watershed Council organized a team of people to paddle down the length of the YR as part of a "Healing Journey," where both Western Science and TEK paradigms were used. Water quality data were continuously collected and stories were shared between the team and communities along the YR. Healing Journey data were compared to the LT reference conditions and showed the summer of 2007 was abnormal compared to the LT water quality. This study showed the importance of establishing a reference condition by reach and season for key indicators of water

  16. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

    Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and the performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis required suites of separate computer codes, each treating only a narrow subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite.

  17. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  18. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  19. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology, and the integrated valve (IV) is its key control component. • A transient flow experiment induced by the IV is conducted, and the test results are analyzed to determine its working mechanism. • A theoretical model of the IV opening process is established and applied to obtain the behavior of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology, and the integrated valve (IV) is its key control component. The working principle of the IV is analyzed and an IV hydraulic experiment is conducted. A transient flow phenomenon occurs during the valve opening process. A theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the results of a three-dimensional flow field analysis of the IV and a dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze transient flow under high-temperature conditions: the peak pressure head is consistent with that at room temperature, while the pressure fluctuation period is longer. Furthermore, the dependence of the pressure transients on the fluid and loop structure parameters is analyzed. The peak pressure increases with flow rate and decreases as the valve opening time increases. The pressure fluctuation period increases with loop pipe length, and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. These results lay the basis for the vibration reduction analysis of the CRHDS.
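As a rough companion to the transient-flow results above, the peak pressure caused by a sudden velocity change in a liquid-filled pipe can be estimated with the classical Joukowsky relation. All parameter values below are illustrative assumptions, not the CRHDS experiment's actual conditions:

```python
# First-order water-hammer estimate via the Joukowsky relation
# dp = rho * a * dv. All parameter values are illustrative assumptions,
# not the CRHDS experiment's conditions.

def joukowsky_surge(rho, a, dv):
    """Peak pressure rise (Pa) for an instantaneous velocity change dv (m/s)."""
    return rho * a * dv

def wave_period(L, a):
    """Fundamental water-hammer period (s) for a pipe of length L (m): 4L/a."""
    return 4.0 * L / a

rho = 998.0    # water density, kg/m^3 (room temperature, assumed)
a   = 1200.0   # pressure-wave speed in the pipe, m/s (assumed)
dv  = 0.5      # velocity change at the valve, m/s (assumed)
L   = 10.0     # loop pipe length, m (assumed)

dp = joukowsky_surge(rho, a, dv)
print(f"surge ~ {dp/1e3:.0f} kPa, period ~ {wave_period(L, a)*1e3:.1f} ms")
```

The 4L/a scaling is consistent with the abstract's observation that the pressure fluctuation period grows with loop pipe length; at higher temperature a lower wave speed likewise lengthens the period.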

  20. Integrated program of using of Probabilistic Safety Analysis in Spain

    International Nuclear Information System (INIS)

    1998-01-01

    Since 25 June 1986, when the CSN (Nuclear Safety Council) approved the Integrated Programme of Probabilistic Safety Analysis, this programme has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Programme.

  1. Cost-effectiveness analysis of the national implementation of integrated community case management and community-based health planning and services in Ghana for the treatment of malaria, diarrhoea and pneumonia.

    Science.gov (United States)

    Escribano Ferrer, Blanca; Hansen, Kristian Schultz; Gyapong, Margaret; Bruce, Jane; Narh Bana, Solomon A; Narh, Clement T; Allotey, Naa-Korkor; Glover, Roland; Azantilow, Naa-Charity; Bart-Plange, Constance; Sagoe-Moses, Isabella; Webster, Jayne

    2017-07-05

    Ghana has developed two main community-based strategies that aim to increase access to quality treatment for malaria, diarrhoea and suspected pneumonia: integrated community case management (iCCM) and community-based health planning and services (CHPS). The aim of the study was to assess the cost-effectiveness of these strategies under programme conditions. A cost-effectiveness analysis was conducted, with appropriate diagnosis and treatment given as the effectiveness measure. Appropriate diagnosis and treatment data were obtained from a household survey conducted 2 and 8 years after implementation of iCCM in the Volta and Northern Regions of Ghana, respectively. The study population was carers of children under 5 years who had fever, diarrhoea and/or cough in the 2 weeks prior to the interview. Cost data were obtained mainly from the National Malaria Control Programme (NMCP), the Ministry of Health, CHPS compounds and a household survey. Appropriate diagnosis and treatment of malaria, diarrhoea and suspected pneumonia was more cost-effective under iCCM than under CHPS in the Volta Region, even after adjusting for different discount rates, facility costs and iCCM and CHPS utilization, but not when iCCM appropriate treatment was reduced by 50%. Due to the low number of carers visiting a CBA in the Northern Region, it was not possible to conduct a cost-effectiveness analysis in that region. However, the cost analysis showed that iCCM in the Northern Region had a higher cost per malaria, diarrhoea and suspected pneumonia case diagnosed and treated than in the Volta Region and than the CHPS strategy in the Northern Region. Integrated community case management was more cost-effective than CHPS for the treatment of malaria, diarrhoea and suspected pneumonia when utilized by carers of children under 5 years in the Volta Region. A revision of the iCCM strategy in the Northern Region is needed to improve its cost-effectiveness. Long-term financing
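Comparisons like the one above are commonly summarized by an incremental cost-effectiveness ratio (ICER). The sketch below uses invented programme costs and case counts purely for illustration; these are not the study's Ghana estimates:

```python
# Sketch of an incremental cost-effectiveness ratio (ICER) comparison
# between two delivery strategies. All numbers are invented for
# illustration, not the study's actual data.

def icer(cost_a, effect_a, cost_b, effect_b):
    """Extra cost per additional unit of effect of strategy A vs B."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# cost = programme cost; effect = cases appropriately diagnosed and treated
iccm = {"cost": 120_000.0, "effect": 4_000}   # assumed
chps = {"cost": 100_000.0, "effect": 3_000}   # assumed

ratio = icer(iccm["cost"], iccm["effect"], chps["cost"], chps["effect"])
print(f"ICER of iCCM vs CHPS: {ratio:.2f} per additional case treated")
```

Sensitivity checks such as halving the effect of one strategy (as the study does for iCCM) amount to re-evaluating this ratio under perturbed inputs.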

  2. Development of integrated cask body and base plate

    International Nuclear Information System (INIS)

    Sasaki, T.; Koyama, Y.; Yoshida, T.; Wada, T.

    2015-01-01

    The average occupancy of spent-fuel storage at nuclear power plants has reached 70 percent, and demand for metal casks for the storage and transportation of spent fuel is anticipated to rise once operations resume. The main parts of a metal cask are the main body, the neutron shield, and the external cylinder. We have developed manufacturing technology for an Integrated Cask Body and Base Plate, produced by forging the cask body and base plate as a single piece, with the goals of cost reduction, a shorter manufacturing period, and further reliability improvement. Here, we report the manufacturing technology, code compliance, and obtained properties of the Integrated Cask Body and Base Plate. (author)

  3. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  4. Signal integrity analysis on discontinuous microstrip line

    International Nuclear Information System (INIS)

    Qiao, Qingyang; Dai, Yawen; Chen, Zipeng

    2013-01-01

    In high-speed PCB design, microstrip lines are used to control impedance; however, discontinuous microstrip lines can cause signal integrity problems. In this paper, we use transmission line theory to study the characteristics of microstrip lines. The results indicate that discontinuities such as truncations, gaps, and size changes result in problems such as radiation, reflection, delay, and ground bounce. We model the discontinuities as distributed-parameter circuits and analyze the steady-state response, the transient response, and the phase delay. The transient response causes radiation and voltage jumps.
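One of the transmission-line effects named above, reflection at a size change, follows from the standard impedance-step formula. The trace impedances below are assumed values for illustration:

```python
# Reflection at an impedance step on a transmission line, per standard
# transmission-line theory: gamma = (Z2 - Z1) / (Z2 + Z1).
# The impedance values are assumed for illustration.

def reflection_coefficient(z1, z2):
    """Voltage reflection coefficient seen from a line of impedance z1
    looking into a line (or load) of impedance z2."""
    return (z2 - z1) / (z2 + z1)

# A 50-ohm microstrip narrowing into a 75-ohm section (a size change)
gamma = reflection_coefficient(50.0, 75.0)
print(f"gamma = {gamma:.3f}")            # fraction of the wave reflected
print(f"reflected power = {gamma**2:.1%}")
```

A matched transition (z1 == z2) gives gamma = 0, which is why controlled-impedance routing avoids abrupt width changes.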

  5. Sensitivity Analysis of the Integrated Medical Model for ISS Programs

    Science.gov (United States)

    Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.

    2016-01-01

    Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates sensitivity using the partial correlation of the ranks of the generated input values with each generated output value; it is "partial" because adjustments are made for the linear effects of all the other input values when calculating the correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of losses of crew life (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral
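For the special case of two inputs, the PRCC reduces to the closed-form partial correlation of Spearman ranks, which the following self-contained sketch computes on synthetic data (not IMM outputs):

```python
import math
from statistics import mean

# Two-input PRCC sketch: rank-transform the data, then apply the
# closed-form partial correlation r_xy.z. Data are synthetic.

def ranks(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r  # no tie handling in this sketch

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def prcc(x, z, y):
    """Rank correlation of input x with output y, adjusted for input z."""
    rx, rz, ry = ranks(x), ranks(z), ranks(y)
    rxy, rxz, rzy = pearson(rx, ry), pearson(rx, rz), pearson(rz, ry)
    return (rxy - rxz * rzy) / math.sqrt((1 - rxz ** 2) * (1 - rzy ** 2))

# Synthetic nonlinear model: output driven strongly by x1, less by x2
x1 = [0.1, 0.4, 0.2, 0.9, 0.7, 0.5, 0.3, 0.8, 0.6, 0.05]
x2 = [0.5, 0.1, 0.9, 0.3, 0.8, 0.2, 0.7, 0.4, 0.6, 0.95]
y  = [math.exp(a) + 0.5 * b for a, b in zip(x1, x2)]

print(f"PRCC(x1) = {prcc(x1, x2, y):.2f}, PRCC(x2) = {prcc(x2, x1, y):.2f}")
```

Because ranks replace raw values, the nonlinear (exponential) dependence on x1 still yields a PRCC near 1, which is exactly why PRCC suits nonlinear models like the IMM; a general implementation extends the adjustment to all other inputs via regression on their ranks.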

  6. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  7. Skill-based immigration, economic integration, and economic performance

    OpenAIRE

    Aydemir, Abdurrahman

    2014-01-01

    Studies for major immigrant-receiving countries provide evidence on the comparative economic performance of immigrant classes (skill-, kinship-, and humanitarian-based). Developed countries are increasingly competing for high-skilled immigrants, who perform better in the labor market. However, there are serious challenges to their economic integration, which highlights a need for complementary immigration and integration policies.

  8. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/CORBA technology.

  9. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Full Text Available Introduction: Integrated care could not only fix up fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how 'integrated care' has developed. There is a need for a systematic bibliometric analysis of the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy '(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]' without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis by the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering by SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) all documents were recorded by 1646 kinds of journals, of which 28 were core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains, including the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provide researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.

  10. Nucleic Acid-based Detection of Bacterial Pathogens Using Integrated Microfluidic Platform Systems

    Directory of Open Access Journals (Sweden)

    Carl A. Batt

    2009-05-01

    Full Text Available The advent of nucleic acid-based pathogen detection methods offers increased sensitivity and specificity over traditional microbiological techniques, driving the development of portable, integrated biosensors. The miniaturization and automation of integrated detection systems presents a significant advantage for rapid, portable field-based testing. In this review, we highlight current developments and directions in nucleic acid-based micro total analysis systems for the detection of bacterial pathogens. Recent progress in the miniaturization of microfluidic processing steps for cell capture, DNA extraction and purification, polymerase chain reaction, and product detection are detailed. Discussions include strategies and challenges for implementation of an integrated portable platform.

  11. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi; Huang, Jianhua Z.; Hu, Jianhua

    2015-01-01

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior
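The transformation-then-PCA idea can be illustrated minimally: log-transform right-skewed (log-normal) data before extracting the first principal component. With two variables the 2x2 covariance eigenproblem has a closed form; the data below are synthetic:

```python
import math
import random

# Sketch: log-transform skewed data before PCA. Two variables only, so
# the first principal component's direction has a closed form. Data are
# synthetic log-normal samples, not from any real dataset.

def principal_angle(xs, ys):
    """Angle (radians) of the first principal component of 2-D data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Leading eigenvector direction of [[sxx, sxy], [sxy, syy]]
    return 0.5 * math.atan2(2 * sxy, sxx - syy)

random.seed(0)
# Right-skewed (log-normal) pair with a shared underlying normal factor
base = [random.gauss(0, 1) for _ in range(500)]
xs = [math.exp(b + random.gauss(0, 0.3)) for b in base]
ys = [math.exp(0.8 * b + random.gauss(0, 0.3)) for b in base]

raw_angle = principal_angle(xs, ys)
log_angle = principal_angle([math.log(x) for x in xs],
                            [math.log(y) for y in ys])
print(f"PC1 angle raw: {math.degrees(raw_angle):.1f} deg, "
      f"after log transform: {math.degrees(log_angle):.1f} deg")
```

On the log scale the component direction reflects the underlying linear structure (slope about 0.8), whereas on the raw skewed scale a few large values dominate the covariance, which is the motivation for transforming before, or jointly with, the PCA.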

  12. Integration of Evidence Base into a Probabilistic Risk Assessment

    Science.gov (United States)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, the IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models and Bayesian analyses are used, supplemented on occasion by subject matter experts' input. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1 to 5, with 1 the highest. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for the IMM. The IMM database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  13. A systems biology-based approach to uncovering the molecular mechanisms underlying the effects of dragon's blood tablet in colitis, involving the integration of chemical analysis, ADME prediction, and network pharmacology.

    Directory of Open Access Journals (Sweden)

    Haiyu Xu

    Full Text Available Traditional Chinese medicine (TCM) is one of the oldest East Asian medical systems. The present study adopted a systems biology-based approach to provide new insights relating to the active constituents and molecular mechanisms underlying the effects of dragon's blood (DB) tablets for the treatment of colitis. This study integrated chemical analysis, prediction of absorption, distribution, metabolism, and excretion (ADME), and network pharmacology. Firstly, a rapid, reliable, and accurate ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry method was employed to identify 48 components of DB tablets. In silico prediction of the passive absorption of these compounds, based on Caco-2 cell permeability, and their P450 metabolism enabled the identification of 22 potentially absorbed components and 8 metabolites. Finally, networks were constructed to analyze interactions between these absorbed DB components/metabolites and their putative targets, and between the putative DB targets and known therapeutic targets for colitis. This study provided a great opportunity to deepen the understanding of the complex pharmacological mechanisms underlying the effects of DB in colitis treatment.

  14. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  15. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

    Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  16. PHIDIAS: a pathogen-host interaction data integration and analysis system

    OpenAIRE

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated ...

  17. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  18. Integrating ICT in Agriculture for Knowledge-Based Economy

    African Journals Online (AJOL)

    agriculture –based livelihoods, demands the integration of ICT knowledge with agriculture. .... (CGIAR) shows the vital role of Agricultural development in Rwanda's ... Network, Rwanda National Backbone Project, Regional Communication.

  19. Leisure market segmentation : an integrated preferences/constraints-based approach

    NARCIS (Netherlands)

    Stemerding, M.P.; Oppewal, H.; Beckers, T.A.M.; Timmermans, H.J.P.

    1996-01-01

    Traditional segmentation schemes are often based on a grouping of consumers with similar preference functions. The research steps, ultimately leading to such segmentation schemes, are typically independent. In the present article, a new integrated approach to segmentation is introduced, which

  20. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering.  Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. In this way, computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  1. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI, University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  2. Medical Device Integration Model Based on the Internet of Things

    Science.gov (United States)

    Hao, Aiyu; Wang, Ling

    2015-01-01

    At present, hospitals in our country have basically established the HIS system, which manages patients' registration, treatment, and charges, among many other functions. During treatment, patients need to use medical devices repeatedly to acquire all sorts of inspection data. Currently, the output data of the medical devices are often manually input into the information system, which is error-prone and can cause mismatches between inspection reports and patients. For some small hospitals whose information construction is still relatively weak, the information generated by the devices is still presented in the form of paper reports. When doctors or patients want to have access to the data at a given time again, they can only look at the paper files. Data integration between medical devices has long been a difficult problem for the medical information system, because the data from medical devices lack mandatory unified global standards and the devices are markedly heterogeneous. In order to protect their own interests, manufacturers use special protocols, etc., thus causing medical devices to still be the "lonely island" of the hospital information system. Besides, unfocused application of the data will lead to failure to achieve a reasonable distribution of medical resources. With the deepening of IT construction in hospitals, medical information systems will be bound to develop towards mobile applications, intelligent analysis, and interconnection and interworking, on the premise that there is an effective medical device integration (MDI) technology. To this end, this paper presents a MDI model based on the Internet of Things (IoT). Through abstract classification, this model is able to extract the common characteristics of the devices, resolve the heterogeneous differences between them, and employ a unified protocol to integrate data between devices. And by the IoT technology, it realizes interconnection network of devices and conducts associate matching
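The unified-protocol idea described above can be sketched with device-specific adapters that translate heterogeneous payloads into one common record schema. The vendor names, field layouts, and payloads below are hypothetical, not from the paper's model.

```python
# Sketch of medical device integration via adapters: each proprietary
# payload is parsed into the same unified record. All device names and
# formats here are invented for illustration.

def parse_vendor_a(raw):
    # hypothetical format: "HR=72;SPO2=98"
    fields = dict(p.split("=") for p in raw.split(";"))
    return {"device": "monitor-A",
            "heart_rate": int(fields["HR"]),
            "spo2": int(fields["SPO2"])}

def parse_vendor_b(raw):
    # hypothetical comma-separated format: "72,98"
    hr, spo2 = raw.split(",")
    return {"device": "monitor-B", "heart_rate": int(hr), "spo2": int(spo2)}

ADAPTERS = {"A": parse_vendor_a, "B": parse_vendor_b}

def integrate(vendor, raw):
    """Unified entry point: every device yields the same record schema."""
    return ADAPTERS[vendor](raw)

rec_a = integrate("A", "HR=72;SPO2=98")
rec_b = integrate("B", "72,98")
```

The "abstract classification" step in the paper corresponds to deciding, per device class, which common fields each adapter must emit.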

  3. Integrated genome based studies of Shewanella ecophysiology

    Energy Technology Data Exchange (ETDEWEB)

    Saffarini, Daad A

    2013-03-07

    Progress is reported in these areas: Regulation of anaerobic respiration by cAMP receptor protein and role of adenylate cyclases; Identification of an octaheme c cytochrome as the terminal sulfite reductase in S. oneidensis MR-1; Identification and analysis of components of the electron transport chains that lead to reduction of thiosulfate, tetrathionate, and elemental sulfur in MR-1; Involvement of pili and flagella in metal reduction by S. oneidensis MR-1; and work suggesting that HemN1 is the major enzyme that is involved in heme biosynthesis under anaerobic conditions.

  4. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

    Full Text Available Combining data from diverse, heterogeneous sources while facilitating unified access to it is an important (albeit difficult) task. There are various possibilities of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational, and document databases. Using this multi-agent-based approach in the aspects of the general architecture (the organization and management of the framework), we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  5. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost benefit analysis of plant wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility to develop a focused long term implementation strategy that will yield significant near term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper is a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied. Their uses of the cost-benefit analysis are also described
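The quantitative core of such a cost-savings procedure can be sketched as a net-present-value comparison of the up-front integration cost against discounted annual savings. All figures and the discount rate below are hypothetical, not values from the paper.

```python
# Minimal cost-benefit sketch for plant-wide data integration:
# discount projected annual savings and net them against the initial
# implementation outlay. Numbers are illustrative only.

def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at year 0 (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

implementation_cost = 2_000_000      # year-0 outlay (assumed)
annual_savings = 600_000             # labor + error-reduction savings (assumed)
years = 10
flows = [-implementation_cost] + [annual_savings] * years
value = npv(0.08, flows)             # 8% discount rate (assumed)
```

A positive `value` is the "verifiable result" such a methodology would report to justify a long-term implementation strategy.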

  6. Applied Research on Big-Data-Based Analysis of Chinese Basic Education Integrating Social Values —Taking Chinese Education from 3rd to 6th Grade as Example

    Directory of Open Access Journals (Sweden)

    Shuqin Zhao

    2015-06-01

    Full Text Available Primary education is the golden stage of personal growth, the foundation of comprehensive moral, intellectual, physical, aesthetic and labor development. Therefore, grasping the daily education of primary students is the focus of primary education. The application of big-data analysis can capture both the micro-level and the overall performance of students. These data may have little significance for individuals, but aggregated across all students they can solve many problems in the teaching process. Accordingly, teachers can gauge the real learning level of most students in a school, carry out personalized education more accurately, and facilitate efficient study. Using SAS statistical models, this paper mined and analyzed grade 3-6 testing data on Chinese Language courses from the VOSMaP Database of the EduCube project. In this study, the results of a two-way ANOVA model provide findings that can effectively assist educators in solving children's learning problems. In China, the Chinese Language courses of basic education focus on cultivating love of the mother tongue and cultural and social values; they emphasize the development of intellect, morality and a sound personality, as well as the balanced development of moral, intellectual, physical, aesthetic and labor education. Based on the big-data analysis, this paper concludes that Chinese Language education fused with social values can facilitate the effectiveness of this kind of integrated education in China, and can offer educators in this field a new way of thinking in the big data era.
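The two-way ANOVA at the core of the study can be sketched from its sums-of-squares decomposition for a balanced design. This is a plain-Python illustration, not the paper's SAS workflow, and the toy scores below are invented.

```python
# Two-way ANOVA sums of squares for a balanced design.
# data[i][j] holds replicate scores for factor-A level i, factor-B level j.

def two_way_anova_ss(data):
    a, b = len(data), len(data[0])
    n = len(data[0][0])
    allv = [y for row in data for cell in row for y in cell]
    g = sum(allv) / len(allv)                                  # grand mean
    mean_a = [sum(y for cell in row for y in cell) / (b * n) for row in data]
    mean_b = [sum(y for row in data for y in row[j]) / (a * n)
              for j in range(b)]
    cellm = [[sum(c) / n for c in row] for row in data]
    ss_a = b * n * sum((m - g) ** 2 for m in mean_a)           # factor A
    ss_b = a * n * sum((m - g) ** 2 for m in mean_b)           # factor B
    ss_ab = n * sum((cellm[i][j] - mean_a[i] - mean_b[j] + g) ** 2
                    for i in range(a) for j in range(b))       # interaction
    ss_e = sum((y - cellm[i][j]) ** 2
               for i in range(a) for j in range(b) for y in data[i][j])
    ss_t = sum((y - g) ** 2 for y in allv)
    return ss_a, ss_b, ss_ab, ss_e, ss_t

# 2 grade bands x 2 teaching conditions, 3 pupils per cell (toy data)
scores = [[[70, 72, 74], [80, 82, 84]],
          [[60, 62, 64], [75, 77, 79]]]
ss_a, ss_b, ss_ab, ss_e, ss_t = two_way_anova_ss(scores)
```

For a balanced design the decomposition is exact: the four component sums add up to the total sum of squares, which is a useful self-check before computing F statistics.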

  7. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  8. Variation in the Interpretation of Scientific Integrity in Community-based Participatory Health Research

    Science.gov (United States)

    Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.

    2013-01-01

    Community-based participatory research (CBPR) has become essential in health disparities and environmental justice research; however, the scientific integrity of CBPR projects has become a concern. Some concerns, such as lack of appropriate research training and lack of access to resources and finances, have been discussed as possibly limiting the scientific integrity of a project. Prior to understanding what threatens scientific integrity in CBPR, it is vital to understand what scientific integrity means for the professional and community investigators who are involved in CBPR. This analysis explores the interpretation of scientific integrity in CBPR among 74 professional and community research team members from 25 CBPR projects in nine states in the southeastern United States in 2012. It describes the basic definition for scientific integrity and then explores variations in the interpretation of scientific integrity in CBPR. Variations in the interpretations were associated with team member identity as professional or community investigators. Professional investigators understood scientific integrity in CBPR as either conceptually or logistically flexible, as challenging to balance with community needs, or no different than traditional scientific integrity. Community investigators interpreted other factors as important in scientific integrity, such as trust, accountability, and overall benefit to the community. This research demonstrates that the variations in the interpretation of scientific integrity in CBPR call for a new definition of scientific integrity in CBPR that takes into account the understanding and needs of all investigators. PMID:24161098

  9. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    Science.gov (United States)

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

    Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components including: independence, sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected into the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation Understanding of perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis to ensure rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself to

  10. Content-Based Personalization Services Integrating Folksonomies

    Science.gov (United States)

    Musto, Cataldo; Narducci, Fedelucio; Lops, Pasquale; de Gemmis, Marco; Semeraro, Giovanni

    Basic content-based personalization consists in matching up the attributes of a user profile, in which preferences and interests are stored, with the attributes of a content object. The Web 2.0 (r)evolution has changed the game for personalization, from ‘elitary’ Web 1.0, written by few and read by many, to web content generated by everyone (user-generated content - UGC), since the role of people has evolved from passive consumers of information to that of active contributors.
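The attribute-matching step described above is often realized as a cosine similarity between a user-profile vector and each content item's attribute vector. The vocabulary and weights below are illustrative only, not from this chapter.

```python
# Basic content-based matching: rank content items by cosine similarity
# between their attribute vectors and the user profile. Sparse vectors
# are represented as dicts; all terms and weights are invented.

import math

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

profile = {"music": 0.9, "travel": 0.4}        # learned user interests
items = {"jazz-review": {"music": 1.0},
         "city-guide": {"travel": 1.0},
         "recipe": {"cooking": 1.0}}

ranked = sorted(items, key=lambda i: cosine(profile, items[i]), reverse=True)
```

Folksonomy integration, in this picture, amounts to extending the item vectors with user-generated tags so that UGC contributes to the same similarity computation.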

  11. Integrated systems analysis of the PIUS reactor

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, F.; Kroeger, P.; Higgins, J. [Brookhaven National Lab., Upton, NY (United States)] [and others]

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  12. Integrated systems analysis of the PIUS reactor

    International Nuclear Information System (INIS)

    Fullwood, F.; Kroeger, P.; Higgins, J.

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions

  13. Integrative data analysis of male reproductive disorders

    DEFF Research Database (Denmark)

    Edsgard, Stefan Daniel

    of such data in conjunction with data from publicly available repositories. This thesis presents an introduction to disease genetics and molecular systems biology, followed by four studies that each provide detailed clues to the etiology of male reproductive disorders. Finally, a fifth study illustrates......-wide association data with respect to copy number variation and show that the aggregated effect of rare variants can influence the risk for testicular cancer. Paper V provides an example of the application of RNA-Seq for expression analysis of a species with an unsequenced genome. We analysed the plant...... of this thesis is the identification of the molecular basis of male reproductive disorders, with a special focus on testicular cancer. To this end, clinical samples were characterized by microarray-based transcription and genomic variation assays and molecular entities were identified by computational analysis

  14. HTGR-INTEGRATED COAL TO LIQUIDS PRODUCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Anastasia M Gandrik; Rick A Wood

    2010-10-01

    As part of the DOE’s Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to “shift” the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: • 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal

  15. HTGR-Integrated Coal To Liquids Production Analysis

    International Nuclear Information System (INIS)

    Gandrik, Anastasia M.; Wood, Rick A.

    2010-01-01

    As part of the DOE's Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to 'shift' the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: (1) 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. 
When compared to conventional CTL production, nuclear integration decreases coal consumption by 66
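The headline figure (11 reactors of 600 MWth for a 50,000 bbl/day plant) implies a simple scaling rule if, as an assumption not made by the paper, the nuclear heat and hydrogen demand scales linearly with plant capacity:

```python
# Back-of-the-envelope scaling from the reported result: 11 HTGRs of
# 600 MWth support 50,000 bbl/day. Linear scaling is an assumption for
# illustration, not a claim from the Aspen Plus study.

import math

MWTH_PER_REACTOR = 600
REACTORS_AT_50K = 11

def reactors_needed(bbl_per_day):
    thermal_demand = REACTORS_AT_50K * MWTH_PER_REACTOR * bbl_per_day / 50_000
    return math.ceil(thermal_demand / MWTH_PER_REACTOR)

baseline = reactors_needed(50_000)   # reproduces the paper's figure
```

In practice the relationship is unlikely to be exactly linear (process integration and HTSE sizing introduce step changes), which is why the paper relies on full flowsheet models rather than a ratio like this.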

  16. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    Science.gov (United States)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
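The kriging component mentioned above can be illustrated with a toy one-dimensional Gaussian-process interpolation of travel-time residuals. This is a generic squared-exponential-kernel sketch; SIG-VISA's actual kernels, parameterization, and data are more elaborate, and the residual values below are invented.

```python
# Toy Gaussian-process (kriging) interpolation of travel-time residuals
# over source-station distance. Pure Python; illustrative only.

import math

def kern(x1, x2, ell=1.0):
    """Squared-exponential covariance with length scale ell."""
    return math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, xq, noise=1e-6):
    """Posterior mean at xq given noisy observations (xs, ys)."""
    K = [[kern(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(kern(xq, xi) * ai for xi, ai in zip(xs, alpha))

# residuals (s) observed at a few source-station distances (degrees)
xs, ys = [0.0, 1.0, 2.0], [0.3, 0.1, -0.2]
pred = gp_predict(xs, ys, 1.0)
```

With near-zero observation noise the posterior mean reproduces the training data, and between stations it smoothly interpolates, which is exactly the "interpolation of historical data" role the abstract describes.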

  17. Integration of poly-3-(hydroxybutyrate-co-hydroxyvalerate) production by Haloferax mediterranei through utilization of stillage from rice-based ethanol manufacture in India and its techno-economic analysis.

    Science.gov (United States)

    Bhattacharyya, Anirban; Jana, Kuntal; Haldar, Saubhik; Bhowmic, Asit; Mukhopadhyay, Ujjal Kumar; De, Sudipta; Mukherjee, Joydeep

    2015-05-01

    Haloferax mediterranei has potential for economical industrial-scale production of polyhydroxyalkanoate (PHA) as it can utilize cheap carbon sources, has capacity for nonsterile cultivation and allows simple product recovery. Molasses-based Indian distilleries are converting themselves to cereal-based distilleries. Waste stillage (14 l) of rice-based ethanol industry was used for the production of PHA by H. mediterranei in the simple plug-flow reactor configuration of the activated sludge process. Cells utilized stillage and accumulated 63 ± 3 % PHA of dry cell weight and produced 13.12 ± 0.05 g PHA/l. The product yield coefficient was 0.27 while 0.14 g/l h volumetric productivity was reached. Simultaneous lowering of 5-day biochemical oxygen demand and chemical oxygen demand values of stillage by 82 % was attained. The biopolymer was characterized as poly-3-(hydroxybutyrate-co-17.9 mol%-hydroxyvalerate) (PHBV). Directional properties of decanoic acid jointly with temperature-dependent water solubility in decanoic acid were employed for two-step desalination of the spent stillage medium in a cylindrical baffled-tank with an immersed heater and a stirrer holding axial and radial impellers. 99.3 % of the medium salts were recovered and re-used for PHA production. The cost of PHBV was estimated as US$2.05/kg when the annual production was simulated as 1890 tons. Desalination contributed maximally to the overall cost. Technology and cost-analysis demonstrate that PHA production integrated with ethanol manufacture is feasible in India. This study could be the basis for construction of a pilot plant.
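The reported performance figures are tied together by standard bioprocess definitions, sketched below. The cultivation time is back-calculated from the reported titer and volumetric productivity; it is an inference for illustration, not a value stated in the abstract.

```python
# Standard bioprocess metrics relating the abstract's reported figures.

def yield_coefficient(product_g_per_l, substrate_consumed_g_per_l):
    """Y_P/S: grams of product per gram of substrate consumed."""
    return product_g_per_l / substrate_consumed_g_per_l

def volumetric_productivity(product_g_per_l, hours):
    """q_P: grams of PHA per litre per hour."""
    return product_g_per_l / hours

titer = 13.12                 # g PHA / l, as reported
hours = titer / 0.14          # implied run time of roughly 94 h (inferred)
qp = volumetric_productivity(titer, hours)
```

With these definitions, the reported yield coefficient of 0.27 implies on the order of 49 g/l of substrate consumed, again an inference rather than a reported number.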

  18. Integral finite element analysis of turntable bearing with flexible rings

    Science.gov (United States)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper suggests a method to calculate the internal load distribution and contact stress of the thrust angular contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and the plastic deformation of the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. Based on this, the nonlinear contact between the rolling elements and the inner and outer ring raceways is represented as a nonlinear compression spring, and an integral finite element model of the bearing, including its support structure, was established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the load distribution, inner and outer ring stress, contact stress and other finite element analysis results with traditional bearing theory, providing guidance for improving the design of slewing bearings.
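The nonlinear-spring idealization of ball-raceway contact can be illustrated with the classical internal load distribution problem: each ball acts as a Hertzian spring Q = K·δ^1.5, and the ring deflection is found from radial equilibrium. The stiffness, ball count, and load below are assumed values, not the paper's FEA model.

```python
# Classical radial load distribution over balls modeled as Hertzian
# springs (Q = K * delta**1.5). Values are illustrative assumptions.

import math

K = 4.0e5        # Hertzian contact stiffness, N/mm^1.5 (assumed)
N_BALLS = 12
CLEARANCE = 0.0  # zero radial clearance for simplicity

def ball_loads(delta_r):
    """Contact load on each ball for inner-ring radial deflection delta_r."""
    loads = []
    for i in range(N_BALLS):
        psi = 2 * math.pi * i / N_BALLS
        d = delta_r * math.cos(psi) - CLEARANCE
        loads.append(K * d ** 1.5 if d > 0 else 0.0)
    return loads

def radial_residual(delta_r, fr):
    """Sum of radial ball-load components minus the applied load."""
    loads = ball_loads(delta_r)
    return sum(q * math.cos(2 * math.pi * i / N_BALLS)
               for i, q in enumerate(loads)) - fr

def solve_deflection(fr, lo=0.0, hi=1.0, iters=100):
    """Bisection on the inner-ring radial deflection (mm)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if radial_residual(mid, fr) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

fr = 5000.0                       # applied radial load, N (assumed)
delta = solve_deflection(fr)
loads = ball_loads(delta)
```

The solution reproduces the familiar result that only part of the ball complement is loaded and the most heavily loaded ball carries several times the even-share load, which is the behavior the paper's FEA refines by adding ring flexibility and plasticity.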

  19. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  20. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  1. An analysis of 3D particle path integration algorithms

    International Nuclear Information System (INIS)

    Darmofal, D.L.; Haimes, R.

    1996-01-01

    Several techniques for the numerical integration of particle paths in steady and unsteady vector (velocity) fields are analyzed. Most of the analysis applies to unsteady vector fields; however, some results apply to steady vector field integration. Multistep, multistage, and some hybrid schemes are considered. It is shown that due to initialization errors, many unsteady particle path integration schemes are limited to third-order accuracy in time. Multistage schemes require at least three times more internal data storage than multistep schemes of equal order. However, for timesteps within the stability bounds, multistage schemes are generally more accurate. A linearized analysis shows that the stability of these integration algorithms is determined by the eigenvalues of the local velocity tensor. Thus, the accuracy and stability of the methods are interpreted with concepts typically used in critical point theory. This paper shows how integration schemes can lead to erroneous classification of critical points when the timestep is finite and fixed. For steady velocity fields, we demonstrate that timesteps outside of the relative stability region can lead to similar integration errors. From this analysis, guidelines for accurate timestep sizing are suggested for both steady and unsteady flows. In particular, using simulation data for the unsteady flow around a tapered cylinder, we show that accurate particle path integration requires timesteps which are at most on the order of the physical timescale of the flow
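
    The single-stage versus multistage trade-off described above can be sketched on a toy steady velocity field; the circular field, step size, and radial-drift error measure below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def velocity(x, t):
    # Hypothetical steady swirling field (illustrative only).
    # Exact particle paths are circles about the origin.
    return np.array([-x[1], x[0]])

def euler_step(x, t, dt):
    # Single-stage, first-order scheme: minimal storage, low accuracy.
    return x + dt * velocity(x, t)

def rk4_step(x, t, dt):
    # Four-stage Runge-Kutta: more internal storage per step,
    # but far more accurate for timesteps within the stability bound.
    k1 = velocity(x, t)
    k2 = velocity(x + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = velocity(x + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = velocity(x + dt * k3, t + dt)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def integrate_path(step, x0, dt, n_steps):
    x, t = np.asarray(x0, dtype=float), 0.0
    for _ in range(n_steps):
        x = step(x, t, dt)
        t += dt
    return x

# Start on the unit circle; any radial drift is pure integration error.
x_euler = integrate_path(euler_step, [1.0, 0.0], 0.05, 200)
x_rk4 = integrate_path(rk4_step, [1.0, 0.0], 0.05, 200)
drift_euler = abs(np.linalg.norm(x_euler) - 1.0)
drift_rk4 = abs(np.linalg.norm(x_rk4) - 1.0)
print(drift_euler, drift_rk4)  # the multistage scheme drifts far less
```

    The same experiment with a larger timestep illustrates the paper's point that steps outside the stability region corrupt the computed path entirely.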

  2. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors

  3. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

    In the last decade rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein...... expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been...

  4. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
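
    A variance-based (Sobol) index is one common way to carry out the kind of global sensitivity analysis described above; the pick-and-freeze estimator and the toy three-parameter model below are illustrative assumptions, not DRAINMOD-FOREST itself.

```python
import numpy as np

def first_order_sobol(model, n, d, seed=0):
    # Saltelli-style pick-and-freeze estimate of first-order Sobol
    # indices, assuming i.i.d. uniform inputs on [0, 1].
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = model(A), model(B)
    var = fA.var()
    S = np.empty(d)
    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]          # resample only the i-th input
        S[i] = np.mean(fB * (model(AB) - fA)) / var
    return S

# Hypothetical stand-in model: first input dominates, third is inert.
model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2]
S = first_order_sobol(model, 200000, 3)
print(S)  # roughly [0.94, 0.06, 0.00] for this linear model
```

    Ranking the parameters by these indices identifies which ones control model predictions, the stated purpose of the analysis above.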

  5. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet, growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  6. Integrative analysis of metabolomics and transcriptomics data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Bak, Søren; Jørgensen, Kirsten

    2013-01-01

    The abundance of high-dimensional measurements in the form of gene expression and mass spectroscopy calls for models to elucidate the underlying biological system. For widely studied organisms like yeast, it is possible to incorporate prior knowledge from a variety of databases, an approach used... measurements from the same samples, to identify genes controlling the production of metabolites. Due to the high dimensionality of both LC-MS and DNA microarray data, dimension reduction and variable selection are key elements of the analysis. Our proposed approach starts by identifying the basis functions ("building blocks") that constitute the output from a mass spectrometry experiment. Subsequently, the weights of these basis functions are related to the observations from the corresponding gene expression data in order to identify which genes are associated with specific patterns seen in the metabolite data...

  7. Graphene based integrated tandem supercapacitors fabricated directly on separators

    KAUST Repository

    Chen, Wei

    2015-04-09

    It is of great importance to fabricate integrated supercapacitors with extended operation voltages as high energy density storage devices. In this work, we develop a novel direct electrode deposition on separator (DEDS) process to fabricate graphene based integrated tandem supercapacitors for the first time. The DEDS process generates compact graphene-polyaniline electrodes directly on the separators to form integrated supercapacitors. The integrated graphene-polyaniline tandem supercapacitors demonstrate an ultrahigh volumetric energy density of 52.5 Wh L^(−1) at a power density of 6037 W L^(−1) and an excellent gravimetric energy density of 26.1 Wh kg^(−1) at a power density of 3002 W kg^(−1), with outstanding electrochemical stability over 10000 cycles. This study shows great promise for the future development of integrated energy storage devices.

  8. Broadband image sensor array based on graphene-CMOS integration

    Science.gov (United States)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty to combine semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into the next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  9. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps better understand the characteristics of team-based care in the clinical practice as well as promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  10. Vertically Integrated Seismological Analysis II : Inference

    Science.gov (United States)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′)/π(x)q(x′ | x)). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution π(x) = P(x | y) is the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for
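
    The two-step M-H construction described above can be sketched in a few lines; the one-dimensional Gaussian target below is only a stand-in for the posterior over worlds, and the symmetric random-walk proposal (under which the q-ratio cancels) is an assumption made for illustration.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    # Step 1: propose x' ~ q(x' | x); here a symmetric Gaussian walk,
    # so q(x | x') / q(x' | x) = 1 and alpha = min(1, pi(x')/pi(x)).
    # Step 2: accept with probability alpha, otherwise stay at x.
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)
        log_alpha = log_target(x_prop) - log_target(x)
        if math.log(rng.random() + 1e-300) < log_alpha:
            x = x_prop
        samples.append(x)
    return samples

# Toy target: standard normal, pi(x) proportional to exp(-x^2 / 2).
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # both approach the target's 0 and 1
```

    The birth, death, split, merge, and swap moves of the event model would replace the Gaussian walk here, each with its own q-ratio in the acceptance probability.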

  11. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    Science.gov (United States)

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Networks have become a general tool for studying the complex interactions between different genes, proteins, and other small molecules. Modularity, as a fundamental property of many biological networks, has been widely studied, and many computational methods have been proposed to identify the modules in an individual network. However, in many cases a single network is insufficient for module analysis due to the noise in the data or the tuning of parameters when building the biological network. The availability of a large amount of biological networks makes network integration study possible. By integrating such networks, more informative modules for some specific disease can be derived from the networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines the module identification in each individual network and the alignment of the modules from different networks. An approximation algorithm based on eigenvector computation is proposed. Our method outperforms the existing methods in simulation studies, especially when the underlying modules in multiple networks are different. We also applied our method to two groups of gene coexpression networks for humans, which include one for three different cancers and one for three tissues from morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.
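
    The eigenvector-based idea can be illustrated on a tiny example; averaging the condition-specific adjacency matrices and splitting on the Fiedler vector below is a simplified stand-in for the paper's joint optimization model, not the authors' actual algorithm.

```python
import numpy as np

def spectral_modules(adjacency_list):
    # Average the condition-specific networks (a crude surrogate for
    # joint module identification and alignment), then split the nodes
    # into two modules using the Fiedler vector of the graph Laplacian.
    A = np.mean(adjacency_list, axis=0)
    L = np.diag(A.sum(axis=1)) - A             # graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)
    fiedler = eigvecs[:, 1]                    # 2nd-smallest eigenvalue
    return fiedler >= 0                        # boolean module labels

# Two noisy networks sharing the same pair of 3-node modules.
A1 = np.array([[0, 1, 1, 0, 0, 0],
               [1, 0, 1, 0, 0, 0],
               [1, 1, 0, 1, 0, 0],
               [0, 0, 1, 0, 1, 1],
               [0, 0, 0, 1, 0, 1],
               [0, 0, 0, 1, 1, 0]], dtype=float)
A2 = A1.copy()
A2[0, 5] = A2[5, 0] = 1.0   # spurious edge seen in one network only
labels = spectral_modules([A1, A2])
print(labels)  # nodes 0-2 and nodes 3-5 land in different modules
```

    Averaging the two networks halves the weight of the spurious edge, which is why integration is more robust to noise than analyzing either network alone.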

  12. From organizational integration to clinical integration: analysis of the path between one level of integration to another using official documents

    Science.gov (United States)

    Mandza, Matey; Gagnon, Dominique; Carrier, Sébastien; Belzile, Louise; Demers, Louis

    2010-01-01

    Purpose Services’ integration comprises organizational, normative, economic, informational and clinical dimensions. Since 2004, the province of Quebec has devoted significant efforts to unify the governance of the main health and social care organizations of its various territories. Notwithstanding the uniformity of the national plan’s prescription, the territorial integration modalities greatly vary across the province. Theory This research is based upon a conceptual model of integration that comprises six components: inter-organizational partnership, case management, standardized assessment, a single entry point, a standardized service planning tool and a shared clinical file. Methods We conducted an embedded case study in six contrasted sites in terms of their level of integration. All documents prescribing the implementation of integration were retrieved and analyzed. Results and conclusions The analyzed documents demonstrate a growing local appropriation of the current integrative reform. Interestingly however, no link seems to exist between the quality of local prescriptions and the level of integration achieved in each site. This finding leads us to hypothesize that the variable quality of the operational accompaniment offered to implement these prescriptions is a variable in play.

  13. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
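
    The heat of each injection is the area of its peak above the instrument baseline; the synthetic Gaussian pulse and flat baseline below are illustrative assumptions (NITPIC's actual shape-based analysis is considerably more elaborate).

```python
import numpy as np

def injection_heat(t, power, baseline):
    # Trapezoidal integration of the differential heating power above
    # baseline gives the heat released by one injection.
    return np.trapz(power - baseline, t)

t = np.linspace(0.0, 60.0, 601)                    # time, s
baseline = np.full_like(t, 10.0)                   # reference power, uW
pulse = 5.0 * np.exp(-((t - 20.0) / 5.0) ** 2)     # injection peak, uW
q = injection_heat(t, baseline + pulse, baseline)  # heat, uJ
print(q)  # area of the Gaussian, about 25 * sqrt(pi) ~ 44.3 uJ
```

    Repeating this for every injection yields the isotherm (heat versus molar ratio) that is then passed to the global fit in SEDPHAT.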

  14. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, and social and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  15. A network analysis of leadership theory : the infancy of integration.

    OpenAIRE

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  16. An integrated internal flow analysis for ramjet propulsion system

    Science.gov (United States)

    Hsieh, Shih-Yang

    An integrated numerical analysis has been conducted to study the ramjet internal flowfield. Emphasis is placed on the establishment of a unified numerical scheme and accurate representation of the internal flow development. The theoretical model is based on the complete conservation equations of mass, momentum, energy, and species concentration, with consideration of finite-rate chemical reactions and variable properties. Turbulence closure is achieved using a low-Reynolds number k-epsilon two-equation model. A new computation procedure capable of treating time-accurate, chemically reacting flows over a wide range of Mach number was developed. This numerical scheme allows for a unified treatment of the entire flowfield in a ramjet engine, including both the supersonic inlet and the combustion chamber. The algorithm is based on scaling the pressure terms in the momentum equations and preconditioning the conservation equations to circumvent numerical difficulties at low Mach numbers. The resulting equations are solved using the lower-upper (LU) factorization method in a fully-coupled manner, with the incorporation of a flux-differencing upwind TVD scheme to achieve high-order spatial accuracy. The transient behavior of the modeled system is preserved through implementation of the dual time-stepping integration technique. Calculations have been carried out for the flowfield in a typical ramjet engine consisting of an axisymmetric mixed-compression supersonic inlet and a coaxial dump combustor. Distinguished shock structures in the forward section of the inlet were clearly captured. The boundary layer thickening and flow separation behind the terminal shock due to shock/boundary-layer interactions and inlet configuration were observed. The mutual coupling between the inlet and combustor was carefully examined. In particular, strong vortices arising from the inlet shock/acoustic and shock/boundary-layer interactions may convect downstream and affect the combustion

  17. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  18. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed, and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment average pressure, evaluated from the pressures at both ends. Nodal mass and energy conservation determined nodal flows and enthalpies, accounting for fluid compression and thermal expansion
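
    The idea of one integrated momentum equation per segment can be sketched as follows; the node geometry, the quadratic friction law, and the coefficient k_fric are illustrative assumptions, not MINET's actual correlations.

```python
def advance_segment_flow(W, p_in, p_out, nodes, dt, k_fric=1.0):
    # One momentum equation for an entire segment (a chain of nodes):
    # sum(L/A) over the nodes plays the role of a lumped inertance, and
    # a single flow rate W responds to the end pressures of the segment.
    inertance = sum(n["L"] / n["A"] for n in nodes)
    dp_friction = k_fric * W * abs(W)       # hypothetical quadratic loss
    dW_dt = (p_in - p_out - dp_friction) / inertance
    return W + dt * dW_dt

# Three identical nodes in one segment, 1 kPa driving pressure difference.
nodes = [{"L": 0.5, "A": 0.1}] * 3          # length and flow area per node
W = 0.0
for _ in range(1000):                        # explicit time marching
    W = advance_segment_flow(W, 102325.0, 101325.0, nodes, 0.01)
print(W)  # approaches sqrt(1000 / k_fric), where friction balances
          # the imposed pressure difference
```

    In the full method, nodal mass and energy balances would then distribute this segment flow rate into nodal flows and enthalpies, as the abstract describes.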

  19. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused when heterogeneous systems use different ways to express the same real-world entity. This prevents information integration from achieving semantic coherence. Since ontology helps to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into the information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  20. Evidence-based integrative medicine in clinical veterinary oncology.

    Science.gov (United States)

    Raditic, Donna M; Bartges, Joseph W

    2014-09-01

    Integrative medicine is the combined use of complementary and alternative medicine with conventional or traditional Western medicine systems. The demand for integrative veterinary medicine is growing, but evidence-based research on its efficacy is limited. In veterinary clinical oncology, such research could be translated to human medicine, because veterinary patients with spontaneous tumors are valuable translational models for human cancers. An overview of specific herbs, botanics, dietary supplements, and acupuncture evaluated in dogs, in vitro canine cells, and other relevant species both in vivo and in vitro is presented for their potential use as integrative therapies in veterinary clinical oncology. Published by Elsevier Inc.

  1. Beyond vertical integration--Community based medical education.

    Science.gov (United States)

    Kennedy, Emma Margaret

    2006-11-01

    The term 'vertical integration' is used broadly in medical education, sometimes when discussing community based medical education (CBME). This article examines the relevance of the term 'vertical integration' and provides an alternative perspective on the complexities of facilitating the CBME process. The principles of learner centredness, patient centredness and flexibility are fundamental to learning in the diverse contexts of 'community'. Vertical integration as a structural concept is helpful for academic organisations but has less application to education in the community setting; a different approach illuminates the strengths and challenges of CBME that need consideration by these organisations.

  2. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  3. Study on integrated design and analysis platform of NPP

    International Nuclear Information System (INIS)

    Lu Dongsen; Gao Zuying; Zhou Zhiwei

    2001-01-01

    Many calculation software packages have been developed for nuclear system design and safety analysis, such as structural design software, fuel design and management software, thermal-hydraulic analysis software, and severe accident simulation software. This study integrates those packages into a single platform and develops visual modeling tools for Retran and NGFM90. The platform also provides a distributed calculation method for coupled calculations between different software packages. The study will improve the design and analysis of NPPs.

  4. Integrated analysis of DCH in Surry

    International Nuclear Information System (INIS)

    Dingman, S.E.; Harper, F.T.; Pilch, M.M.; Washington, K.E.

    1993-01-01

    An evaluation of the key elements affecting Direct Containment Heating (DCH) was performed for the Surry plant. This involved determining the dominant high pressure core damage sequences, the probability of proceeding to vessel breach at high pressure, the DCH loads, and the containment strength. Each of these factors was evaluated separately, and then the results were combined to give the overall threat from DCH. The maximum containment failure probability by DCH for Surry is 10^(-3) when considering four base DCH scenarios and using the two-cell equilibrium (TCE) model. However, higher containment failure probabilities are estimated in sensitivity cases. When the depressurization and containment loads aspects are combined, the containment failure probability (conditional on station blackout sequence) is less than 10^(-2). CONTAIN calculations were performed to provide insights regarding DCH phenomenological uncertainties and potential conservatisms in the TCE model. The CONTAIN calculations indicated that the TCE calculations were conservative for Surry and that the dominant factors were neglect of heat transfer to surroundings and complete combustion of hydrogen on DCH time scales

  5. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

    Integration of distributed generation (DG) such as wind turbines into distribution system is increasing all around the world, because of the flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation on the influence of full converter based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, and connection type of WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...

  6. Development of web-based integrity evaluation system for primary components in a nuclear power plant

    International Nuclear Information System (INIS)

    Lee, S.M.; Kim, J.C.; Choi, J.B.; Kim, Y.J.; Choi, S.N.; Jang, K.S.; Hong, S.Y.

    2004-01-01

    A nuclear power plant is composed of a number of primary components. Maintaining the integrity of these components is one of the most critical issues in the nuclear industry. In order to maintain the integrity of these primary components, a complicated procedure is required, including periodical in-service inspection, failure assessment, fracture mechanics analysis, etc. Also, experts in different fields have to co-operate to resolve the integrity issues on the basis of inspection results. This integrity evaluation process usually takes a long time, and thus is detrimental to plant productivity. Therefore, an effective safety evaluation system is essential to manage integrity issues on a nuclear power plant. In this paper, a web-based integrity evaluation system for primary components in a nuclear power plant is proposed. The proposed system, which is named WEBIES (web-based integrity evaluation system), has been developed in the form of a 3-tier system architecture. The system consists of three servers: an application program server, a user interface program server and a data warehouse server. The application program server includes the defect acceptance analysis module and the fracture mechanics analysis module, which are programmed on the basis of ASME Sec. XI, Appendix A. The data warehouse server provides data for the integrity evaluation including material properties, geometry information, inspection data and stress data. The user interface program server provides information to all co-workers in the field of integrity evaluation. The developed system provides engineering knowledge-based information and a concurrent and collaborative working environment through the internet, and thus is expected to raise the efficiency of integrity evaluation procedures on primary components of a nuclear power plant. (orig.)

  7. Development of web-based integrity evaluation system for primary components in a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.M.; Kim, J.C.; Choi, J.B.; Kim, Y.J. [SAFE Research Center, Sungkyunkwan Univ., Suwon (Korea); Choi, S.N.; Jang, K.S.; Hong, S.Y. [Korea Electronic Power Research Inst., Daejeon (Korea)

    2004-07-01

    A nuclear power plant is composed of a number of primary components. Maintaining the integrity of these components is one of the most critical issues in the nuclear industry. Maintaining the integrity of these primary components requires a complicated procedure, including periodic in-service inspection, failure assessment, fracture mechanics analysis, etc. Experts in different fields also have to co-operate to resolve integrity issues on the basis of inspection results. This integrity evaluation process usually takes a long time and is therefore detrimental to plant productivity. An effective safety evaluation system is thus essential for managing integrity issues in a nuclear power plant. In this paper, a web-based integrity evaluation system for primary components in a nuclear power plant is proposed. The proposed system, named WEBIES (web-based integrity evaluation system), has been developed as a 3-tier architecture consisting of three servers: an application program server, a user interface program server and a data warehouse server. The application program server includes the defect acceptance analysis module and the fracture mechanics analysis module, which are programmed on the basis of ASME Sec. XI, Appendix A. The data warehouse server provides data for the integrity evaluation, including material properties, geometry information, inspection data and stress data. The user interface program server provides information to all co-workers in the field of integrity evaluation. The developed system provides engineering knowledge-based information and a concurrent, collaborative working environment over the Internet, and is thus expected to raise the efficiency of integrity evaluation procedures for the primary components of a nuclear power plant. (orig.)
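
    The defect acceptance and fracture mechanics modules described above decide whether an inspected flaw may remain in service. As a rough illustration only, a linear-elastic check of the kind underlying ASME Sec. XI, Appendix A can be sketched as below; the geometry factor, toughness and safety factor are hypothetical placeholders, not code values:

```python
import math

def stress_intensity(sigma_mpa, a_m, y_factor=1.12):
    # mode-I stress intensity K_I = Y * sigma * sqrt(pi * a), in MPa*sqrt(m)
    return y_factor * sigma_mpa * math.sqrt(math.pi * a_m)

def flaw_acceptable(sigma_mpa, a_m, k_ic=200.0, safety_factor=math.sqrt(10)):
    # accept the flaw if K_I stays below the toughness reduced by a safety factor
    return stress_intensity(sigma_mpa, a_m) < k_ic / safety_factor

# hypothetical surface flaw: 150 MPa membrane stress, 5 mm crack depth
print(flaw_acceptable(150.0, 0.005))
```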

  8. Students' perceptions of vertical and horizontal integration in a discipline-based dental school.

    Science.gov (United States)

    Postma, T C; White, J G

    2017-05-01

    Integration is a key concern in discipline-based undergraduate dental curricula. Therefore, this study compared feedback on integration from students who participated in different instructional designs in a Comprehensive Patient Care course. The study was conducted at the University of Pretoria (2009-2011). Third-year cohorts (Cohorts A, B and C) participated in pre-clinical case-based learning, whilst fourth-year cohorts (Cohorts D and E) received didactic teaching in Comprehensive Patient Care. Cohorts A, D and E practised clinical Comprehensive Patient Care in a discipline-based clinic. Cohort B conducted their Comprehensive Patient Care patient examinations in a dedicated facility supervised by dedicated faculty responsible for teaching integration. Students had to indicate on visual analogue scales whether the way they were taught at the school helped them to integrate knowledge from the same (horizontal integration) and preceding (vertical integration) year of study. The end-points of the scales were defined as 'definitely' and 'not at all'. Analysis of variance (ANOVA) was employed to measure the differences between cohorts according to the year of study. Third-year case-based learning cohorts rated the horizontal integration close to 80/100 and vertical integration ranging from 64 to 71/100. In year four, Cohort B rated vertical and horizontal integration 9-15% higher (ANOVA, P < 0.05) and horizontal integration 11-18% higher (ANOVA, P < 0.05) than the didactically taught cohorts. Case-based learning and dedicated supervision thus appear to promote perceived integration in the discipline-based undergraduate dental curriculum. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  10. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

    Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem, in order to: 1) demonstrate that it eliminates the present costly process of evaluating time integrators for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology for examination of the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)
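
    The step-size/nonlinearity interaction noted above can be seen on the toy decay equation u' = -u^3, where the stability limit of an explicit integrator depends on the solution amplitude itself. A minimal sketch (illustrative, not the paper's evaluation technique) contrasting explicit and implicit Euler:

```python
def forward_euler(u0, h, steps):
    # explicit update u_{n+1} = u_n - h*u_n^3; stability depends on h*u^2
    u = u0
    for _ in range(steps):
        u = u - h * u**3
    return u

def backward_euler(u0, h, steps):
    # implicit update: solve u_new + h*u_new^3 = u with a few Newton iterations
    u = u0
    for _ in range(steps):
        x = u
        for _ in range(30):
            f = x + h * x**3 - u
            fp = 1.0 + 3.0 * h * x**2
            x -= f / fp
        u = x
    return u

# same nonlinear decay problem u' = -u^3, u(0) = 2, with a large step h = 1.0
print(abs(forward_euler(2.0, 1.0, 5)))   # explicit scheme diverges
print(abs(backward_euler(2.0, 1.0, 5)))  # implicit scheme decays toward 0
```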

  11. FACILITATING INTEGRATED SPATIO-TEMPORAL VISUALIZATION AND ANALYSIS OF HETEROGENEOUS ARCHAEOLOGICAL AND PALAEOENVIRONMENTAL RESEARCH DATA

    Directory of Open Access Journals (Sweden)

    C. Willmes

    2012-07-01

    In the context of the Collaborative Research Centre 806 "Our way to Europe" (CRC806), a research database is being developed to integrate data from the disciplines of archaeology, the geosciences and the cultural sciences and to facilitate integrated access to heterogeneous data sources. A practice-oriented data integration concept and its implementation are presented in this contribution. The data integration approach is based on Semantic Web technology and is applied to the domains of archaeological and palaeoenvironmental data. The aim is to provide integrated spatio-temporal access to an existing wealth of data and thereby facilitate research on the integrated data basis. For the web portal of the CRC806 research database (CRC806-Database), a number of interfaces and applications have been evaluated, developed and implemented for exposing the data to interactive analysis and visualization.

  12. Energy efficiency analysis of styrene production by adiabatic ethylbenzene dehydrogenation using exergy analysis and heat integration

    Directory of Open Access Journals (Sweden)

    Ali Emad

    2018-03-01

    Styrene is a valuable commodity for the polymer industries. The main route for producing styrene, dehydrogenation of ethylbenzene, consumes a substantial amount of energy because of the use of high-temperature steam. In this work, the process energy requirements and recovery are studied using exergy analysis and Heat Integration (HI) based on the Pinch design method. The amount of steam plays a key role in the trade-off between styrene yield and energy savings, so optimizing the operating conditions for energy reduction alone is infeasible. Heat integration indicated an insignificant reduction in the net energy demand and exergy losses, but 24% and 34% savings in external heating and cooling duties, respectively. When the required steam is generated by recovering the heat of the hot reactor effluent, a considerable saving in the net energy demand, as well as in the heating and cooling utilities, can be achieved. Moreover, around a 68% reduction in exergy destruction is observed.
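
    Heat Integration by the Pinch design method, as used above, rests on the "problem table" heat cascade. A minimal sketch on a classic four-stream textbook example (not the styrene process data):

```python
def problem_table(streams, dt_min=10.0):
    """Pinch 'problem table': streams = [(T_supply, T_target, CP)], CP in kW/K.
    Hot streams are shifted down by dt_min/2, cold streams up; a heat cascade
    over the shifted-temperature intervals yields the minimum utilities."""
    shifted = []
    for ts, tt, cp in streams:
        shift = -dt_min / 2 if ts > tt else dt_min / 2  # hot if cooling down
        shifted.append((ts + shift, tt + shift, cp))
    temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
    surpluses = []
    for hi, lo in zip(temps, temps[1:]):
        net_cp = 0.0
        for ts, tt, cp in shifted:
            if min(ts, tt) <= lo and max(ts, tt) >= hi:  # stream spans interval
                net_cp += cp if ts > tt else -cp         # hot streams release heat
        surpluses.append(net_cp * (hi - lo))
    # cascade: the hot utility must cover the worst cumulative deficit
    cum, worst = 0.0, 0.0
    for s in surpluses:
        cum += s
        worst = min(worst, cum)
    q_hot = -worst
    q_cold = q_hot + sum(surpluses)
    return q_hot, q_cold

# classic four-stream illustration: two hot and two cold streams
q_hot, q_cold = problem_table([(250, 40, 0.15), (200, 80, 0.25),
                               (20, 180, 0.20), (140, 230, 0.30)])
print(q_hot, q_cold)  # minimum utilities: 7.5 kW hot, 10.0 kW cold
```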

  13. An integrated 3D design, modeling and analysis resource for SSC detector systems

    International Nuclear Information System (INIS)

    DiGiacomo, N.J.; Adams, T.; Anderson, M.K.; Davis, M.; Easom, B.; Gliozzi, J.; Hale, W.M.; Hupp, J.; Killian, K.; Krohn, M.; Leitch, R.; Lajczok, M.; Mason, L.; Mitchell, J.; Pohlen, J.; Wright, T.

    1989-01-01

    Integrated computer aided engineering and design (CAE/CAD) is having a significant impact on the way design, modeling and analysis is performed, from system concept exploration and definition through final design and integration. Experience with integrated CAE/CAD in high technology projects of scale and scope similar to SSC detectors leads them to propose an integrated computer-based design, modeling and analysis resource aimed specifically at SSC detector system development. The resource architecture emphasizes value-added contact with data and efficient design, modeling and analysis of components, sub-systems or systems with fidelity appropriate to the task. They begin with a general examination of the design, modeling and analysis cycle in high technology projects, emphasizing the transition from the classical islands of automation to the integrated CAE/CAD-based approach. They follow this with a discussion of lessons learned from various attempts to design and implement integrated CAE/CAD systems in scientific and engineering organizations. They then consider the requirements for design, modeling and analysis during SSC detector development, and describe an appropriate resource architecture. They close with a report on the status of the resource and present some results that are indicative of its performance. 10 refs., 7 figs

  14. Base compaction specification feasibility analysis.

    Science.gov (United States)

    2012-12-01

    The objective of this research is to establish the technical engineering and cost : analysis concepts that will enable WisDOT management to objectively evaluate the : feasibility of switching construction specification philosophies for aggregate base...

  15. Integrative sparse principal component analysis of gene expression data.

    Science.gov (United States)

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multi-dataset techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
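
    The sparse loadings at the heart of SPCA-type methods can be sketched with a penalized power iteration that soft-thresholds each power step. This is a generic single-dataset SPCA sketch, not the iSPCA group/contrasted-penalty estimator:

```python
import numpy as np

def sparse_pc(X, lam=0.1, iters=200, seed=0):
    """First sparse loading vector via penalized power iteration:
    alternate v <- soft_threshold(X^T X v, lam), then renormalize."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)            # center columns
    v = rng.standard_normal(X.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        z = X.T @ (X @ v)             # power step on the sample covariance
        z = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)  # soft threshold
        n = np.linalg.norm(z)
        if n == 0:
            break
        v = z / n
    return v

# toy data: only the first three of ten variables carry the signal
rng = np.random.default_rng(1)
u = rng.standard_normal(200)
w = np.zeros(10)
w[:3] = 1 / np.sqrt(3)
X = 3.0 * np.outer(u, w) + 0.1 * rng.standard_normal((200, 10))
v = sparse_pc(X, lam=1.0)  # loading concentrates on the signal variables
```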

  16. NEW CORPORATE REPORTING TRENDS. ANALYSIS ON THE EVOLUTION OF INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Dragu Ioana

    2013-07-01

    The objective of this paper is to present the new corporate reporting trends of the 21st century. Integrated reporting has been launched through a common initiative of the International Integrated Reporting Committee (IIRC) and global accounting organizations. However, the history of integrated reports starts before the initiative of the IIRC, going back to when large corporations began to disclose sustainability and corporate social responsibility information. We claim that the initial sustainability and CSR reports, issued separately alongside the annual financial report, are the predecessors of current integrated reports. The paper consists of a literature review analysis of the evolution of integrated reporting, from the first stage of international non-financial initiatives up to the current state of a single integrated annual report. In order to understand the background of integrated reporting, we analyze the most relevant research papers on corporate reporting, focusing on the international organizations' perspective on non-financial reporting in general and integrated reporting in particular. Based on the literature overview, we extracted the essential information for setting the framework of the integrated reporting evolution. The findings suggest that we can delineate three main stages in the evolution of integrated reports, namely: the non-financial reporting initiatives, the sustainability era, and the revolution of integrated reporting. We illustrate these results by presenting each relevant point in the history of integrated reporting on a time-scale axis, developed with the purpose of defining the road to integrated reporting at the theoretical, empirical and practical levels. We consider the current investigation relevant for future studies concerning integrated reports, as this is a new area of research still in its infancy. The originality of the research derives from the novelty of

  17. Probabilistic Steady-State Operation and Interaction Analysis of Integrated Electricity, Gas and Heating Systems

    Directory of Open Access Journals (Sweden)

    Lun Yang

    2018-04-01

    The existing studies on probabilistic steady-state analysis of integrated energy systems (IES) are limited to integrated electricity and gas networks or integrated electricity and heating networks. This paper proposes a probabilistic steady-state analysis of integrated electricity, gas and heating networks (EGH-IES). Four typical operation modes of an EGH-IES are presented first. The probabilistic energy flow problem of the EGH-IES, considering its operation modes and correlated uncertainties in wind/solar power and electricity/gas/heat loads, is then formulated and solved by the Monte Carlo method based on Latin hypercube sampling and the Nataf transformation. Numerical simulations are conducted on a sample EGH-IES working in the "electricity/gas following heat" mode to verify the proposed probabilistic analysis and to study the effects of uncertainties and correlations on the operation of the EGH-IES, especially uncertainty transmission among the subnetworks.
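
    The sampling machinery named above (Latin hypercube sampling combined with the Nataf transformation) can be sketched as follows. For simplicity the correlation matrix is applied directly in standard-normal space, omitting the usual Nataf correction from target-space to normal-space correlation; the marginals are illustrative, not the paper's load models:

```python
import numpy as np
from scipy import stats

def lhs_nataf(marginals, corr_z, n, seed=0):
    """Correlated samples via Latin hypercube + Nataf (Gaussian copula).
    marginals: list of frozen scipy.stats distributions;
    corr_z: correlation matrix assumed to hold in standard-normal space."""
    rng = np.random.default_rng(seed)
    d = len(marginals)
    # stratified uniforms: one draw per equal-probability bin, per dimension
    u = (rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1)
         + rng.random((d, n))) / n
    z = stats.norm.ppf(u)                   # independent normal scores
    zc = np.linalg.cholesky(corr_z) @ z     # impose correlation
    uc = stats.norm.cdf(zc)                 # back to uniforms (Gaussian copula)
    return np.stack([m.ppf(uc[i]) for i, m in enumerate(marginals)])

# two correlated inputs: a N(0,1) quantity and a lognormal demand (illustrative)
samples = lhs_nataf([stats.norm(), stats.lognorm(0.5)],
                    np.array([[1.0, 0.8], [0.8, 1.0]]), n=2000)
```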

  18. Recruitment recommendation system based on fuzzy measure and indeterminate integral

    Science.gov (United States)

    Yin, Xin; Song, Jinjie

    2017-08-01

    In this study, we propose a comprehensive evaluation approach based on the indeterminate integral. By introducing the related concepts of the indeterminate integral and their formulas into the recruitment recommendation system, we can calculate the suitability of each job for different applicants, given the defined importance of each criterion listed in the job advertisements, the associations between criteria, and subjective assessments. We can then recommend jobs to applicants ranked by suitability score from high to low. Finally, we exemplify the usefulness and practicality of this system with samples.
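
    The "indeterminate integral" of the paper is not a standard construct; a closely related fuzzy-measure aggregation is the Choquet integral, sketched below for scoring one applicant-job pair over interacting criteria (criterion names and measure values are made up for illustration):

```python
def choquet(scores, mu):
    """Choquet integral of criterion scores w.r.t. a fuzzy measure mu.
    scores: {criterion: value in [0, 1]}; mu: {frozenset of criteria: weight}
    with mu[empty] = 0 and mu[all criteria] = 1. Non-additive mu values
    capture interactions (synergy or redundancy) between criteria."""
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for crit, val in items:
        total += (val - prev) * mu[frozenset(remaining)]
        prev = val
        remaining.remove(crit)
    return total

# illustrative measure over three criteria; skills and experience interact
mu = {
    frozenset(): 0.0,
    frozenset({"skills"}): 0.3, frozenset({"exp"}): 0.2, frozenset({"lang"}): 0.1,
    frozenset({"skills", "exp"}): 0.7, frozenset({"skills", "lang"}): 0.45,
    frozenset({"exp", "lang"}): 0.35,
    frozenset({"skills", "exp", "lang"}): 1.0,
}
score = choquet({"skills": 0.9, "exp": 0.6, "lang": 0.4}, mu)
print(score)  # 0.63 for this applicant-job pair
```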

  19. Integrated Genome-Based Studies of Shewanella Echophysiology

    Energy Technology Data Exchange (ETDEWEB)

    Margrethe H. Serres

    2012-06-29

    Shewanella oneidensis MR-1 is a motile, facultative {gamma}-Proteobacterium with remarkable respiratory versatility; it can utilize a range of organic and inorganic compounds as terminal electron acceptors for anaerobic metabolism. The ability to effectively reduce nitrate, S0, polyvalent metals and radionuclides has established MR-1 as an important model dissimilatory metal-reducing microorganism for genome-based investigations of biogeochemical transformation of metals and radionuclides that are of concern at U.S. Department of Energy (DOE) sites nationwide. Metal-reducing bacteria such as Shewanella also have a highly developed capacity for extracellular transfer of respiratory electrons to solid-phase Fe and Mn oxides as well as directly to anode surfaces in microbial fuel cells. More broadly, Shewanellae are recognized as free-living microorganisms and members of microbial communities involved in the decomposition of organic matter and the cycling of elements in aquatic and sedimentary systems. To function and compete in environments that are subject to spatial and temporal environmental change, Shewanella must be able to sense and respond to such changes and therefore require relatively robust sensing and regulation systems. The overall goal of this project is to apply the tools of genomics, leveraging the availability of genome sequence for 18 additional strains of Shewanella, to better understand the ecophysiology and speciation of respiratory-versatile members of this important genus. To understand these systems we propose to use genome-based approaches to investigate Shewanella as a system of integrated networks; first describing key cellular subsystems - those involved in signal transduction, regulation, and metabolism - then building towards understanding the function of whole cells and, eventually, cells within populations.
    As a general approach, this project will employ complementary "top-down" - bioinformatics-based genome functional predictions, high

  20. Leaky Integrate-and-Fire Neuron Circuit Based on Floating-Gate Integrator

    Science.gov (United States)

    Kornijcuk, Vladimir; Lim, Hyungkwang; Seok, Jun Yeong; Kim, Guhyun; Kim, Seong Keun; Kim, Inho; Choi, Byung Joon; Jeong, Doo Seok

    2016-01-01

    The artificial spiking neural network (SNN) is promising and has been brought to the notice of the theoretical neuroscience and neuromorphic engineering research communities. In this light, we propose a new type of artificial spiking neuron based on leaky integrate-and-fire (LIF) behavior. A distinctive feature of the proposed FG-LIF neuron is the use of a floating-gate (FG) integrator rather than a capacitor-based one. The relaxation time of the charge on the FG relies mainly on the tunnel barrier profile, e.g., barrier height and thickness (rather than the area). This opens up the possibility of large-scale integration of neurons. The circuit simulation results offered biologically plausible spiking activity. The circuit was also subject to possible types of noise, e.g., thermal noise and burst noise. The simulation results indicated remarkable distributional features of interspike intervals that are fitted to Gamma distribution functions, similar to biological neurons in the neocortex. PMID:27242416
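
    The LIF dynamics underlying the FG-LIF neuron can be sketched as an Euler-discretized leaky integrator with threshold and reset; the parameters below are illustrative, not the circuit's, and the optional noise term stands in for the thermal/burst noise mentioned above:

```python
import numpy as np

def lif_spikes(i_in=1.5, tau=20e-3, v_th=1.0, v_reset=0.0, dt=1e-4,
               t_end=1.0, noise_std=0.0, seed=0):
    """Leaky integrate-and-fire: tau * dV/dt = -V + I; spike and reset at v_th."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for k in range(int(t_end / dt)):
        noise = noise_std * np.sqrt(dt) * rng.standard_normal()
        v += dt / tau * (-v + i_in) + noise   # leaky integration step
        if v >= v_th:
            spikes.append(k * dt)             # record spike time, then reset
            v = v_reset
    return np.array(spikes)

# constant drive -> regular firing; with noise the interspike intervals spread
spike_times = lif_spikes()
```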

  1. Modular Architecture for Integrated Model-Based Decision Support.

    Science.gov (United States)

    Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen

    2018-01-01

    Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.

  2. Perceptions that influence the maintenance of scientific integrity in community-based participatory research.

    Science.gov (United States)

    Kraemer Diaz, Anne E; Spears Johnson, Chaya R; Arcury, Thomas A

    2015-06-01

    Scientific integrity is necessary for strong science; yet many variables can influence scientific integrity. In traditional research, some common threats are the pressure to publish, competition for funds, and career advancement. Community-based participatory research (CBPR) provides a different context for scientific integrity with additional and unique concerns. Understanding the perceptions that promote or discourage scientific integrity in CBPR as identified by professional and community investigators is essential to promoting the value of CBPR. This analysis explores the perceptions that facilitate scientific integrity in CBPR as well as the barriers among a sample of 74 professional and community CBPR investigators from 25 CBPR projects in nine states in the southeastern United States in 2012. There were variations in perceptions associated with team member identity as professional or community investigators. Perceptions identified to promote and discourage scientific integrity in CBPR by professional and community investigators were external pressures, community participation, funding, quality control and supervision, communication, training, and character and trust. Some perceptions such as communication and training promoted scientific integrity whereas other perceptions, such as a lack of funds and lack of trust could discourage scientific integrity. These results demonstrate that one of the most important perceptions in maintaining scientific integrity in CBPR is active community participation, which enables a co-responsibility by scientists and community members to provide oversight for scientific integrity. Credible CBPR science is crucial to empower the vulnerable communities to be heard by those in positions of power and policy making. © 2015 Society for Public Health Education.

  3. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  4. [A web-based integrated clinical database for laryngeal cancer].

    Science.gov (United States)

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer and to provide an information platform for clinical and fundamental research on laryngeal cancer that meets the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, the specialty characteristics of laryngeal cancer and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system follows clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate the specialty characteristics of laryngeal cancer with tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. By using clinical data standards and handling clinical data in a structured way, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. In addition, users can access and manipulate the database conveniently and swiftly over the Internet.

  5. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the results of a scale-economy analysis of integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new-generation design solutions and building on the extensive experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect-cycle reactor with some distinctive features that greatly simplify the reactor and also contribute to a high level of safety: an integrated primary cooling system, self-pressurization, primary cooling by natural circulation and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors done with the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive with the cheapest Argentinean electricity option at powers larger than 200 MWe. Due to the reactor pressure vessel construction limit, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation. With forced circulation, 300 MWe can be achieved. (author)

  6. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  7. INS integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.
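
    The essence of INS-aided motion ranging can be reduced to motion stereo: the INS supplies the camera translation between frames (the baseline), and the induced image displacement of a static point gives its range. A minimal sketch under idealized lateral-translation pinhole geometry (not the paper's full algorithm):

```python
def range_from_motion(x0, x1, baseline_m, focal_px):
    """Motion stereo: with a known lateral camera translation B (from the INS)
    and focal length f in pixels, depth follows from the induced disparity:
    Z = f * B / (x1 - x0), for a static point imaged at x0 then x1."""
    disparity = x1 - x0
    if disparity <= 0:
        # a static point must shift opposite to the camera's translation
        raise ValueError("non-positive disparity: point not ranged")
    return focal_px * baseline_m / disparity

# hypothetical numbers: 0.5 m of sideways travel, 800 px focal length,
# feature shifts 20 px between frames -> 20 m range
print(range_from_motion(100.0, 120.0, 0.5, 800.0))
```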

  8. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  9. Reconstruction of biological networks based on life science data integration

    Directory of Open Access Journals (Sweden)

    Kormeier Benjamin

    2010-06-01

    For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we present the applications BioDWH (an integration toolkit for building life science data warehouses), CardioVINEdb (an information system for biological data on cardiovascular disease) and VANESA (a network editor for modeling and simulation of biological networks). Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease-related gene-regulated biological network is also presented.

  10. Reconstruction of biological networks based on life science data integration.

    Science.gov (United States)

    Kormeier, Benjamin; Hippe, Klaus; Arrigo, Patrizio; Töpel, Thoralf; Janowski, Sebastian; Hofestädt, Ralf

    2010-10-27

    For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we present the applications BioDWH (an integration toolkit for building life science data warehouses), CardioVINEdb (an information system for biological data on cardiovascular disease) and VANESA (a network editor for modeling and simulation of biological networks). Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease-related gene-regulated biological network is also presented.
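
    Data integration of the kind performed when constructing such networks can be sketched as merging per-source edge records into one network model with provenance; the source names, genes and edges below are hypothetical, not from the BioDWH warehouses:

```python
def merge_networks(sources):
    """Merge regulation edge lists from several source databases into one
    network. sources: {source_name: [(regulator, target), ...]}. The result
    keeps provenance: each edge remembers which databases reported it."""
    network = {}
    for name, edges in sources.items():
        for regulator, target in edges:
            network.setdefault((regulator, target), set()).add(name)
    return network

# hypothetical gene-regulation records from two integrated sources
merged = merge_networks({
    "db_expression": [("TF1", "geneA"), ("TF1", "geneB")],
    "db_pathway":    [("TF1", "geneA"), ("TF2", "geneB")],
})
# ("TF1", "geneA") is supported by both sources; the other edges by one each
```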

  11. Analysis and Modeling of Integrated Magnetics for LLC resonant Converters

    DEFF Research Database (Denmark)

    Li, Mingxiao; Ouyang, Ziwei; Zhao, Bin

    2017-01-01

    Shunt-inserted transformers are widely used to obtain high leakage inductance. This paper investigates this method in depth to make it applicable to integrate the resonant inductor for LLC resonant converters. The analysis and model of magnetizing inductance and leakage inductance for shunt...... transformers can provide a significant difference. The way to obtain the desirable magnetizing and leakage inductance value for LLC resonant converters is simplified by the creation of air gaps together with a magnetic shunt. The calculation and relation are validated by finite element analysis (FEA) simulations...

  12. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  13. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied

  14. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using mill...

  15. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  16. Multi-criteria decision analysis integrated with GIS for radio ...

    African Journals Online (AJOL)

    Multi-criteria decision analysis integrated with GIS for radio astronomical observatory site selection in peninsular of Malaysia. R Umar, Z.Z. Abidin, Z.A. Ibrahim, M.K.A. Kamarudin, S.N. Hazmin, A Endut, H Juahir ...

  17. Integrated analysis for genotypic adaptation in rice | Das | African ...

    African Journals Online (AJOL)

    Integrated analysis for genotypic adaptation in rice. S Das, RC Misra, MC Pattnaik, SK Sinha. Abstract. Development of varieties with high yield potential coupled with wide adaptability is an important plant breeding objective. The presence of genotype by environment (GxE) interaction plays a crucial role in determining the ...

  18. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2000-01-01

    of the phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control issues. The model analysis is highlighted through examples involving...... processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved....

  19. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines.
UCSF Chimera is free for non-commercial use and is

  20. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in depth analysis of the vast amount of transcription data requires integration with data from several heterogeneous data sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis of Saccharomyces cerevisiae whole genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis......
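As a rough illustration of the cluster-analysis step mentioned in the abstract above, the sketch below log-transforms and standardises a toy expression matrix and groups the gene profiles with a minimal k-means. The data, the two-group structure, and the k-means implementation are all illustrative assumptions for this sketch, not the review's own pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy expression matrix: 60 genes x 6 arrays, with two hypothetical
# co-expressed groups (real data would be S. cerevisiae microarrays).
up = rng.lognormal(3.0, 0.3, size=(30, 6)) * np.array([1, 1, 1, 4, 4, 4])
down = rng.lognormal(3.0, 0.3, size=(30, 6)) * np.array([4, 4, 4, 1, 1, 1])
data = np.vstack([up, down])

# Standard preprocessing: log-transform, then standardise each gene profile.
logged = np.log2(data)
profiles = (logged - logged.mean(axis=1, keepdims=True)) / logged.std(axis=1, keepdims=True)

def kmeans(x, k, iters=50, seed=0):
    """Minimal k-means with greedy farthest-point initialisation."""
    rng = np.random.default_rng(seed)
    centers = [x[rng.integers(len(x))]]
    for _ in range(k - 1):
        # next center = point farthest from all chosen centers
        d2 = np.min([((x - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(x[np.argmax(d2)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((x[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(profiles, k=2)
```

With this clean separation the two synthetic co-expression groups fall into distinct clusters; real transcription data would of course be noisier and need the normalisation steps the review discusses.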

  1. An emotion-based view of acquisition integration capability

    NARCIS (Netherlands)

    Q.N. Huy (Quy N.); T.H. Reus (Taco)

    2011-01-01

    We propose an emotion-based view of acquisition integration capability by developing an inter-firm model that focuses on dealing constructively with emotions during various organizational identification processes following mergers and acquisitions. The model describes diverse types

  2. Learning Radiology in an Integrated Problem-Based Learning (PBL ...

    African Journals Online (AJOL)

    Background: The Faculty of Medicine (FoM) has been training health professions in Uganda since 1924. Five years ago, it decided to change the undergraduate curriculum from traditional to Problem Based Learning (PBL) and adopted the SPICES model. Radiology was integrated into the different courses throughout the 5 ...

  3. Evaluation of polymer based third order nonlinear integrated optics devices

    NARCIS (Netherlands)

    Driessen, A.; Hoekstra, Hugo; Blom, F.C.; Horst, F.; Horst, F.; Krijnen, Gijsbertus J.M.; van Schoot, J.B.P.; van Schoot, J.B.P.; Lambeck, Paul; Popma, T.J.A.; Diemeer, Mart

    Nonlinear polymers are promising materials for high speed active integrated optics devices. In this paper we evaluate the perspectives polymer based nonlinear optical devices can offer. Special attention is directed to the materials aspects. In our experimental work we applied mainly Akzo Nobel DANS

  4. An Integrated Textual Case-Based System A. Almu

    African Journals Online (AJOL)

    Almu: An Integrated Textual Case-Based System. The semantic relationship in WordNet links the four parts of speech (nouns, verbs, adverbs and adjectives) to synonym sets (Miller 1995). Therefore, the words or terms of the problem have to be tagged with their appropriate POS before passing them to the ...

  5. Gain Scheduling of Observer-Based Controllers with Integral Action

    DEFF Research Database (Denmark)

    Trangbæk, Klaus; Stoustrup, Jakob; Bendtsen, Jan Dimon

    2006-01-01

    This paper presents a method for continuous gain scheduling of observer-based controllers with integral action. Given two stabilising controllers for a given system, explicit state space formulae are presented, allowing one to change gradually from one controller to the other while preserving...

  6. Design of model based LQG control for integrated building systems

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.; Paassen, van A.H.C.

    2006-01-01

    The automation of the operation of integrated building systems requires using modern control techniques to enhance the quality of building indoor environments. This paper describes the theoretical basis and practical application of an optimal dynamic regulator using model-based Linear Quadratic

  7. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    Science.gov (United States)

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a singular framework to enhance interpretation and hypotheses generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription factor driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application software (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied in the analysis of actual expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. A numerical integration-based yield estimation method for integrated circuits

    International Nuclear Information System (INIS)

    Liang Tao; Jia Xinzhang

    2011-01-01

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)
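The sampling-and-transformation idea in the abstract above can be sketched in a few lines. This is a simplified illustration, not the authors' OA-MLHS procedure: it uses plain Latin hypercube sampling from SciPy, a made-up performance function, and a Box-Cox normalisation step, all of which are assumptions for the sketch.

```python
import numpy as np
from scipy import stats
from scipy.stats.qmc import LatinHypercube

# Hypothetical circuit "performance" as a function of two process
# disturbances; the real method would run circuit simulations instead.
def performance(x):
    return 1.0 + 0.3 * x[:, 0] - 0.2 * x[:, 1] ** 2

# Stratified samples in [0,1)^2, mapped to standard-normal disturbances.
sampler = LatinHypercube(d=2, seed=0)
u = sampler.random(n=2000)
x = stats.norm.ppf(u)

y = performance(x)

# Box-Cox requires positive data; shift, then transform toward normality
# (the paper converts simulated performance data the same general way).
y_bc, lam = stats.boxcox(y - y.min() + 1e-3)

# Yield = P(performance above a spec limit), estimated from the samples.
spec = 0.8
yield_est = np.mean(y > spec)
print(f"lambda = {lam:.3f}, estimated yield = {yield_est:.3f}")
```

The variance-reduction benefit of Latin hypercube sampling over plain Monte Carlo comes from the stratification of each disturbance dimension; the orthogonal-array modification in the paper goes further than this sketch.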

  9. A numerical integration-based yield estimation method for integrated circuits

    Energy Technology Data Exchange (ETDEWEB)

    Liang Tao; Jia Xinzhang, E-mail: tliang@yahoo.cn [Key Laboratory of Ministry of Education for Wide Bandgap Semiconductor Materials and Devices, School of Microelectronics, Xidian University, Xi' an 710071 (China)

    2011-04-15

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)

  10. Fostering integrity in postgraduate research: an evidence-based policy and support framework.

    Science.gov (United States)

    Mahmud, Saadia; Bretag, Tracey

    2014-01-01

    Postgraduate research students have a unique position in the debate on integrity in research as students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity maintained across the university at all levels of scholarship and for all stakeholders.

  11. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storages in the Grid, which is implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  12. Spatial Data Integration Using Ontology-Based Approach

    Science.gov (United States)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the necessity for spatial data in various organizations is becoming so crucial that many of these organizations have begun to produce spatial data for that purpose. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration; a case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps: the first step is identification of the objects in a relational database, then the semantic relationships between them are modelled and, subsequently, the ontology of each database is created. In the second step, the relative ontology is inserted into the database and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data by using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy; the data remain unchanged, thus taking advantage of the legacy applications provided.

  13. SPATIAL DATA INTEGRATION USING ONTOLOGY-BASED APPROACH

    Directory of Open Access Journals (Sweden)

    S. Hasani

    2015-12-01

    Full Text Available In today's world, the necessity for spatial data in various organizations is becoming so crucial that many of these organizations have begun to produce spatial data for that purpose. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration; a case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps: the first step is identification of the objects in a relational database, then the semantic relationships between them are modelled and, subsequently, the ontology of each database is created. In the second step, the relative ontology is inserted into the database and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data by using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy; the data remain unchanged, thus taking advantage of the legacy applications provided.

  14. Extreme Wave Analysis by Integrating Model and Wave Buoy Data

    Directory of Open Access Journals (Sweden)

    Fabio Dentale

    2018-03-01

    Full Text Available Estimating the extreme values of significant wave height (HS), generally described by the HS return period function HS(TR) and by its confidence intervals, is a necessity in many branches of coastal science and engineering. The availability of indirect wave data generated by global and regional wind and wave model chains has brought radical changes to the estimation procedures for such probability distributions: weather and wave modeling systems are routinely run all over the world, and HS time series for each grid point are produced and published after assimilation (analysis) of the ground truth. However, while the sources of such indirect data are numerous, and generally of good quality, many aspects of their procedures are hidden to the users, who cannot evaluate the reliability and the limits of the HS(TR) deriving from such data. In order to provide a simple engineering tool to evaluate the probability of extreme sea-states as well as the quality of such estimates, we propose here a procedure based on integrating HS time series generated by model chains with those recorded by wave buoys in the same area.
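The return-period function HS(TR) discussed above is commonly obtained by fitting an extreme-value distribution to annual maxima of HS. The sketch below fits a Gumbel distribution with SciPy to synthetic annual-maximum data; the data and the choice of the Gumbel family are assumptions for illustration, not the specific procedure of the cited paper.

```python
import numpy as np
from scipy import stats

# Hypothetical annual-maximum significant wave heights (m); in practice
# these would come from the integrated model/buoy HS time series.
rng = np.random.default_rng(7)
annual_max_hs = stats.gumbel_r.rvs(loc=5.0, scale=1.2, size=40, random_state=rng)

# Fit the Gumbel (type I extreme value) distribution by maximum likelihood.
loc, scale = stats.gumbel_r.fit(annual_max_hs)

def hs_return(tr_years):
    # Annual-maxima convention: P(exceedance in any one year) = 1/TR,
    # so HS(TR) is the (1 - 1/TR) quantile of the fitted distribution.
    return stats.gumbel_r.ppf(1.0 - 1.0 / tr_years, loc=loc, scale=scale)

for tr in (10, 50, 100):
    print(f"TR = {tr:3d} yr  ->  HS = {hs_return(tr):.2f} m")
```

Confidence intervals on HS(TR), which the paper emphasises, would additionally require bootstrapping or the asymptotic covariance of the fitted parameters.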

  15. Patient safety and infection control: bases for curricular integration.

    Science.gov (United States)

    Silva, Andréa Mara Bernardes da; Bim, Lucas Lazarini; Bim, Felipe Lazarini; Sousa, Alvaro Francisco Lopes; Domingues, Pedro Castania Amadio; Nicolussi, Adriana Cristina; Andrade, Denise de

    2018-05-01

    To analyze curricular integration between the teaching of patient safety and good infection prevention and control practices. Integrative review, designed to answer the question: "How does curricular integration of content about 'patient safety teaching' and content about 'infection prevention and control practices' occur in undergraduate courses in the health field?". The following databases were searched for primary studies: CINAHL, LILACS, ScienceDirect, Web of Science, Scopus, Europe PMC and MEDLINE. The final sample consisted of 13 studies. After content analysis, the primary studies were grouped into two subject categories: "Innovative teaching practices" and "Curricular evaluation". Patient safety related to infection prevention and control practices is present in the curricula of health undergraduate courses, but it is not coordinated with other themes, is taught sporadically, and focuses mainly on hand hygiene.

  16. A surface-integral-equation approach to the propagation of waves in EBG-based devices

    NARCIS (Netherlands)

    Lancellotti, V.; Tijhuis, A.G.

    2012-01-01

    We combine surface integral equations with domain decomposition to formulate and (numerically) solve the problem of electromagnetic (EM) wave propagation inside finite-sized structures. The approach is of interest for (but not limited to) the analysis of devices based on the phenomenon of

  17. Containment integrity analysis with SAMPSON/DCRA module

    International Nuclear Information System (INIS)

    Hosoda, Seigo; Shirakawa, Noriyuki; Naitoh, Masanori

    2006-01-01

    The integrity of PWR containment under a severe accident is analyzed using the debris-concrete reaction analysis code. If core fuels melt through the pressure vessel and the debris accumulates on the reactor cavity in the lower part of the containment, its temperature continues to rise due to decay heat and the debris ablates the concrete floor. In the case that cooling water is injected into the containment cavity and the amount of debris is limited to 30% of the core fuels, our analyses showed that the debris could be cooled and frozen, so that the integrity of the containment could be maintained. (author)

  18. Bayou Choctaw Well Integrity Grading Component Based on Geomechanical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geotechnology & Engineering Dept.

    2016-09-08

    This letter report provides a well grading system for the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) based on geomechanical simulation. The analyses described in this letter were used to evaluate the caverns' geomechanical effect on wellbore integrity, an important component of the well integrity grading system recently developed by Roberts et al. [2015]. Based on these analyses, the wellbores for caverns BC-17 and BC-20 are expected to be significantly impacted by cavern geomechanics, BC-18 and BC-19 moderately impacted, and the other caverns less impacted.

  19. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    DEFF Research Database (Denmark)

    Lythcke-Jørgensen, Christoffer Ernst; Haglind, Fredrik; Clausen, Lasse Røngaard

    2014-01-01

    Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy intensive nature. The objective of this study is to investigate potential irreversibilities from such integration, and what impact it has on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible...... district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the expected operation pattern of such polygeneration system is taken......

  20. Analysis of nutrient flows in integrated intensive aquaculture systems

    NARCIS (Netherlands)

    Schneider, O.; Sereti, V.; Eding, E.H.; Verreth, J.A.J.

    2005-01-01

    This paper analyses the nutrient conversions taking place in integrated intensive aquaculture systems. In these systems, fish are cultured next to other organisms that convert otherwise discharged nutrients into valuable products. These conversions are analyzed based on nitrogen and

  1. An example of system integration for RCRA policy analysis

    International Nuclear Information System (INIS)

    Tonn, B.; Goeltz, R.; Schmidt, K.

    1991-01-01

    This paper describes the synthesis of various computer technologies and software systems used on a project to estimate the costs of remediating Solid Waste Management Units (SWMUs) that fall under the corrective action provisions of the Resource Conservation and Recovery Act (RCRA). The project used two databases collected by Research Triangle Institute (RTI) that contain information on SWMUs and a PC-based software system called CORA that develops cost estimates for remediating SWMUs. The project team developed rules to categorize every SWMU in the databases by the kinds of technologies required to clean them up. These results were input into CORA, which estimated costs associated with the technologies. Early on, several computing challenges presented themselves. First, the databases have several hundred thousand records each. Second, the categorization rules could not be written to cover all combinations of variables. Third, CORA is run interactively and the analysis plan called for running CORA tens of thousands of times. Fourth, large data transfers needed to take place between RTI and Oak Ridge National Laboratory. Solutions to these problems required systems integration. SWMU categorization was streamlined by using INTERNET as was the data transfer. SAS was used to create files used by a program called SuperKey that was used to run CORA. Because the analysis plan required the generation of hundreds of thousands of cost estimates, memory management software was needed to allow the portable IBM P70 to do the job. During the course of the project, several other software packages were used, including: SAS System for Personal Computers (SAS/PC), DBase III, LOTUS 1-2-3, PIZAZZ PLUS, LOTUS Freelance Plus, and Word Perfect. Only the comprehensive use of all available hardware and software resources allowed this project to be completed within the time and budget constraints. 5 refs., 3 figs., 3 tabs

  2. Heat integration and analysis of decarbonised IGCC sites

    Energy Technology Data Exchange (ETDEWEB)

    Ng, K.S.; Lopez, Y.; Campbell, G.M.; Sadhukhan, J. [University of Manchester, Manchester (United Kingdom). School of Chemical Engineering & Analytical Science

    2010-02-15

    Integrated gasification combined cycle (IGCC) power generation systems have become of interest due to their high combined heat and power (CHP) generation efficiency and flexibility to include carbon capture and storage (CCS) in order to reduce CO{sub 2} emissions. However, IGCC's biggest challenge is its high cost of energy production. In this study, decarbonised coal IGCC sites integrated with CCS have been investigated for heat integration and economic value analyses. It is envisaged that the high energy production cost of an IGCC site can be offset by maximising site-wide heat recovery and thereby improving the cost of electricity (COE) of CHP generation. Strategies for designing high efficiency CHP networks have been proposed based on thermodynamic heuristics and pinch theory. Additionally, a comprehensive methodology to determine the COE from a process site has been developed. In this work, we have established thermodynamic and economic comparisons between IGCC sites with and without CCS and a trade-off between the degree of decarbonisation and the COE from the heat integrated IGCC sites. The results show that the COE from the heat integrated decarbonised IGCC sites is significantly lower compared to IGCC sites without heat integration making application of CCS in IGCC sites economically competitive.
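The pinch-theory targeting mentioned in the abstract above can be illustrated with the classical problem-table (heat cascade) algorithm, which yields the minimum hot and cold utility demands before any exchanger network is designed. The stream data below are invented for the sketch and have no connection to the IGCC study's actual streams.

```python
# Problem-table (heat cascade) sketch for pinch analysis.
dt_min = 10.0  # minimum approach temperature (K)

# (supply T in C, target T in C, heat capacity flowrate CP in kW/K)
hot_streams = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]
cold_streams = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]

# Shift hot streams down and cold streams up by dt_min/2 so that a zero
# driving force in shifted temperatures corresponds to dt_min in reality.
shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot") for ts, tt, cp in hot_streams]
shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold") for ts, tt, cp in cold_streams]

bounds = sorted({t for ts, tt, cp, kind in shifted for t in (ts, tt)}, reverse=True)

# Net heat surplus (+) or deficit (-) in each shifted temperature interval.
surpluses = []
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for ts, tt, cp, kind in shifted:
        top, bot = max(ts, tt), min(ts, tt)
        overlap = max(0.0, min(hi, top) - max(lo, bot))
        net += cp * overlap if kind == "hot" else -cp * overlap
    surpluses.append(net)

# Cascade the surpluses downward; the most negative cumulative value
# sets the minimum hot utility, and the closing balance the cold utility.
cascade, heat = [], 0.0
for net in surpluses:
    heat += net
    cascade.append(heat)
q_hot_min = max(0.0, -min(cascade))
q_cold_min = cascade[-1] + q_hot_min
print(f"minimum hot utility: {q_hot_min:.1f} kW, cold utility: {q_cold_min:.1f} kW")
```

For these made-up streams the cascade gives 7.5 kW of hot utility and 10.0 kW of cold utility; the same targeting logic, at plant scale, underpins the site-wide heat recovery strategies the study describes.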

  3. Graph-based sequence annotation using a data integration approach.

    Science.gov (United States)

    Pesch, Robert; Lysenko, Artem; Hindle, Matthew; Hassani-Pak, Keywan; Thiele, Ralf; Rawlings, Christopher; Köhler, Jacob; Taubert, Jan

    2008-08-25

    The automated annotation of data from high throughput sequencing and genomics experiments is a significant challenge for bioinformatics. Most current approaches rely on sequential pipelines of gene finding and gene function prediction methods that annotate a gene with information from different reference data sources. Each function prediction method contributes evidence supporting a functional assignment. Such approaches generally ignore the links between the information in the reference datasets. These links, however, are valuable for assessing the plausibility of a function assignment and can be used to evaluate the confidence in a prediction. We are working towards a novel annotation system that uses the network of information supporting the function assignment to enrich the annotation process for use by expert curators and predicting the function of previously unannotated genes. In this paper we describe our success in the first stages of this development. We present the data integration steps that are needed to create the core database of integrated reference databases (UniProt, PFAM, PDB, GO and the pathway database Ara-Cyc) which has been established in the ONDEX data integration system. We also present a comparison between different methods for integration of GO terms as part of the function assignment pipeline and discuss the consequences of this analysis for improving the accuracy of gene function annotation. The methods and algorithms presented in this publication are an integral part of the ONDEX system which is freely available from http://ondex.sf.net/.

  4. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    Science.gov (United States)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. Its main steps are: 1) needs analysis (need assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing their feasibility. Technical feasibility covers the ability of the hardware and operating system to run the accounting application, and the application's simplicity and ease of use. Operational feasibility covers the ability of users to operate the accounting application, the ability of the application to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire on a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers within one item to the number of ideal answers within that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from both the technical aspect (87.50%) and the operational aspect (84.17%).
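
    The percentage analysis described above reduces to simple arithmetic: the total Likert score for an aspect divided by the ideal (maximum) score. A minimal sketch, with made-up expert ratings rather than the study's data:

```python
# Percentage feasibility analysis on a 4-point Likert scale.
# The rating vectors below are hypothetical, not the study's responses.

IDEAL = 4  # maximum score per answer ("strongly agree")

def feasibility_pct(scores):
    """Total score as a percentage of the ideal total across all answers."""
    return 100.0 * sum(scores) / (IDEAL * len(scores))

technical = [4, 3, 4, 3]     # hypothetical ratings on technical items
operational = [3, 4, 3, 3]   # hypothetical ratings on operational items
print(feasibility_pct(technical))    # → 87.5
print(feasibility_pct(operational))  # → 81.25
```

A cut-off (e.g. ≥ 75% of ideal) would then decide whether an aspect counts as feasible; the study's reported 87.50% and 84.17% are exactly this kind of ratio over its full questionnaire.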

  5. Integrated community-based dementia care: the Geriant model

    Directory of Open Access Journals (Sweden)

    Ludo Glimmerveen

    2015-09-01

    Full Text Available This article gives an in-depth description of the service delivery model of Geriant, a Dutch organization providing community-based care services for people suffering from dementia. Core to its model is the provision of clinical case management, embedded in multidisciplinary dementia care teams. As Geriant's client group includes people from the first presumption of dementia until they can no longer live at home, its care model provides valuable lessons about how different mechanisms of integration are flexibly put to use as the complexity of clients' care needs increases. It showcases how the integration of services for a specific sub-population is combined with the alignment of these services with generalist network partners. After a detailed description of the programme and its results, this article builds on the work of Walter Leutz for a conceptual discussion of Geriant's approach to care integration.

  6. Plant-wide integrated equipment monitoring and analysis system

    International Nuclear Information System (INIS)

    Morimoto, C.N.; Hunter, T.A.; Chiang, S.C.

    2004-01-01

    A nuclear power plant equipment monitoring system monitors plant equipment and reports deteriorating equipment conditions. The more advanced equipment monitoring systems can also provide information for understanding the symptoms and diagnosing the root cause of a problem. Maximizing the equipment availability and minimizing or eliminating consequential damages are the ultimate goals of equipment monitoring systems. GE Integrated Equipment Monitoring System (GEIEMS) is designed as an integrated intelligent monitoring and analysis system for plant-wide application for BWR plants. This approach reduces system maintenance efforts and equipment monitoring costs and provides information for integrated planning. This paper describes GEIEMS and how the current system is being upgraded to meet General Electric's vision for plant-wide decision support. (author)

  7. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to reduce the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, whose work consists mostly of LP generation and multicycle analysis. (author)
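
    The genetic-algorithm machinery behind an LP optimizer like GALLOP can be sketched generically: selection, crossover, and mutation over candidate patterns scored by a fitness function. Everything below is a toy stand-in; the encoding and the fitness are invented (a real optimizer scores core physics such as power peaking and cycle length), and GALLOP's hybrid details are not reproduced.

```python
# Toy genetic algorithm in the spirit of a loading-pattern optimizer.
# Encoding, fitness, and all parameters are illustrative assumptions.

import random

rng = random.Random(7)
GENES, POP, GENERATIONS = 12, 30, 60

def fitness(pattern):
    """Stand-in objective: prefer patterns whose gene sum hits a target."""
    return -abs(sum(pattern) - 30)

def tournament(pop):
    """Binary tournament selection."""
    a, b = rng.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover."""
    cut = rng.randrange(1, GENES)
    return p1[:cut] + p2[cut:]

def mutate(p, rate=0.1):
    """Reassign each gene with small probability."""
    return [rng.randrange(6) if rng.random() < rate else g for g in p]

population = [[rng.randrange(6) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population),
                                   tournament(population)))
                  for _ in range(POP)]

best = max(population, key=fitness)
print(fitness(best))  # approaches 0 as the population converges
```

A "hybrid" GA such as GALLOP's typically layers local search or heuristic repair on top of this loop; only the generic skeleton is shown here.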

  8. Dictionary-based image reconstruction for superresolution in integrated circuit imaging.

    Science.gov (United States)

    Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim

    2015-06-01

    Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.
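
    The sparse-coding idea behind dictionary-based reconstruction can be illustrated with plain matching pursuit over a tiny hand-built dictionary. This is only a sketch of the general principle: the paper's framework couples an overcomplete dictionary with a physics-based forward model of the confocal system, neither of which is reproduced here, and the atoms below are invented.

```python
# Greedy sparse approximation (matching pursuit) over unit-norm atoms.
# Dictionary and signal are illustrative; no imaging physics is modeled.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, atoms, n_iter=3):
    """Approximate signal as a sparse combination of dictionary atoms."""
    residual = list(signal)
    coefs = [0.0] * len(atoms)
    for _ in range(n_iter):
        # Pick the atom most correlated with the current residual
        best = max(range(len(atoms)),
                   key=lambda j: abs(dot(residual, atoms[j])))
        c = dot(residual, atoms[best])  # valid because atoms are unit-norm
        coefs[best] += c
        residual = [r - c * a for r, a in zip(residual, atoms[best])]
    return coefs, residual

# Unit-norm dictionary atoms (hypothetical)
atoms = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.5, 0.5, 0.5],
]
signal = [2.0, 2.0, 1.0, 1.0]
coefs, residual = matching_pursuit(signal, atoms)
print(coefs)  # → [0.5, 0.5, 3.0]
```

Orthogonal matching pursuit or an l1-regularized solve would refit all active coefficients each iteration; the greedy version above is the simplest member of that family.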

  9. Hybrid Pixel-Based Method for Cardiac Ultrasound Fusion Based on Integration of PCA and DWT

    Directory of Open Access Journals (Sweden)

    Samaneh Mazaheri

    2015-01-01

    Full Text Available Medical image fusion is the procedure of combining several images from one or multiple imaging modalities. In spite of numerous attempts at automating ventricle segmentation and tracking in echocardiography, this problem remains challenging due to low-quality images with missing anatomical details, speckle noise, and a restricted field of view. This paper presents a fusion method which particularly intends to increase the segment-ability of echocardiography features such as the endocardium and to improve image contrast. In addition, it tries to expand the field of view, decrease the impact of noise and artifacts, and enhance the signal-to-noise ratio of the echo images. The proposed algorithm weights the image information with an integration feature across all the overlapping images, using a combination of principal component analysis and discrete wavelet transform. For evaluation, a comparison has been made between the results of some well-known techniques and the proposed method, and different metrics are implemented to evaluate the performance of the proposed algorithm. It is concluded that the presented pixel-based method based on the integration of PCA and DWT gives the best result for the segment-ability of cardiac ultrasound images and better performance in all metrics.
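
    The PCA half of such a hybrid can be sketched compactly: in classic PCA-based fusion, the weights applied to two source images are the normalized components of the principal eigenvector of their 2x2 covariance matrix. The tiny "images" below are invented flattened pixel vectors, and the DWT stage of the hybrid method is omitted entirely.

```python
# PCA weighting step of two-image fusion (DWT stage omitted).
# Assumes positively correlated inputs (cross-covariance > 0); the
# flattened "images" are illustrative.

def mean(xs):
    return sum(xs) / len(xs)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def pca_weights(img_a, img_b):
    """Fusion weights from the dominant eigenvector of the 2x2 covariance."""
    caa, cbb, cab = cov(img_a, img_a), cov(img_b, img_b), cov(img_a, img_b)
    # Largest eigenvalue of [[caa, cab], [cab, cbb]]
    tr, det = caa + cbb, caa * cbb - cab * cab
    lam = tr / 2 + ((tr / 2) ** 2 - det) ** 0.5
    # Its eigenvector is (cab, lam - caa); normalize components to sum to 1
    v1, v2 = cab, lam - caa
    return v1 / (v1 + v2), v2 / (v1 + v2)

def fuse(img_a, img_b):
    w1, w2 = pca_weights(img_a, img_b)
    return [w1 * x + w2 * y for x, y in zip(img_a, img_b)]

a = [10.0, 40.0, 90.0, 160.0]  # hypothetical flattened image A
b = [12.0, 38.0, 95.0, 150.0]  # hypothetical flattened image B
w1, w2 = pca_weights(a, b)
fused = fuse(a, b)
```

In the hybrid scheme, the same weighting is typically applied per wavelet subband rather than directly on pixels, which is what lets the fusion preserve detail while suppressing noise.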

  10. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    Science.gov (United States)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from some example models, including the cryogenic chill-down of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
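
    The thermal side of such a conjugate analysis is a network of nodes marched in time. A minimal sketch in that spirit, for the "bar between two walls" case, is an explicit finite-difference update of a 1-D rod between two fixed-temperature boundaries; all material and grid values below are illustrative, and the fluid coupling is omitted.

```python
# Explicit 1-D transient conduction (FTCS) for a bar between two walls.
# Material properties, grid, and boundary temperatures are assumed values.

ALPHA = 1.0e-4                 # thermal diffusivity (m^2/s), assumed
DX = 0.01                      # node spacing (m)
DT = 0.4 * DX * DX / ALPHA     # time step at Fourier number 0.4 (< 0.5, stable)

T_LEFT, T_RIGHT = 400.0, 300.0  # fixed wall temperatures (K)
temps = [350.0] * 11            # 11-node bar, uniform initial temperature
temps[0], temps[-1] = T_LEFT, T_RIGHT

def step(t):
    """One explicit FTCS update of the interior nodes."""
    fo = ALPHA * DT / (DX * DX)
    new = t[:]
    for i in range(1, len(t) - 1):
        new[i] = t[i] + fo * (t[i - 1] - 2 * t[i] + t[i + 1])
    return new

for _ in range(2000):           # march well past the diffusion time scale
    temps = step(temps)

# Steady state is a straight line between the wall temperatures
print(temps[5])  # midpoint approaches (400 + 300) / 2 = 350
```

A SINDA-style solver generalizes this to arbitrary conductor networks with implicit time stepping; the conjugate integrator's job is to exchange wall temperatures and heat rates with the fluid network solution each step.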

  11. Introduction to stochastic analysis integrals and differential equations

    CERN Document Server

    Mackevicius, Vigirdas

    2013-01-01

    This is an introduction to stochastic integration and stochastic differential equations written in an understandable way for a wide audience, from students of mathematics to practitioners in biology, chemistry, physics, and finance. The presentation is based on naïve stochastic integration rather than on abstract theories of measure and stochastic processes. The proofs are rather simple for practitioners and, at the same time, rather rigorous for mathematicians. Detailed application examples in the natural sciences and finance are presented. Much attention is paid to the simulation of diffusion processes.
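
    The simulation-first treatment the book advertises usually starts from the Euler-Maruyama scheme. A hedged sketch for geometric Brownian motion, dX = mu*X dt + sigma*X dW, with illustrative parameter values:

```python
# Euler-Maruyama simulation of dX = mu*X dt + sigma*X dW.
# Parameter values are illustrative, not taken from the book.

import random

def euler_maruyama(x0, mu, sigma, t_end, n_steps, rng):
    """Simulate one path of the SDE on [0, t_end] with n_steps steps."""
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)  # Brownian increment ~ N(0, dt)
        x += mu * x * dt + sigma * x * dw
    return x

rng = random.Random(42)
# Sanity check: with sigma = 0 the SDE reduces to the ODE dx = mu*x dt,
# so the scheme should reproduce x0 * exp(mu * t) up to O(dt) error.
deterministic = euler_maruyama(100.0, 0.05, 0.0, 1.0, 10000, rng)
print(deterministic)  # close to 100 * e^0.05 ≈ 105.127
```

With sigma > 0 each call produces one random path; statistics such as the mean of many paths converge to the SDE's analytical moments as the step size shrinks.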

  12. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with a limited increase in budget. The Columbia report criticized NASA for a lack of insight and technical integration impacting its ability to provide safety. The Aldridge report advocates that NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program, a small team of engineers at MSFC was asked to propose a process for performing a system-level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process which focuses them on a technically integrated assessment; most previous attempts have focused on developing an all-encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system-level technical integration is discussed in detail, and definitions are proposed to clarify this discussion and the general systems-integration dialog.

  13. Integrating computer aided radiography and plantar pressure measurements for complex gait analysis

    International Nuclear Information System (INIS)

    Gefen, A.; Megido-Ravid, M.; Itzchak, Y.; Arcan, M.

    1998-01-01

    Radiographic Fluoroscopy (DRF) and Contact Pressure Display (CPD). The CPD method uses a birefringent integrated optical sandwich for contact stress analysis, e.g. plantar pressure distribution. The DRF method displays and electronically records skeletal motion using X-ray radiation, providing the exact bone and joint positions during gait. By integrating the two techniques, the contribution of each segment to the HFS behavior may be studied by applying image processing and analysis techniques. The combined resulting data may be used not only to detect and diagnose gait pathologies but also as a basis for the development of advanced numerical models of the foot structure.

  14. PHIDIAS: a pathogen-host interaction data integration and analysis system.

    Science.gov (United States)

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated from peer-reviewed literature. PHIDIAS is publicly available at http://www.phidias.us.

  15. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterization of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigations of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast querying. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP, based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  16. Printed organic thin-film transistor-based integrated circuits

    International Nuclear Information System (INIS)

    Mandal, Saumen; Noh, Yong-Young

    2015-01-01

    Organic electronics is moving ahead on its journey towards reality. However, this technology will only be viable when it can meet specific criteria, including flexibility, transparency, disposability, and low cost. Printing is a conventional technique for depositing thin films from solution-based ink. It is used worldwide for conveying information visually, and it is now poised to enter the manufacturing processes of various consumer electronics. The continuous progress made in the field of functional organic semiconductors has achieved high solubility in common solvents as well as high charge carrier mobility, which offers ample opportunity for organic-based printed integrated circuits. In this paper, we present a comprehensive review of all-printed organic thin-film transistor-based integrated circuits, mainly ring oscillators. First, the necessity of all-printed organic integrated circuits is discussed; we consider how the gap between printed electronics and real applications can be bridged. Next, various materials for printed organic integrated circuits are discussed, followed by the features of these circuits and their suitability for electronics using different printing and coating techniques. Interconnection technology is equally important for making this product industrially viable and receives much attention in this review. For high-frequency operation, the channel length should be sufficiently small; this could be achievable with surface-treatment-assisted printing or laser writing. Registration is also an important issue in printing; the printed gate should be perfectly aligned with the source and drain to minimize parasitic capacitances. All-printed organic inverters and ring oscillators are discussed here, along with their importance. Finally, future applications of all-printed organic integrated circuits are highlighted. (paper)
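
    Ring oscillators are the standard benchmark here because their frequency directly exposes the per-stage delay: an N-stage inverter ring oscillates at f = 1/(2*N*t_d). A back-of-the-envelope sketch with an invented delay value:

```python
# Ring-oscillator frequency vs. per-stage delay: f = 1 / (2 * N * t_d).
# The 7-stage ring and 100 us stage delay below are illustrative numbers.

def ring_oscillator_freq(n_stages, stage_delay_s):
    """Oscillation frequency of an N-stage (odd N) inverter ring."""
    return 1.0 / (2.0 * n_stages * stage_delay_s)

def stage_delay_from_freq(n_stages, freq_hz):
    """Invert the relation to extract per-stage delay from a measured f."""
    return 1.0 / (2.0 * n_stages * freq_hz)

f = ring_oscillator_freq(7, 100e-6)   # hypothetical printed organic ring
print(f)  # ≈ 714 Hz
```

In practice the inversion is how printed-transistor reviews compare technologies: a measured oscillation frequency plus the stage count yields the effective stage delay, folding in both mobility and parasitic capacitance.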

  17. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis reuses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
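
    The rotation invariance claimed above comes from a standard property: rotating the input only changes the phase of a Zernike moment, so its magnitude is unchanged. A minimal sketch over a few weighted sample points (invented data; a real system sums over pixels inside the unit disk):

```python
# Zernike moment of weighted points and its rotation-invariant magnitude.
# Sample points are hypothetical; pixel sampling and the patent's term
# reuse optimizations are not reproduced.

import cmath
import math
from math import factorial

def radial_poly(n, m, rho):
    """Zernike radial polynomial R_nm(rho), defined for n - |m| even."""
    m = abs(m)
    total = 0.0
    for k in range((n - m) // 2 + 1):
        total += ((-1) ** k * factorial(n - k)
                  / (factorial(k)
                     * factorial((n + m) // 2 - k)
                     * factorial((n - m) // 2 - k))) * rho ** (n - 2 * k)
    return total

def zernike_moment(points, n, m):
    """Discrete Zernike moment Z_nm of weighted points in the unit disk."""
    acc = 0 + 0j
    for x, y, w in points:
        rho, theta = math.hypot(x, y), math.atan2(y, x)
        if rho <= 1.0:
            acc += w * radial_poly(n, m, rho) * cmath.exp(-1j * m * theta)
    return (n + 1) / math.pi * acc

def rotate(points, angle):
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y, w) for x, y, w in points]

pts = [(0.3, 0.1, 1.0), (-0.2, 0.5, 0.7), (0.6, -0.4, 0.4)]  # made-up samples
z = zernike_moment(pts, 4, 2)
z_rot = zernike_moment(rotate(pts, 0.8), 4, 2)
# The raw complex moments differ in phase, but the magnitudes match
print(abs(z), abs(z_rot))
```

Scale and translation invariance require the additional normalization step (centering and radius scaling of the segment) that the patent performs before moment computation.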

  18. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  19. Advantages of integrated and sustainability based assessment for metabolism based strategic planning of urban water systems.

    Science.gov (United States)

    Behzadian, Kourosh; Kapelan, Zoran

    2015-09-15

    Despite providing water-related services as their primary purpose, all relevant activities of an urban water system (UWS) require capital investment and operational expenditure, consume resources (e.g. materials and chemicals), and may increase negative environmental impacts (e.g. contaminant discharge, emissions to water and air). Performance assessment of such a metabolic system may require a holistic approach which encompasses various system elements and criteria. This paper analyses the impact of the integration of UWS components on metabolism-based performance assessment for future planning using a number of intervention strategies. It also explores the importance of sustainability-based criteria in the assessment of long-term planning. The two assessment approaches analysed here are: (1) planning for only the water supply system (WSS) as a part of the UWS and (2) planning for an integrated UWS including potable water, stormwater, wastewater, and water recycling. The WaterMet(2) model is used to simulate metabolic-type processes in the UWS and calculate quantitative performance indicators. The analysis is demonstrated on the problem of strategic-level planning of a real-world UWS to which optional intervention strategies are applied. The resulting performance is assessed using multiple criteria of both conventional and sustainability types, and the optional intervention strategies are then ranked using the Compromise Programming method. The results obtained show that the highly ranked intervention strategies in the integrated UWS are those supporting both water supply and stormwater/wastewater subsystems (e.g. rainwater harvesting and greywater recycling schemes), whilst these strategies are ranked low in the WSS, where those targeting improvement of water supply components only (e.g. rehabilitation of clean water pipes and addition of new water resources) are preferred instead. Results also demonstrate that both conventional and sustainability-type performance indicators
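
    The Compromise Programming ranking used above can be sketched in a few lines: each option is scored by its weighted L_p distance to the ideal point across normalized criteria, and options are ranked by increasing distance. The criteria values and weights below are invented for illustration, not the study's data.

```python
# Compromise Programming: rank options by weighted L_p distance to the
# ideal point over min-max normalized criteria. All data are illustrative.

def compromise_rank(options, weights, p=2):
    """Rank options (name -> criterion values, higher is better)."""
    names = list(options)
    ncrit = len(weights)
    best = [max(options[n][j] for n in names) for j in range(ncrit)]
    worst = [min(options[n][j] for n in names) for j in range(ncrit)]

    def distance(vals):
        total = 0.0
        for j in range(ncrit):
            span = (best[j] - worst[j]) or 1.0  # guard identical criteria
            total += (weights[j] * (best[j] - vals[j]) / span) ** p
        return total ** (1.0 / p)

    return sorted(names, key=lambda n: distance(options[n]))

# criteria: [reliability, GHG reduction, cost saving] - higher is better
strategies = {
    "rainwater harvesting": [0.8, 0.7, 0.6],
    "greywater recycling":  [0.7, 0.9, 0.5],
    "pipe rehabilitation":  [0.9, 0.2, 0.4],
}
ranking = compromise_rank(strategies, weights=[0.4, 0.4, 0.2])
print(ranking[0])  # the strategy closest to the ideal point ranks first
```

Varying p shifts the compromise: p = 1 sums shortfalls, p = 2 penalizes large single-criterion gaps more, and p → ∞ ranks by the worst criterion alone.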

  20. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. Most existing integrative analyses assume the homogeneity model, which postulates that different datasets share the same set of markers, and several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
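
    The building block of the marker-selection step above is the MCP itself. A hedged sketch of the penalty and its univariate thresholding operator follows; the lam and gamma values are illustrative, and the paper's full sparse group coordinate-descent algorithm over the AFT loss is not reproduced.

```python
# Minimax concave penalty (MCP) and its univariate thresholding operator
# for 0.5*(z - b)^2 + MCP(b; lam, gamma), assuming gamma > 1.
# Parameter values are illustrative.

def mcp_penalty(t, lam, gamma):
    """MCP(t): quadratic taper up to gamma*lam, constant beyond it."""
    t = abs(t)
    if t <= gamma * lam:
        return lam * t - t * t / (2.0 * gamma)
    return 0.5 * gamma * lam * lam

def mcp_threshold(z, lam, gamma):
    """Closed-form minimizer of 0.5*(z - b)^2 + MCP(b; lam, gamma)."""
    az = abs(z)
    sign = 1.0 if z >= 0 else -1.0
    if az <= lam:
        return 0.0                                 # small effects zeroed
    if az <= gamma * lam:
        return sign * (az - lam) / (1.0 - 1.0 / gamma)  # tapered shrinkage
    return z                # penalty is flat here: large effects unbiased

lam, gamma = 1.0, 3.0
print(mcp_threshold(0.8, lam, gamma))  # → 0.0 (selected out)
print(mcp_threshold(5.0, lam, gamma))  # → 5.0 (no shrinkage)
```

This is exactly the property that motivates MCP over the lasso for marker selection: weak signals are removed while strong signals escape the lasso's constant bias. The sparse group variant applies a penalty of this shape at both the group (gene across datasets) and within-group (individual dataset) levels.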