WorldWideScience

Sample records for integrated analysis based

  1. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

A GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by the external environment, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. To guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
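PRISM models are written in PRISM's own modelling language, but the kind of message-loss risk the abstract describes can be illustrated with a tiny probabilistic sketch. All numbers and function names below are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of a communication-failure risk model of the kind
# the abstract describes. Loss probabilities are invented for illustration.
def detection_failure_prob(p_loss, retries):
    """Probability that a position report and all its retries are lost,
    given an independent per-message loss probability."""
    return p_loss ** (retries + 1)

def failure_over_cycles(p_loss, retries, n_cycles):
    """Probability that the integrity check fails at least once
    over n_cycles independent check cycles."""
    p_fail = detection_failure_prob(p_loss, retries)
    return 1 - (1 - p_fail) ** n_cycles

# e.g. 10% message loss, 2 retries, 100 check cycles
risk = failure_over_cycles(0.1, 2, 100)
```

A real PRISM model would express the same chain as a discrete-time Markov chain and verify properties such as "P=? [F failure]" against it.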

  2. An Analysis of Delay-based and Integrator-based Sequence Detectors for Grid-Connected Converters

    DEFF Research Database (Denmark)

    Khazraj, Hesam; Silva, Filipe Miguel Faria da; Bak, Claus Leth

    2017-01-01

Detecting and separating positive and negative sequence components of the grid voltage or current is of vital importance in the control of grid-connected power converters, HVDC systems, etc. To this end, several techniques have been proposed in recent years. These techniques can be broadly classified into two main classes: integrator-based techniques and delay-based techniques. The complex-coefficient filter-based technique, the dual second-order generalized integrator-based method, and the multiple reference frame approach are the main members of the integrator-based sequence detectors, while the delayed-signal cancellation operators are the main members of the delay-based sequence detectors. The aim of this paper is to provide a theoretical and experimental comparative study between integrator- and delay-based sequence detectors. The theoretical analysis is conducted based on small-signal modelling...
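The delayed-signal cancellation operator mentioned above can be sketched in a few lines: in the complex αβ frame, combining the present sample with the sample delayed by a quarter of the fundamental period cancels the negative-sequence component. The signal amplitudes below are illustrative:

```python
import cmath
import math

def dsc_positive(v_now, v_quarter_delay):
    """Delayed-signal-cancellation operator in the complex αβ frame:
    returns the positive-sequence component from the present sample and
    the sample delayed by a quarter of the fundamental period T/4."""
    return 0.5 * (v_now + 1j * v_quarter_delay)

f = 50.0                 # fundamental frequency, Hz
T = 1.0 / f
w = 2 * math.pi * f

def grid_voltage(t, Vp=1.0, Vn=0.3):
    # unbalanced voltage: positive + negative sequence, complex αβ form
    return Vp * cmath.exp(1j * w * t) + Vn * cmath.exp(-1j * w * t)

t = 0.123
vp = dsc_positive(grid_voltage(t), grid_voltage(t - T / 4))
# vp recovers the positive-sequence part exp(jwt) alone
```

The quarter-period delay rotates the positive sequence by -90 degrees and the negative sequence by +90 degrees, so the weighted sum keeps one and cancels the other.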

  3. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

In this paper, the authors present an overall framework for the estimation of the failure probability of pipelines based on: the results of a deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via a Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability combines several different analyses to obtain: the critical crack length and depth, the failure probability of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. A model uncertainty analysis and an uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by a Bayesian method.
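The idea of folding observed failure data into a prior derived from structural analysis can be sketched with a conjugate Beta-Binomial update. This is an illustrative stand-in, not the paper's actual model, and all numbers are invented:

```python
# Illustrative sketch: combine a structural-analysis-based prior failure
# probability with observed failure data via a Beta-Binomial Bayesian
# update. Prior and data values are hypothetical.
def beta_update(alpha, beta, failures, trials):
    """Posterior Beta(alpha, beta) parameters after observing
    `failures` failures in `trials` inspected pipeline segments."""
    return alpha + failures, beta + (trials - failures)

def beta_mean(alpha, beta):
    """Posterior mean failure probability."""
    return alpha / (alpha + beta)

# Prior: structural integrity analysis suggests p ~ 0.01 (Beta(1, 99)).
a, b = beta_update(1.0, 99.0, failures=2, trials=400)
posterior_p = beta_mean(a, b)
```

The posterior mean shifts from the analysis-based prior toward the empirically observed failure rate as inspection data accumulate.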

  4. Research on Integrated Analysis Method for Equipment and Tactics Based on Intervention Strategy Discussion

    Institute of Scientific and Technical Information of China (English)

    陈超; 张迎新; 毛赤龙

    2012-01-01

As the complexity of information warfare increases, its intervention strategies need to be designed in an integrated environment. However, current research tends to break the internal relation between equipment and tactics, making it difficult to meet the requirements of their integrated analysis. In this paper, the research status quo of the integrated analysis of equipment and tactics is discussed first, some shortcomings of current methods are then summarized, and an evolution mechanism for the integrated analysis of equipment and tactics is finally given. Based on these, a framework for integrated analysis is proposed. The method's effectiveness is validated by an example.

  5. WebGimm: An integrated web-based platform for cluster analysis, functional analysis, and interactive visualization of results.

    Science.gov (United States)

    Joshi, Vineet K; Freudenberg, Johannes M; Hu, Zhen; Medvedovic, Mario

    2011-01-17

Cluster analysis methods have been extensively researched, but the adoption of new methods is often hindered by technical barriers in their implementation and use. WebGimm is a free cluster analysis web-service and an open source, general-purpose clustering web-server infrastructure designed to facilitate easy deployment of integrated cluster analysis servers based on clustering and functional annotation algorithms implemented in R. Integrated functional analyses and interactive browsing of both clustering structure and functional annotations provide a complete analytical environment for cluster analysis and interpretation of results. The Java Web Start client-based interface is modeled after the familiar cluster/treeview packages, making its use intuitive to a wide array of biomedical researchers. For biomedical researchers, WebGimm provides an avenue to access state-of-the-art clustering procedures. For bioinformatics methods developers, WebGimm offers a convenient avenue to deploy their newly developed clustering methods. The WebGimm server, software and manuals can be freely accessed at http://ClusterAnalysis.org/.

  6. Integrating forest inventory and analysis data into a LIDAR-based carbon monitoring system

    Science.gov (United States)

    Kristofer D. Johnson; Richard Birdsey; Andrew O Finley; Anu Swantaran; Ralph Dubayah; Craig Wayson; Rachel. Riemann

    2014-01-01

    Forest Inventory and Analysis (FIA) data may be a valuable component of a LIDAR-based carbon monitoring system, but integration of the two observation systems is not without challenges. To explore integration methods, two wall-to-wall LIDAR-derived biomass maps were compared to FIA data at both the plot and county levels in Anne Arundel and Howard Counties in Maryland...

  7. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

This paper investigates a framework for the real-time formation of autonomous vehicles using a potential field and a variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by a potential field can meet the former requirement, but result in a nonlinear system whose stability analysis is difficult. Our stability analysis is carried out on the error dynamic system. A transformation of coordinates from the inertial frame to the body frame lets the stability analysis focus on the structure of the formation instead of particular coordinates; the Jacobian of the reduced system can then be calculated. It can be proved that the formation is stable at the equilibrium point of the error dynamic system under the effect of a damping force. For efficient computation, a variational integrator is introduced, which reduces integration to solving algebraic equations. The forced Euler-Lagrange equation in discrete form is used to construct a forced variational integrator for vehicles in a potential field with obstacles. By applying the forced variational integrator to the computation of vehicle motion, real-time formation in an obstacle environment can be implemented. An algorithm based on the forced variational integrator is designed for a leader-follower formation.
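A minimal sketch of a forced variational integrator, under simplifying assumptions (single vehicle, unit mass, harmonic attractive potential, viscous damping as the forcing term), is the discrete Euler-Lagrange update below. This is not the paper's full formation-control scheme; all parameters are illustrative:

```python
# Minimal forced variational integrator sketch: discrete Euler-Lagrange
# update (Stormer-Verlet form) for a unit-mass particle in a harmonic
# potential V(q) = 0.5*k*q^2, with a damping force as the forcing term.
def potential_force(q, k=1.0):
    return -k * q  # gradient force from the attractive potential

def step(q_prev, q_curr, h, damping=0.1):
    """One discrete Euler-Lagrange step: q_{k+1} from q_{k-1}, q_k."""
    v_approx = (q_curr - q_prev) / h          # finite-difference velocity
    force = potential_force(q_curr) - damping * v_approx
    return 2 * q_curr - q_prev + h * h * force

h = 0.01
q_prev, q_curr = 1.0, 1.0                     # start at rest at q = 1
for _ in range(10000):                        # simulate 100 s
    q_prev, q_curr = q_curr, step(q_prev, q_curr, h)
# with damping, the trajectory settles toward the equilibrium q = 0
```

Each step only evaluates algebraic expressions, which is what makes the variational approach attractive for real-time computation.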

  8. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, whose purpose is to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, physical reasonableness, and reliability of PCA in dealing with redundant objectives are verified with the typical DTLZ5 test function and a multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into an aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth, and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and that the multi-point design requirements of the passenger aircraft are met. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
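The core trick, using PCA to expose a redundant objective, can be shown on toy data. The objective values below are synthetic (the third objective is deliberately constructed as a near-linear mix of the first two), not data from the paper:

```python
import numpy as np

# Toy illustration of PCA-based redundant-objective detection.
# Rows: candidate designs; columns: three objectives, where the third
# is (almost) a linear combination of the first two.
rng = np.random.default_rng(0)
f1 = rng.normal(size=200)
f2 = rng.normal(size=200)
f3 = 0.7 * f1 + 0.3 * f2 + 1e-3 * rng.normal(size=200)
F = np.column_stack([f1, f2, f3])

# PCA via eigen-decomposition of the objective covariance matrix
C = np.cov(F, rowvar=False)
eigvals = np.linalg.eigvalsh(C)      # ascending order
explained = eigvals / eigvals.sum()
# the smallest component carries ~no variance, so one objective is
# redundant and can be demoted to a constraint, as the paper describes
```

When the smallest explained-variance ratio is near zero, the objective space is effectively lower-dimensional.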

  9. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties, and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries, and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
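The keyword-subset-and-intersection idea at the heart of this approach is essentially set algebra over annotations. The protein identifiers and keywords below are invented for illustration:

```python
# Sketch of keyword-based set analysis in the PANDORA spirit: find
# subsets of proteins sharing an annotation keyword, then intersect them.
# Protein names and keywords are made up for illustration.
annotations = {
    "P1": {"kinase", "membrane"},
    "P2": {"kinase", "nucleus"},
    "P3": {"kinase", "membrane"},
    "P4": {"protease"},
}

def proteins_with(keyword):
    """All proteins annotated with the given keyword."""
    return {p for p, kws in annotations.items() if keyword in kws}

kinases = proteins_with("kinase")
membrane = proteins_with("membrane")
both = kinases & membrane   # proteins sharing two biological properties
```

Intersections like `both` correspond to the overlapping keyword subsets PANDORA draws as diagrams.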

  10. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules implementing recently proposed algorithms, as well as powerful manual modules, for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means), and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion: Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.
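Of the listed statistical methods, Fisher's criterion for feature selection is simple enough to sketch directly: a feature scores highly when the class means are far apart relative to the within-class scatter. The spectral intensity values below are invented:

```python
# Sketch of Fisher's criterion for feature selection, one of the nine
# statistical methods listed: score = (mean difference)^2 / (sum of
# within-class variances). Data values are invented for illustration.
def fisher_score(class_a, class_b):
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return (mean(class_a) - mean(class_b)) ** 2 / (var(class_a) + var(class_b))

# A well-separated spectral feature scores higher than an overlapping one.
separated = fisher_score([1.0, 1.1, 0.9], [2.0, 2.1, 1.9])
overlapping = fisher_score([1.0, 2.0, 1.5], [1.1, 2.1, 1.4])
```

Ranking spectral bins by this score is a common first filter before PCA or PLS-DA modelling.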

  11. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules implementing recently proposed algorithms, as well as powerful manual modules, for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means), and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested

  12. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    International Nuclear Information System (INIS)

    Festa, G; Andreani, C; Pietropaolo, A; Grazzi, F; Scherillo, A; Barzagli, E; Sutton, L F; Bognetti, L; Bini, A; Schooneveld, E

    2013-01-01

A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of the approach, using multiple and integrated neutron-based techniques, for musical instruments. Such samples represent an exciting research field within the broad scenario of cultural heritage, and may provide an interesting link between disciplines such as nuclear physics, metallurgy and acoustics. (paper)

  13. Human reliability analysis of performing tasks in plants based on fuzzy integral

    International Nuclear Information System (INIS)

    Washio, Takashi; Kitamura, Yutaka; Takahashi, Hideaki

    1991-01-01

Effective improvement of the human working conditions in nuclear power plants could enhance operational safety. Human reliability analysis (HRA) gives a methodological basis for such improvement, based on the evaluation of human reliability under various working conditions. This study investigates some difficulties of human reliability analysis using conventional linear models and recent fuzzy integral models, and provides some solutions to these difficulties. The following practical features of the proposed methods are confirmed in comparison with the conventional methods: (1) applicability to various types of tasks; (2) capability of evaluating complicated dependencies among working-condition factors; (3) a priori human reliability evaluation based on a systematic task analysis of human action processes; (4) a conversion scheme from indices representing human reliability to probability. (author)
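The fuzzy integral referred to here is typically the discrete Choquet integral, which can aggregate factor scores while expressing dependencies between factors through a non-additive measure. The factor names, scores, and measure values below are invented for illustration:

```python
# Sketch of a discrete Choquet (fuzzy) integral, the kind of aggregation
# used to model dependent working-condition factors. The factor scores
# and the fuzzy measure below are hypothetical.
def choquet(scores, measure):
    """scores: {factor: value in [0, 1]};
    measure: {frozenset of factors: weight in [0, 1]}.
    Sorts factors by score and integrates the score increments against
    the measure of the coalition of factors scoring at least that much."""
    items = sorted(scores.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for factor, value in items:
        total += (value - prev) * measure[frozenset(remaining)]
        prev = value
        remaining.discard(factor)
    return total

scores = {"training": 0.6, "stress": 0.2, "interface": 0.9}
measure = {
    frozenset({"training", "stress", "interface"}): 1.0,
    frozenset({"training", "interface"}): 0.8,
    frozenset({"interface"}): 0.5,
}
reliability_index = choquet(scores, measure)
```

Unlike a linear weighted sum, the measure on coalitions lets two factors jointly count for more (or less) than the sum of their individual weights, which is exactly the dependency-modelling capability point (2) above claims.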

  14. Integrating model checking with HiP-HOPS in model-based safety analysis

    International Nuclear Information System (INIS)

    Sharvia, Septavera; Papadopoulos, Yiannis

    2015-01-01

The ability to perform an effective and robust safety analysis on the design of modern safety-critical systems is crucial. Model-based safety analysis (MBSA) has been introduced in recent years to support the assessment of complex system design by focusing on the system model as the central artefact, and by automating the synthesis and analysis of failure-extended models. Model checking and failure logic synthesis and analysis (FLSA) are two prominent MBSA paradigms. Extensive research has placed emphasis on the development of these techniques, but discussion on their integration remains limited. In this paper, we propose a technique in which model checking and Hierarchically Performed Hazard Origin and Propagation Studies (HiP-HOPS) – an advanced FLSA technique – can be applied synergistically with benefit for the MBSA process. The application of the technique is illustrated through an example of a brake-by-wire system. - Highlights: • We propose a technique to integrate HiP-HOPS and model checking. • State machines can be systematically constructed from HiP-HOPS. • The strengths of different MBSA techniques are combined. • Demonstrated through modeling and analysis of a brake-by-wire system. • Root cause analysis is automated, and system dynamic behaviors are analyzed and verified.

  15. Analysis of Hybrid-Integrated High-Speed Electro-Absorption Modulated Lasers Based on EM/Circuit Co-simulation

    DEFF Research Database (Denmark)

    Johansen, Tom Keinicke; Krozer, Viktor; Kazmierski, C.

    2009-01-01

An improved electromagnetic (EM) simulation based approach has been developed for optimization of the electrical-to-optical (E/O) transmission properties of integrated electro-absorption modulated lasers (EMLs) aiming at 100 Gbit/s Ethernet applications. Our approach allows for an accurate analysis of the EML performance in a hybrid microstrip assembly. The established EM-based approach provides a design methodology for the future hybrid integration of the EML with its driving electronics.

  16. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    Science.gov (United States)

    Yang, F.; Dong, Z. H.

    2017-10-01

The integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software, and provides various services such as email, short messages, drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during setup, configuration and operation, which seriously affects usage. Because these errors are varied and may occur in different operation phases, stages, TCP/IP protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, providing strong theoretical and technological support for the running and communication of IIDS.

  17. Integrated vehicle-based safety systems (IVBSS) : light vehicle platform field operational test data analysis plan.

    Science.gov (United States)

    2009-12-22

This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progr...

  18. Integrated vehicle-based safety systems (IVBSS) : heavy truck platform field operational test data analysis plan.

    Science.gov (United States)

    2009-11-23

This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progra...

  19. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data.

    Science.gov (United States)

    Tomescu, Oana A; Mattanovich, Diethard; Thallinger, Gerhard G

    2014-01-01

Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis (CIA) is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition (GSVD) has shown its potential in the analysis of two transcriptome data sets. Integrative biclustering (IBC) applies biclustering to gene and protein data. Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms, in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life-cycle-stage-specific GO terms: sporozoites are associated with transcription and transport; merozoites with entry into the host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages.

  20. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations
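The multiple-collision decomposition described above can be illustrated in miniature: the uncollided component of a monoenergetic beam attenuates exponentially, and the once-collided source is an integral of that uncollided flux along the path, evaluated here with a simple midpoint quadrature standing in for the paper's tailored quadrature schemes. All cross-section values are invented:

```python
import math

# Sketch of the multiple-collision decomposition idea in slab geometry.
# Cross-section values (per unit length) are invented for illustration.
def uncollided(flux0, sigma_t, distance):
    """Uncollided flux after traversing `distance` through a medium
    with total cross-section `sigma_t` (simple exponential attenuation)."""
    return flux0 * math.exp(-sigma_t * distance)

def first_collision_source(flux0, sigma_t, sigma_s, distance, steps=10000):
    """Integrated once-collided source along the path: scattering
    cross-section times the uncollided flux, midpoint quadrature."""
    h = distance / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += sigma_s * uncollided(flux0, sigma_t, x) * h
    return total

phi_u = uncollided(1.0, sigma_t=0.5, distance=4.0)
src_1 = first_collision_source(1.0, sigma_t=0.5, sigma_s=0.4, distance=4.0)
```

Higher collision orders repeat the same pattern, integrating each collided source against the transport kernel, which is the structure the direct-integration method exploits.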

  1. A methodology for developing high-integrity knowledge base using document analysis and ECPN matrix analysis with backward simulation

    International Nuclear Information System (INIS)

    Park, Joo Hyun

    1999-02-01

When transitions occur in large systems such as nuclear power plants (NPPs) or industrial process plants, it is often difficult to diagnose them. Various computer-based operator-aiding systems have been developed to help operators diagnose plant transitions. In procedures for developing knowledge-base systems such as operator-aiding systems, knowledge acquisition and knowledge-base verification are the core activities. This dissertation describes a knowledge acquisition method and a knowledge-base verification method for developing high-integrity knowledge-base systems for NPP expert systems. Knowledge acquisition is one of the most difficult and time-consuming activities in developing knowledge-base systems. There are two kinds of knowledge acquisition methods in view of knowledge sources. One is acquisition from human experts; this method, however, is not adequate for acquiring the knowledge of NPP expert systems because the number of experts is insufficient. In this work, we propose a novel knowledge acquisition method based on document analysis. Through this method, the knowledge base can be built correctly, rapidly, and partially automatically, which is especially useful when it is difficult to find domain experts. The reliability of knowledge-base systems depends on the quality of their knowledge base. Petri Nets have been used to verify knowledge bases owing to their formal outputs. Methods using Petri Nets, however, are difficult to apply to large and complex knowledge bases because the net becomes very large and complex. Also, with Petri Nets it is difficult to find proper input patterns that make anomalies occur. To overcome this difficulty, anomaly-candidate detection methods are developed in this work based on Extended CPN (ECPN) matrix analysis. This work also defines the backward simulation of CPN to find compact input patterns for anomaly detection, which starts simulation from the anomaly candidates.

  2. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine

    International Nuclear Information System (INIS)

    Nery, Emilia Witkowska; Kubota, Lauro T.

    2016-01-01

The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles, and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled with the use of readily available materials such as paper, wax, lamination sheets, bleach, etc. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue composed of electrodes sensitive to sodium, calcium and ammonia, plus a cross-sensitive anion-selective electrode, was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict the pH and, in part, also the alcohol content of the tested beers. In the next step, the sample volume was minimized by the use of paper sample pads and measurement in flow conditions. To test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca2+/Mg2+, K+/Na+) electrodes was applied to the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only describes new
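The "performance close to theoretical" claim refers to the ideal Nernstian response of an ion-selective electrode, which is easy to sketch: the potential varies with the logarithm of ion activity, with a slope of roughly 59.2 mV per decade for a monovalent ion at 25 °C. The standard potential value below is an arbitrary placeholder:

```python
import math

# Sketch of the ideal (Nernstian) response an ion-selective
# potentiometric electrode approaches: E = E0 + (RT/zF) * ln(a).
# E0 is an arbitrary placeholder; constants are standard values.
R = 8.314        # gas constant, J/(mol*K)
T = 298.15       # temperature, K (25 C)
F = 96485.0      # Faraday constant, C/mol

def electrode_potential(e0, z, activity):
    """Electrode potential (V) for ion charge z at the given activity."""
    return e0 + (R * T / (z * F)) * math.log(activity)

# A tenfold activity change shifts a monovalent-ion electrode by
# ~59.2 mV, the textbook Nernstian slope at 25 C.
slope_mV = (electrode_potential(0.0, 1, 1e-3)
            - electrode_potential(0.0, 1, 1e-4)) * 1000
```

Characterizing each paper-based electrode means checking how closely its measured slope approaches this theoretical value.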

  3. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine

    Energy Technology Data Exchange (ETDEWEB)

    Nery, Emilia Witkowska, E-mail: ewitkowskanery@ichf.edu.pl [Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, P.O. Box 6154, 13084-971 Campinas, SP (Brazil); National Institute of Science and Technology in Bioanalytics, Institute of Chemistry – UNICAMP, P.O. Box 6154, Campinas (Brazil); Kubota, Lauro T. [Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, P.O. Box 6154, 13084-971 Campinas, SP (Brazil); National Institute of Science and Technology in Bioanalytics, Institute of Chemistry – UNICAMP, P.O. Box 6154, Campinas (Brazil)

    2016-04-28

The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles, and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled with the use of readily available materials such as paper, wax, lamination sheets, bleach, etc. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue composed of electrodes sensitive to sodium, calcium and ammonia, plus a cross-sensitive anion-selective electrode, was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict the pH and, in part, also the alcohol content of the tested beers. In the next step, the sample volume was minimized by the use of paper sample pads and measurement in flow conditions. To test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca2+/Mg2+, K+/Na+) electrodes was applied to the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only

  4. Nature-based integration

    DEFF Research Database (Denmark)

    Pitkänen, Kati; Oratuomi, Joose; Hellgren, Daniela

    Increased attention to, and careful planning of, the integration of migrants into Nordic societies is ever more important. Nature-based integration is a new solution to respond to this need. This report presents the results of a Nordic survey and workshop and illustrates current practices of nature...... based integration by case study descriptions from Denmark, Sweden, Norway and Finland. Across the Nordic countries, several practical projects and initiatives have been launched to promote the benefits of nature in integration, and there is also growing academic interest in the topic. Nordic countries have...... the potential of becoming real forerunners in nature-based integration even at the global scale....

  5. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

    The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents incredible potential for the enhancement and expansion of educational and training programs: starting from university students in their early years, passing to new ATLAS PhD students and postdoctoral researchers, to those senior analysers and professors who want to renew their contact with data analysis or to bring a friendly yet very powerful open-source tool into the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility to go a step further in ATLAS's search for integration between several CERN projects in the field of education and training, developing new computing solutions along the way.

  6. Control Synthesis for the Flow-Based Microfluidic Large-Scale Integration Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan; Pop, Paul; Madsen, Jan

    2013-01-01

    In this paper we are interested in flow-based microfluidic biochips, which are able to integrate the necessary functions for biochemical analysis on-chip. In these chips, the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units, such as mi......In this paper we are interested in flow-based microfluidic biochips, which are able to integrate the necessary functions for biochemical analysis on-chip. In these chips, the flow of liquid is manipulated using integrated microvalves. By combining several microvalves, more complex units...

  7. International Space Station Configuration Analysis and Integration

    Science.gov (United States)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  8. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    Science.gov (United States)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based workstations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  9. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.
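The greedy region-merging idea behind the BPT step above can be sketched in miniature. The following pure-Python toy (an illustrative sketch, not the authors' implementation) builds a merge hierarchy over a 1-D intensity signal, starting from an over-segmentation in which every sample is its own region, and then filters the hierarchy with a simple homogeneity threshold as a stand-in for the optimal-cut criteria:

```python
def build_bpt(values):
    """Greedily merge adjacent regions with the closest means.

    Returns the merge record as (cost, left_region, right_region),
    where a region is a tuple of member indices.
    """
    regions = [((i,), v) for i, v in enumerate(values)]  # (members, mean)
    merges = []
    while len(regions) > 1:
        # Cost of merging each adjacent pair = difference of region means.
        costs = [abs(regions[i][1] - regions[i + 1][1])
                 for i in range(len(regions) - 1)]
        k = costs.index(min(costs))
        (ma, va), (mb, vb) = regions[k], regions[k + 1]
        members = ma + mb
        mean = (va * len(ma) + vb * len(mb)) / len(members)
        merges.append((costs[k], ma, mb))
        regions[k:k + 2] = [(members, mean)]
    return merges

def cut_hierarchy(values, threshold):
    """Filter the hierarchy: stop applying merges once cost exceeds threshold."""
    regions = [[i] for i in range(len(values))]
    for cost, ma, mb in build_bpt(values):
        if cost > threshold:
            break
        regions = [r for r in regions if tuple(r) not in (ma, mb)]
        regions.append(list(ma) + list(mb))
    return sorted(regions)
```

For `[1.0, 1.1, 5.0, 5.2, 9.0]` with threshold 0.5, the cut yields three homogeneous regions, mirroring how a BPT collapses an over-segmentation into meaningful objects.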

  10. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object-Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of the proposed method.

  11. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O`Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process in which technologies and systems are rated and assessments are made based on performance measures and regulatory and technical requirements. The results are auditable and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic-contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  12. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of the piping systems for nuclear power plants has grown in scale and in quantity. In addition, higher-quality analysis is nowadays regarded as of major importance from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a data base system: all information is concentrated. 2. To minimize the manual process in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of the above philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information of the data base. Piping structural analysis can be performed using the analysis subsystem. Isometric piping drawings, mode shapes, etc. can be plotted using the plotting subsystem. A complete analysis report can be produced without manual processing through the report subsystem. (author)

  13. Energy saving analysis and management modeling based on index decomposition analysis integrated energy saving potential method: Application to complex chemical processes

    International Nuclear Information System (INIS)

    Geng, Zhiqiang; Gao, Huachao; Wang, Yanqing; Han, Yongming; Zhu, Qunxiong

    2017-01-01

    Highlights: • An integrated framework that combines IDA with the energy-saving potential method is proposed. • An energy saving analysis and management framework for complex chemical processes is obtained. • The proposed method is efficient for the energy optimization and carbon emissions of complex chemical processes. - Abstract: Energy saving and management of complex chemical processes play a crucial role in sustainable development. In order to analyze the effect that technology, management level, and production structure have on energy efficiency and energy saving potential, this paper proposes a novel integrated framework that combines index decomposition analysis (IDA) with the energy saving potential method. The IDA method can effectively obtain the levels of energy activity, energy hierarchy and energy intensity in a data-driven way, reflecting the impact of energy usage. The energy saving potential method can verify the correctness of the improvement direction proposed by the IDA method. Meanwhile, energy efficiency improvement, energy consumption reduction and energy savings can be visually discovered with the proposed framework. A demonstration analysis of ethylene production has verified the practicality of the proposed method. Moreover, we can obtain the corresponding improvements for the ethylene production based on the demonstration analysis. The energy efficiency index and the energy saving potential of the worst months can be increased by 6.7% and 7.4%, respectively, and the carbon emissions can be reduced by 7.4–8.2%.
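The additive LMDI variant of index decomposition analysis splits a change in total energy use E = Σᵢ Q·Sᵢ·Iᵢ into activity, structure and intensity effects using logarithmic-mean weights, and the three effects sum exactly to the total change. A minimal sketch (the decomposition formula is the standard LMDI-I; the numbers are illustrative, not the paper's ethylene data):

```python
from math import log

def logmean(a, b):
    """Logarithmic mean L(a, b); equals a when a == b."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi(q0, s0, i0, q1, s1, i1):
    """Additive LMDI-I decomposition of the change in energy use
    E = sum_i Q * S_i * I_i into (activity, structure, intensity) effects."""
    e0 = [q0 * s * i for s, i in zip(s0, i0)]   # sectoral energy, base year
    e1 = [q1 * s * i for s, i in zip(s1, i1)]   # sectoral energy, target year
    w = [logmean(a, b) for a, b in zip(e1, e0)]  # log-mean weights
    act = sum(wi * log(q1 / q0) for wi in w)
    strc = sum(wi * log(b / a) for wi, a, b in zip(w, s0, s1))
    inten = sum(wi * log(b / a) for wi, a, b in zip(w, i0, i1))
    return act, strc, inten
```

Because wᵢ·[ln(q₁/q₀) + ln(s₁ᵢ/s₀ᵢ) + ln(i₁ᵢ/i₀ᵢ)] = e₁ᵢ − e₀ᵢ, the decomposition is perfect (no residual), which is the property that makes LMDI popular for energy-saving attribution.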

  14. Fuzzy knowledge bases integration based on ontology

    OpenAIRE

    Ternovoy, Maksym; Shtogrina, Olena

    2012-01-01

    The paper describes an approach for fuzzy knowledge base integration with the usage of an ontology. This approach is based on the usage of a metadata base for the integration of different knowledge bases with a common ontology. The design process of the metadata base is described.

  15. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

    The inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. The performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal hydraulic separate effect tests and integral effect tests was developed for thermal hydraulic model development and system code validation. The safety characteristics as well as the safety issues of the integral reactor have been identified during the study, and will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back into the SMART conceptual design. The performance and safety analysis code systems as well as the preliminary safety analysis methodology developed in this study will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  16. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents some of the methods used to integrate the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships of component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristics. This paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside the products were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product conforming to the relation function structure was actually created.

  17. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents some of the methods used to integrate the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships of component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristics. This paper presents a design algorithm for component integration. This algorithm was applied to actual products, and the components inside the products were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product conforming to the relation function structure was actually created.

  18. CAD-Based Modeling of Advanced Rotary Wing Structures for Integrated 3-D Aeromechanics Analysis

    Science.gov (United States)

    Staruk, William

    This dissertation describes the first comprehensive use of integrated 3-D aeromechanics modeling, defined as the coupling of 3-D solid finite element method (FEM) structural dynamics with 3-D computational fluid dynamics (CFD), for the analysis of a real helicopter rotor. The development of this new methodology (a departure from how rotor aeroelastic analysis has been performed for 40 years), its execution on a real rotor, and the fundamental understanding of aeromechanics gained from it, are the key contributions of this dissertation. This work also presents the first CFD/CSD analysis of a tiltrotor in edgewise flight, revealing many of its unique loading mechanisms. The use of 3-D FEM, integrated with a trim solver and aerodynamics modeling, has the potential to enhance the design of advanced rotors by overcoming fundamental limitations of current generation beam-based analysis tools and offering integrated internal dynamic stress and strain predictions for design. Two primary goals drove this research effort: 1) developing a methodology to create 3-D CAD-based brick finite element models of rotors including multibody joints, controls, and aerodynamic interfaces, and 2) refining X3D, the US Army's next generation rotor structural dynamics solver featuring 3-D FEM within a multibody formulation with integrated aerodynamics, to model a tiltrotor in the edgewise conversion flight regime, which drives critical proprotor structural loads. Prior tiltrotor analysis has primarily focused on hover aerodynamics with rigid blades or forward flight whirl-flutter stability with simplified aerodynamics. The first goal was met with the development of a detailed methodology for generating multibody 3-D structural models, starting from CAD geometry, continuing to higher-order hexahedral finite element meshing, to final assembly of the multibody model by creating joints, assigning material properties, and defining the aerodynamic interface. Several levels of verification and

  19. Microprocessor-based integrated LMFBR core surveillance. Pt. 2

    International Nuclear Information System (INIS)

    Elies, V.

    1985-12-01

    This report is the result of the KfK part of a joint study by KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance based on the integration of separate surveillance techniques. After a description of the experimental results gained so far with the different surveillance techniques, it is shown which kinds of correlation can be performed using the evaluation results obtained from the individual surveillance systems. The main part of this report contains the systems analysis of a microcomputer-based system integrating different surveillance methods. After an analysis of the hardware requirements, a hardware structure for the integrated system is proposed. The software structure is then described, both for the subsystems performing the different surveillance algorithms and for the system that performs the correlation, thus deriving additional information from the individual results. (orig.) [de

  20. Integration Strategy Is a Key Step in Network-Based Analysis and Dramatically Affects Network Topological Properties and Inferring Outcomes

    Science.gov (United States)

    Jin, Nana; Wu, Deng; Gong, Yonghui; Bi, Xiaoman; Jiang, Hong; Li, Kongning; Wang, Qianghu

    2014-01-01

    An increasing number of experiments have been designed to detect intracellular and intercellular molecular interactions. Based on these molecular interactions (especially protein interactions), molecular networks have been built for use in several typical applications, such as the discovery of new disease genes and the identification of drug targets and molecular complexes. Because the data are incomplete and a considerable number of false-positive interactions exist, protein interactions from different sources are commonly integrated in network analyses to build a stable molecular network. Although various types of integration strategies are being applied in current studies, the topological properties of the networks produced by these different integration strategies, and especially the typical applications based on them, have not been rigorously evaluated. In this paper, systematic analyses were performed to evaluate 11 frequently used methods covering two types of integration strategies: empirical and machine learning methods. The topological properties of the networks of these different integration strategies were found to differ significantly. Moreover, these networks were found to dramatically affect the outcomes of typical applications, such as disease gene prediction, drug target detection, and molecular complex identification. The analysis presented in this paper could provide an important basis for future network-based biological research. PMID:25243127
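The basic trade-off between integration strategies can be illustrated with toy data: a union of interaction sources maximizes coverage (and keeps possible false positives), while an intersection keeps only replicated interactions, and the two choices already yield different topological properties. The interactions below are hypothetical, not from any curated database:

```python
def degrees(edges):
    """Node degree counts for an undirected edge set (consistent ordering)."""
    d = {}
    for a, b in edges:
        d[a] = d.get(a, 0) + 1
        d[b] = d.get(b, 0) + 1
    return d

# Two hypothetical interaction sources over proteins A-D.
src1 = {("A", "B"), ("B", "C"), ("C", "D")}
src2 = {("A", "B"), ("B", "D"), ("C", "D")}

union = src1 | src2        # permissive strategy: more coverage, more noise
intersect = src1 & src2    # strict strategy: only replicated interactions
```

Even on four nodes, the union network has a hub (B, degree 3) that the intersection network lacks entirely, the kind of topological difference that then propagates into disease gene or drug target predictions.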

  1. Dictionary-based image reconstruction for superresolution in integrated circuit imaging.

    Science.gov (United States)

    Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim

    2015-06-01

    Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.

  2. Continuous integration congestion cost allocation based on sensitivity

    International Nuclear Information System (INIS)

    Wu, Z.Q.; Wang, Y.N.

    2004-01-01

    Congestion cost allocation is a very important topic in congestion management. Allocation methods based on the Aumann-Shapley value use discrete numerical integration, which requires solving the incremental OPF many times and as such is not suitable for practical application to large-scale systems. The optimal solution and the tendency of its sensitivity to change during congestion removal using a DC optimal power flow (OPF) process are analysed. A simple continuous integration method based on the sensitivity is proposed for congestion cost allocation. The proposed sensitivity analysis method needs less computation time than the method based on the quadratic method and inner-point iteration. The proposed congestion cost allocation method uses continuous integration rather than discrete numerical integration. The method does not need to solve the incremental OPF solutions, which allows its use in large-scale systems. The method can also be used for AC OPF congestion management. (author)
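The Aumann-Shapley allocation underlying both approaches charges each user i the amount dᵢ · ∫₀¹ ∂C/∂dᵢ(t·d) dt, i.e. its demand times its marginal cost integrated along the scaling path t·d, which by construction recovers the full congestion cost C(d) when C(0) = 0. A numerical sketch with an illustrative coupled quadratic cost (not a DC OPF model):

```python
def cost(d):
    """Illustrative coupled quadratic congestion cost with C(0) = 0."""
    d1, d2 = d
    return 2 * d1**2 + d2**2 + d1 * d2

def grad(d, eps=1e-6):
    """Central-difference sensitivities of the cost to each demand."""
    out = []
    for i in range(len(d)):
        up = list(d); up[i] += eps
        dn = list(d); dn[i] -= eps
        out.append((cost(up) - cost(dn)) / (2 * eps))
    return out

def aumann_shapley(d, steps=1000):
    """Midpoint-rule integration of each sensitivity along the path t*d."""
    charges = [0.0] * len(d)
    for k in range(steps):
        t = (k + 0.5) / steps
        g = grad([t * x for x in d])
        for i in range(len(d)):
            charges[i] += d[i] * g[i] / steps
    return charges
```

Because Σᵢ dᵢ·∂C/∂dᵢ(t·d) is exactly dC(t·d)/dt, the charges sum to the total cost, which is the cost-recovery property that makes the method attractive for congestion pricing.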

  3. Decision-Based Design Integrating Consumer Preferences into Engineering Design

    CERN Document Server

    Chen, Wei; Wassenaar, Henk Jan

    2013-01-01

    Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design. Once the related fundamentals of decision theory, economic analysis, and econometric modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: •A rigorous framework for integrating the interests of both producers and consumers in engineering design, •Analytical techniques of consumer choice model...

  4. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel-extraction-based integral imaging with a controllable viewing direction. The proposed integral imaging can provide viewers with three-dimensional (3D) images within a very small viewing angle. The viewing angle and the viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. A theoretical analysis and a 3D display experiment of the viewing-direction-controllable integral imaging are carried out. The experimental results verify the correctness of the theory. A 3D display based on this integral imaging can protect the viewer’s privacy and has huge potential for televisions showing multiple 3D programs at the same time. (paper)
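The pixel-extraction step itself is simple to sketch: taking the pixel at a fixed offset (u, v) from every elemental image assembles one directional view, so choosing which offsets to extract chooses the viewing direction. The grid sizes below are illustrative, not the paper's optics:

```python
def extract_view(eia, ei_size, u, v):
    """eia: 2-D list of pixels holding a grid of elemental images,
    each ei_size x ei_size. Returns the sub-aperture view for offset (u, v)."""
    rows = len(eia) // ei_size
    cols = len(eia[0]) // ei_size
    # Pick pixel (u, v) out of each elemental image.
    return [[eia[r * ei_size + u][c * ei_size + v] for c in range(cols)]
            for r in range(rows)]
```

For a 4x4 array holding four 2x2 elemental images, `extract_view(eia, 2, 0, 0)` gathers the top-left pixel of each elemental image into one 2x2 view; varying (u, v) sweeps the viewing direction.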

  5. Harmonic analysis in integrated energy system based on compressed sensing

    International Nuclear Information System (INIS)

    Yang, Ting; Pen, Haibo; Wang, Dan; Wang, Zhaoxia

    2016-01-01

    Highlights: • We propose a harmonic/inter-harmonic analysis scheme based on compressed sensing theory. • The sparseness property of harmonic signals in electrical power systems is proved. • The ratio formula for the sparsity of the fundamental and harmonic components is presented. • A Spectral Projected Gradient-Fundamental Filter reconstruction algorithm is proposed. • SPG-FF enhances the precision of harmonic detection and signal reconstruction. - Abstract: The advent of Integrated Energy Systems has enabled various distributed energy sources to access the system through different power electronic devices, making the harmonic environment more complex. Harmonic detection and analysis methods of low complexity and high precision are needed to improve power quality. To overcome the large data storage requirements and high compression complexity of sampling under the Nyquist framework, this paper presents a harmonic analysis scheme based on compressed sensing theory. The proposed scheme performs compressive sampling, signal reconstruction and harmonic detection simultaneously. In the proposed scheme, the sparsity of the harmonic signals in the Discrete Fourier Transform (DFT) basis is numerically calculated first, followed by a proof that the necessary conditions for compressed sensing are satisfied. Binary sparse measurement is then leveraged to reduce the storage space of the sampling unit. In the recovery process, a novel reconstruction algorithm called the Spectral Projected Gradient with Fundamental Filter (SPG-FF) algorithm is proposed to enhance the reconstruction precision. An actual microgrid system is used as a simulation example. The experimental results show that the proposed scheme effectively enhances the precision of harmonic and inter-harmonic detection with low computing complexity, and has good
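The sparsity premise the scheme relies on is easy to verify numerically: a power signal composed of a fundamental plus a few harmonics concentrates its DFT energy in a handful of bins, which is what makes compressive sampling viable. A pure-Python check with an illustrative sampling setup (not the paper's microgrid data):

```python
import cmath
import math

N, fs = 256, 1600.0  # illustrative: 256 samples at 1.6 kHz (bin width 6.25 Hz)
t = [n / fs for n in range(N)]
sig = [math.sin(2 * math.pi * 50 * x)          # 50 Hz fundamental
       + 0.3 * math.sin(2 * math.pi * 150 * x)  # 3rd harmonic
       + 0.1 * math.sin(2 * math.pi * 250 * x)  # 5th harmonic
       for x in t]

def dft(x):
    """Naive O(N^2) discrete Fourier transform, enough for this check."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * math.pi * m * k / n) for k in range(n))
            for m in range(n)]

spec = [abs(c) for c in dft(sig)]
peak = max(spec)
# Count bins carrying more than 5% of the peak magnitude.
significant = sum(1 for a in spec if a > 0.05 * peak)
```

With 50, 150 and 250 Hz falling exactly on bins 8, 24 and 40, only six of the 256 bins (three frequencies and their conjugate mirrors) are significant, i.e. the signal is highly sparse in the DFT basis.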

  6. Sensitivity Analysis Based on Markovian Integration by Parts Formula

    Directory of Open Access Journals (Sweden)

    Yongsheng Hang

    2017-10-01

    Full Text Available Sensitivity analysis is widely applied in financial risk management and engineering; it describes the variations brought about by changes of parameters. Since the integration by parts technique for Markov chains has been well developed in recent years, in this paper we apply it to the computation of sensitivity and show closed-form expressions for two commonly used time-continuous Markovian models. By comparison, we conclude that our approach outperforms the existing technique of computing sensitivity on Markovian models.

  7. A multilayered integrated sensor for three-dimensional, micro total analysis systems

    International Nuclear Information System (INIS)

    Xiao, Jing; Song, Fuchuan; Seo, Sang-Woo

    2013-01-01

    This paper presents a layer-by-layer integration approach of different functional devices and demonstrates a heterogeneously integrated optical sensor featuring a micro-ring resonator and a high-speed thin-film InGaAs-based photodetector co-integrated with a microfluidic droplet generation device. A thin optical device structure allows a seamless integration with other polymer-based devices on a silicon platform. The integrated sensor successfully demonstrates its transient measurement capability of two-phase liquid flow in a microfluidic droplet generation device. The proposed approach represents an important step toward fully integrated micro total analysis systems. (paper)

  8. Analysis of Optimal Operation of an Energy Integrated Distillation Plant

    DEFF Research Database (Denmark)

    Li, Hong Wen; Hansen, C.A.; Gani, Rafiqul

    2003-01-01

    The efficiency of manufacturing systems can be significantly increased through the diligent application of control based on mathematical models, thereby enabling tighter integration of decision making with systems operation. In the present paper, analysis of optimal operation of an energy integrated...

  9. Integrated, paper-based potentiometric electronic tongue for the analysis of beer and wine.

    Science.gov (United States)

    Nery, Emilia Witkowska; Kubota, Lauro T

    2016-04-28

    The following manuscript details the stages of construction of a novel paper-based electronic tongue with an integrated Ag/AgCl reference, which can operate using a minimal amount of sample (40 μL). First, we optimized the fabrication procedure of the silver electrodes, testing a set of different methodologies (electroless plating, use of silver nanoparticles and commercial silver paints). Later, a novel, integrated electronic tongue system was assembled with the use of readily available materials such as paper, wax, lamination sheets, bleach etc. The new system was thoroughly characterized, and the ion-selective potentiometric sensors presented performance close to theoretical. An electronic tongue, composed of electrodes sensitive to sodium, calcium and ammonia and a cross-sensitive, anion-selective electrode, was used to analyze 34 beer samples (12 types, 19 brands). This system was able to discriminate beers from different brands and types, and to indicate the presence of stabilizers and antioxidants, dyes, or even unmalted cereals and carbohydrates added to the fermentation wort. Samples could be classified by type of fermentation (low, high), and the system was able to predict the pH and, in part, also the alcohol content of the tested beers. In the next step the sample volume was minimized by the use of paper sample pads and measurement in flow conditions. In order to test the impact of this advancement, a four-electrode system with cross-sensitive (anion-selective, cation-selective, Ca(2+)/Mg(2+), K(+)/Na(+)) electrodes was applied for the analysis of 11 types of wine (4 types of grapes, red/white, 3 countries). The proposed matrix was able to group wines produced from different varieties of grapes (Chardonnay, Americanas, Malbec, Merlot) using only 40 μL of sample. Apart from that, storage stability studies were performed using a multimeter, thereby showing that not only fabrication but also detection can be accomplished by means of off-the-shelf components. This manuscript not only describes new
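"Performance close to theoretical" for a potentiometric ion-selective electrode means a response slope near the Nernstian value, about 59.2 mV per decade of activity for a monovalent ion at 25 °C. A small sketch of that benchmark calculation (standard electrochemistry, not data from the paper):

```python
import math

# Physical constants (CODATA values).
R = 8.314462618      # gas constant, J/(mol*K)
F = 96485.33212      # Faraday constant, C/mol
T = 298.15           # 25 degrees C in kelvin

def nernst_slope_mV_per_decade(z):
    """Theoretical Nernstian response slope for an ion of charge z:
    2.303*R*T/(z*F), expressed in millivolts per decade of activity."""
    return 1000 * math.log(10) * R * T / (z * F)
```

A measured slope of, say, 57-59 mV/decade for Na⁺ or K⁺ (z = 1), or about half that for Ca²⁺ (z = 2), is what "close to theoretical" refers to for such sensors.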

  10. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool used to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. The program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February 1987. Since that time, many user comments and enhancements have been incorporated, making the program much more powerful and user-friendly. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance.
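    As a toy illustration of the capability the abstract mentions (cut set generation and quantification for fault trees), the sketch below derives the minimal cut sets of a small hypothetical tree and quantifies the top event with the rare-event approximation. The tree, event names, and probabilities are invented for illustration and are not taken from IRRAS.

```python
import math
from itertools import product

# Hypothetical fault tree: gates map to (type, children); leaves are
# basic events. None of this is IRRAS data.
tree = {
    "TOP":   ("AND", ["PUMPS", "VALVE"]),
    "PUMPS": ("OR",  ["P1", "P2"]),
}
prob = {"P1": 1e-2, "P2": 2e-2, "VALVE": 5e-3}

def cut_sets(node):
    """Expand a node into its (not yet minimal) cut sets, MOCUS-style."""
    if node not in tree:                       # basic event
        return [frozenset([node])]
    kind, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                           # any child's cut set suffices
        return [s for sets in child_sets for s in sets]
    # AND: merge one cut set taken from every child (cross product)
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Discard any cut set that strictly contains another."""
    return [s for s in sets if not any(t < s for t in sets)]

mcs = minimal(cut_sets("TOP"))
# Rare-event approximation: top event ~ sum over minimal cut sets of the
# product of their basic-event probabilities.
top_prob = sum(math.prod(prob[e] for e in s) for s in mcs)
```

For these made-up numbers the two minimal cut sets are {P1, VALVE} and {P2, VALVE}, giving a top-event probability of about 1.5e-4.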

  11. PHIDIAS: a pathogen-host interaction data integration and analysis system

    OpenAIRE

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated from peer-reviewed literature. PHIDIAS is publicly available at http://www.phidias.us.

  12. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across society a decade ago. This opens a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of persons with matched information), which requires integrative analysis in biology and biomedicine and calls for the development of data integration methods to address the shift from population-guided to individual-guided investigations. Data integration is an effective concept for solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches for biological data follow two modes: one is the "bottom-up integration" mode with follow-up manual integration, and the other is the "top-down integration" mode with follow-up in silico integration. This paper first summarizes combinatory analysis approaches, giving a candidate protocol for designing biological experiments for effective integrative studies on genomics, and then surveys data fusion approaches, giving helpful instruction on computational model development for detecting biological significance; these have also provided new data resources and analysis tools to support precision medicine based on big biomedical data. Finally, problems and future directions are highlighted for the integrative analysis of omics big data.

  13. Lectures on functional analysis and the Lebesgue integral

    CERN Document Server

    Komornik, Vilmos

    2016-01-01

    This textbook, based on three series of lectures held by the author at the University of Strasbourg, presents functional analysis in a non-traditional way by generalizing elementary theorems of plane geometry to spaces of arbitrary dimension. This approach leads naturally to the basic notions and theorems. Most results are illustrated by the small ℓp spaces. The Lebesgue integral, meanwhile, is treated via the direct approach of Frigyes Riesz, whose constructive definition of measurable functions leads to optimal, clear-cut versions of the classical theorems of Fubini-Tonelli and Radon-Nikodým. Lectures on Functional Analysis and the Lebesgue Integral presents the most important topics for students, with short, elegant proofs. The exposition style follows the Hungarian mathematical tradition of Paul Erdős and others. The order of the first two parts, functional analysis and the Lebesgue integral, may be reversed. In the third and final part they are combined to study various spaces of continuous and integ...

  14. Integration of ROOT notebook as an ATLAS analysis web-based tool in outreach and public data release projects

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    Integration of the ROOT data analysis framework with the Jupyter Notebook technology offers the potential to enhance and expand educational and training programs. It can be beneficial for university students in their early years, new PhD students and post-doctoral researchers, as well as for senior researchers and teachers who want to refresh their data analysis skills or introduce a friendlier and yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility of going a step further in the ATLAS effort to make use of several CERN projects in the field of education and training, developing new computing solutions along the way.

  15. Cross-Border Trade: An Analysis of Trade and Market Integration ...

    African Journals Online (AJOL)

    An assessment of cross-border trade and market integration reveal that inhabitants of the border areas have become economically, socially and politically integrated in spite of the conflict over the Bakassi Peninsula. Based on empirical analysis, bilateral agreements between Nigeria and Cameroon have made negligible ...

  16. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  17. Network Based Integrated Analysis of Phenotype-Genotype Data for Prioritization of Candidate Symptom Genes

    Directory of Open Access Journals (Sweden)

    Xing Li

    2014-01-01

    Background. Symptoms and signs (symptoms in brief) are the essential clinical manifestations for individualized diagnosis and treatment in traditional Chinese medicine (TCM). To gain insights into the molecular mechanisms of symptoms, we developed a computational approach to identify candidate genes of symptoms. Methods. This paper presents a network-based approach for the integrated analysis of multiple phenotype-genotype data sources and the prediction of prioritized genes for the associated symptoms. The method first calculates the similarities between symptoms and diseases based on symptom-disease relationships retrieved from the PubMed bibliographic database. The disease-gene associations and protein-protein interactions are then utilized to construct a phenotype-genotype network. The PRINCE algorithm is finally used to rank the potential genes for the associated symptoms. Results. The proposed method obtains a reliable gene ranking list with an AUC (area under curve) of 0.616 in classification. Some novel genes, such as CALCA, ESR1, and MTHFR, were predicted to be associated with headache symptoms; they are not recorded in the benchmark data set but have been reported in recently published literature. Conclusions. Our study demonstrates that integrating phenotype-genotype relationships into a complex network framework provides an effective approach to identifying candidate genes of symptoms.
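    The propagation-and-ranking step described above can be sketched in a few lines. The toy network, seed scores, and parameter value below are invented for illustration; they only mimic the style of a PRINCE-like iteration, F ← αW′F + (1−α)Y, with a symmetrically degree-normalized adjacency W′.

```python
import math

# Invented phenotype-genotype toy network (not the paper's data):
# a symptom linked to a disease, which is linked to candidate genes.
edges = [("headache", "migraine_disease"), ("migraine_disease", "GENE_A"),
         ("migraine_disease", "GENE_B"), ("GENE_B", "GENE_C")]
seeds = {"headache": 1.0}          # prior/query scores Y
alpha = 0.8                        # fraction of score drawn from neighbours

nodes = sorted({n for e in edges for n in e})
neigh = {n: [] for n in nodes}
for a, b in edges:
    neigh[a].append(b)
    neigh[b].append(a)
deg = {n: len(neigh[n]) for n in nodes}

# Iterate F <- alpha * W'F + (1 - alpha) * Y to (near) convergence;
# with alpha < 1 and a normalized adjacency this is a contraction.
F = {n: seeds.get(n, 0.0) for n in nodes}
for _ in range(100):
    F = {n: alpha * sum(F[m] / math.sqrt(deg[n] * deg[m]) for m in neigh[n])
            + (1 - alpha) * seeds.get(n, 0.0)
         for n in nodes}

ranking = sorted(F, key=F.get, reverse=True)
```

Genes closer to the seeded symptom (through the disease node) end up with higher propagated scores, which is the intuition behind prioritizing candidate symptom genes on such a network.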

  18. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  19. GAUSS Market Analysis for Integrated Satellite Communication and Navigation Location Based services

    Science.gov (United States)

    Di Fazio, Antonella; Dricot, Fabienne; Tata, Francesco

    2003-07-01

    The demand for mobile information services coupled with positioning technologies for delivering value-added services that depend on a user's location has increased rapidly in recent years. In particular, services and applications related to improved mobility safety and transport efficiency look very attractive. Solutions for location services vary with respect to positioning accuracy, the technical infrastructure required, and the associated investment in terminals and networks. From the analysis of the state of the art, it emerges that various technologies are currently available on the European market, while the mobile industry is gearing up to launch a wide variety of location services such as tracking, alarming and locating. Nevertheless, when addressing safety-of-life as well as security applications, severe hurdles arise in the light of existing technologies. Existing navigation (e.g., GPS) and communication systems are not able to completely satisfy the needs and requirements of safety-of-life-critical applications. As a matter of fact, the GPS system's main weakness today is its lack of integrity, that is, its inability to warn users of a malfunction in a reasonable time; the other positioning techniques do not provide satisfactory accuracy either, and terrestrial communication networks are not able to cope with stringent requirements in terms of service reliability and coverage. In this context, GAUSS proposes an innovative satellite-based solution using novel technology and effective tools for addressing mobility challenges in a cost-efficient manner, improving safety and effectiveness. GAUSS (Galileo And UMTS Synergetic System) is a Research and Technological Development project co-funded by the European Commission within the frame of the 5th IST Programme. The project lasted two years and was successfully completed in November 2002.
GAUSS key concept is the integration of Satellite Navigation GNSS and UMTS communication technology, to

  20. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
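    A minimal sketch of the underlying idea (not the NESSUS interface): sample the manufacturing-controlled dimension from its process distribution, push each sample through a surrogate stress model, and count exceedances of an allowable. The thin-ring hoop-stress formula, the distribution parameters, and the stress limit below are illustrative assumptions only.

```python
import random
import statistics

random.seed(42)

# Hypothetical surrogate: hoop stress in a thin ring, sigma = p * r / t.
# Pressure p and radius r are fixed; wall thickness t carries the
# manufacturing (e.g., forging) variability. Numbers are invented.
p, r = 20.0, 0.5                      # MPa, m
t_nominal, t_sigma = 0.010, 0.0008    # m: process mean and std dev
limit = 1100.0                        # MPa allowable stress

# Monte Carlo propagation of the thickness uncertainty into stress.
stresses = []
for _ in range(20000):
    t = random.gauss(t_nominal, t_sigma)
    stresses.append(p * r / t)

prob_exceed = sum(s > limit for s in stresses) / len(stresses)
mean_stress = statistics.fmean(stresses)
```

Even though the nominal stress (1000 MPa here) sits below the limit, the spread introduced by the process scatter yields a non-trivial exceedance probability, which is exactly the kind of effect the integration described above is meant to capture.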

  1. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)
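    The abstract is terse, so as a hedged reminder of the kind of expansion it alludes to (the paper's actual correction terms are not reproduced here): the exponential of an operator is expanded as a power series, and for non-commuting operators the correction terms are organized by commutators, e.g. via the Zassenhaus formula:

```latex
e^{\hat A} = \sum_{n=0}^{\infty} \frac{\hat A^{\,n}}{n!},
\qquad
e^{t(\hat A+\hat B)}
  = e^{t\hat A}\, e^{t\hat B}\,
    e^{-\tfrac{t^{2}}{2}[\hat A,\hat B]}\,
    e^{\tfrac{t^{3}}{6}\left(2[\hat B,[\hat A,\hat B]]+[\hat A,[\hat A,\hat B]]\right)}\cdots
```

In a path-integral setting, a regular procedure for the higher-order terms amounts to collecting such commutator corrections order by order in the expansion parameter.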

  2. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for the knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. It enables knowledge engineers to carry out their tasks consistently, from knowledge acquisition through knowledge verification.

  3. Integrity Analysis of Damaged Steam Generator Tubes

    International Nuclear Information System (INIS)

    Stanic, D.

    1998-01-01

    The variety of degradation mechanisms affecting steam generator tubes makes steam generators one of the critical components in nuclear power plants. Depending on their nature, degradation mechanisms cause different types of damage, which requires extensive integrity analysis to assess crack behavior under operating and accident conditions. The development and application of advanced eddy current techniques for steam generator examination provide good characterization of the damage found. Damage characteristics (shape, orientation and dimensions) may be determined and used for further evaluation of the damage's influence on tube integrity. Alongside experimental and analytical methods, numerical methods are also efficient tools for integrity assessment. The application of finite element methods allows relatively simple modeling of different types of damage and simulation of various operating conditions. Stress and strain analysis may be performed for the elastic and elasto-plastic states, with good visual presentation of results; furthermore, fracture mechanics parameters may be calculated. The results obtained by numerical analysis, supplemented by experimental results, are the basis for the definition of alternative plugging criteria, which may significantly reduce the number of plugged tubes. (author)

  4. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective and to reduce the risk of future investments. These frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become, which leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple instances of application software. Effective software integration is therefore a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and to develop a software integration architecture.
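    The core idea can be illustrated in a minimal sketch: consumers depend only on a narrow interface, and each concrete system is adapted behind it. The names, the interface, and the legacy record layout below are invented for illustration.

```python
from typing import Dict, Protocol

class CustomerDirectory(Protocol):
    """The integration contract: the only thing consumers may depend on."""
    def email_for(self, customer_id: str) -> str: ...

class LegacyCrmAdapter:
    """Adapts a (hypothetical) legacy CRM record store to the interface."""
    def __init__(self, records: Dict[str, dict]):
        self._records = records

    def email_for(self, customer_id: str) -> str:
        # Hide the legacy field naming behind the interface.
        return self._records[customer_id]["EMAIL_ADDR"]

def send_invoice(directory: CustomerDirectory, customer_id: str) -> str:
    # The integration point sees only the interface, never the CRM details.
    return f"invoice sent to {directory.email_for(customer_id)}"

crm = LegacyCrmAdapter({"c-1": {"EMAIL_ADDR": "ada@example.com"}})
result = send_invoice(crm, "c-1")
```

Swapping the CRM for another system then only requires a new adapter implementing `CustomerDirectory`; nothing that consumes the interface has to change, which is the simplification interface-based integration aims at.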

  5. Uncertainty analysis of an integrated energy system based on information theory

    International Nuclear Information System (INIS)

    Fu, Xueqian; Sun, Hongbin; Guo, Qinglai; Pan, Zhaoguang; Xiong, Wen; Wang, Li

    2017-01-01

    Currently, a custom-designed configuration of different renewable technologies named the integrated energy system (IES) has become popular due to its high efficiency, benefiting from complementary multi-energy technologies. This paper proposes an information entropy approach to quantify uncertainty in an integrated energy system based on a stochastic model that drives a power system model derived from an actual network on Barry Island. Due to the complexity of co-behaviours between generators, a copula-based approach is utilized to articulate the dependency structure of the generator outputs with regard to such factors as weather conditions. Correlation coefficients and mutual information, which are effective for assessing the dependence relationships, are applied to judge whether the stochastic IES model is correct. The calculated information values can be used to analyse the impacts of the coupling of power and heat on power flows and heat flows, and this approach will be helpful for improving the operation of IES. - Highlights: • The paper explores uncertainty of an integrated energy system. • The dependent weather model is verified from the perspective of correlativity. • The IES model considers the dependence between power and heat. • The information theory helps analyse the complexity of IES operation. • The application of the model is studied using an operational system on Barry Island.
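    The dependence-then-information pipeline can be caricatured in a few lines: generate two outputs coupled through a shared driver (a Gaussian-copula-style construction) and check that a histogram estimate of mutual information detects the dependence. Everything here (the correlation value, bin count, and sample size) is an illustrative assumption, not the paper's Barry Island model.

```python
import math
import random

random.seed(1)

def pair(rho):
    """Two toy 'generator outputs' sharing a common driver with corr. rho."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    return z1, rho * z1 + math.sqrt(1 - rho ** 2) * z2

def mutual_information(xs, ys, bins=8):
    """Plug-in histogram estimate of I(X;Y) in nats."""
    def bucket(v):  # equal-width bins over roughly [-4, 4]
        return min(bins - 1, max(0, int((v + 4.0) * bins / 8.0)))
    n = len(xs)
    pxy, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        i, j = bucket(x), bucket(y)
        pxy[(i, j)] = pxy.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1
    # sum over joint bins of p(x,y) * log(p(x,y) / (p(x) p(y)))
    return sum(c / n * math.log(c * n / (px[i] * py[j]))
               for (i, j), c in pxy.items())

dep = [pair(0.9) for _ in range(5000)]   # strongly coupled outputs
ind = [pair(0.0) for _ in range(5000)]   # independent outputs
mi_dep = mutual_information([x for x, _ in dep], [y for _, y in dep])
mi_ind = mutual_information([x for x, _ in ind], [y for _, y in ind])
```

The coupled pair yields a clearly positive mutual information while the independent pair stays near zero (up to the small positive bias of the plug-in estimator), which is the kind of check used to judge whether a stochastic dependence model is capturing the co-behaviour of generators.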

  6. Nondestructive Analysis of Tumor-Associated Membrane Protein Integrating Imaging and Amplified Detection in situ Based on Dual-Labeled DNAzyme.

    Science.gov (United States)

    Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi

    2018-01-01

    Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e., ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs, respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed, fluorophore-labeled DNAzyme, we develop an integrated and facile method in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to its two-in-one advantage, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in as few as 100 cells. Moreover, the method is nondestructive: cells retain their physiological activity after analysis and can be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive alternative to some traditional techniques for the analysis of TMPs, with potential application as a toolbox in the future.

  7. An integrated 3D design, modeling and analysis resource for SSC detector systems

    International Nuclear Information System (INIS)

    DiGiacomo, N.J.; Adams, T.; Anderson, M.K.; Davis, M.; Easom, B.; Gliozzi, J.; Hale, W.M.; Hupp, J.; Killian, K.; Krohn, M.; Leitch, R.; Lajczok, M.; Mason, L.; Mitchell, J.; Pohlen, J.; Wright, T.

    1989-01-01

    Integrated computer aided engineering and design (CAE/CAD) is having a significant impact on the way design, modeling and analysis is performed, from system concept exploration and definition through final design and integration. Experience with integrated CAE/CAD in high technology projects of scale and scope similar to SSC detectors leads them to propose an integrated computer-based design, modeling and analysis resource aimed specifically at SSC detector system development. The resource architecture emphasizes value-added contact with data and efficient design, modeling and analysis of components, sub-systems or systems with fidelity appropriate to the task. They begin with a general examination of the design, modeling and analysis cycle in high technology projects, emphasizing the transition from the classical islands of automation to the integrated CAE/CAD-based approach. They follow this with a discussion of lessons learned from various attempts to design and implement integrated CAE/CAD systems in scientific and engineering organizations. They then consider the requirements for design, modeling and analysis during SSC detector development, and describe an appropriate resource architecture. They close with a report on the status of the resource and present some results that are indicative of its performance. 10 refs., 7 figs

  8. Establishing community-based integrated care for elderly patients through interprofessional teamwork: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Asakawa T

    2017-10-01

    Tomohiro Asakawa,1 Hidenobu Kawabata,1 Kengo Kisa,2 Takayoshi Terashita,3 Manabu Murakami,4 Junji Otaki1 1Department of Medical Education and General Medicine, Graduate School of Medicine, Hokkaido University, Sapporo, 2Kutchan-Kosei General Hospital, Kutchan, Hokkaido, 3Graduate School of Radiological Technology, Gunma Prefectural College of Health Sciences, Kamioki-machi, Maebashi, Gunma, 4International Relations Office, Graduate School of Medicine, Hokkaido University, Sapporo, Hokkaido, Japan Background: Working in multidisciplinary teams is indispensable for ensuring high-quality care for elderly people in Japan’s rapidly aging society. However, health professionals often experience difficulty collaborating in practice because of their different educational backgrounds, ideas, and professional roles. In this qualitative descriptive study, we reveal how to build interdisciplinary collaboration in multidisciplinary teams. Methods: Semi-structured interviews were conducted with a total of 26 medical professionals, including physicians, nurses, public health nurses, medical social workers, and clerical personnel. Each participant worked as a team member in community-based integrated care. The central topic of the interviews was what the participants needed to establish collaboration during the care of elderly residents. Each interview lasted about 60 minutes. All interviews were recorded, transcribed verbatim, and subjected to content analysis. Results: The analysis yielded the following three categories concerning the necessary elements of building collaboration: (1) two types of meeting configuration; (2) building good communication; and (3) effective leadership. The two meetings described in the first category, “community care meetings” and “individual care meetings”, were aimed at bringing together the disciplines and discussing individual cases, respectively. Building good communication referred to the activities

  9. Elements for successful sensor-based process control {Integrated Metrology}

    International Nuclear Information System (INIS)

    Butler, Stephanie Watts

    1998-01-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended

  11. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    Gmeiner, L.

    1984-06-01

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL)

  12. Integrated design and performance analysis of the KO HCCR TBM for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Integrated analysis is performed with the conventional CFD code (ANSYS-CFX). • The overall pressure drop and coolant flow scheme are investigated. • Manifold design is being performed considering flow distribution. - Abstract: To develop tritium breeding technology for a fusion reactor, Korea has participated in the Test Blanket Module (TBM) program in ITER. The He Cooled Ceramic Reflector (HCCR) TBM consists of functional components such as the First Wall (FW), Breeding Zone (BZ), Side Wall (SW), and Back Manifold (BM), and it was designed based on separate analyses for each component in 2012. Based on each component's analysis model, an integrated model is prepared, and a thermal-hydraulic analysis of the HCCR TBM is performed in the present study. The coolant flow distribution from the BM and SW to the FW and BZ, and the resulting structure temperatures, are obtained with the integrated model. It is found that a non-uniform flow rate occurs at the FW and BZ, causing the design limit (550 °C) to be exceeded in some regions. Based on this integrated model, we will perform design optimization to obtain a uniform flow distribution satisfying the design requirements.

  13. PHIDIAS: a pathogen-host interaction data integration and analysis system.

    Science.gov (United States)

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated from peer-reviewed literature. PHIDIAS is publicly available at http://www.phidias.us.

  14. [Integrated health care organizations: guideline for analysis].

    Science.gov (United States)

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHO) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing the IHO's performance in relation to their final objectives of improving the efficiency and continuity of health care by an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of services' payment mechanisms); and internal elements (model of government, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  15. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    Science.gov (United States)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. The study follows a research-and-development design. Its main steps are: 1) needs analysis (needs assessment); 2) development of the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing of their feasibility. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, and the application's simplicity and ease of use. Operational feasibility covers the ability of users to operate the accounting application, the ability of the application to produce information, and the control features of the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire with a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the sum of the actual answers for each item with the ideal answer score for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical (87.50%) and operational (84.17%) aspects.
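The percentage analysis described above admits a short sketch: each questionnaire item is rated on the 4-point Likert scale and the achieved score is compared with the ideal score. A minimal illustration in Python; the ratings shown are hypothetical, not the study's data:

```python
# Hypothetical illustration of the percentage analysis described above:
# feasibility score = (sum of expert ratings) / (ideal sum) * 100,
# with each item rated on a 4-point Likert scale (ideal = 4 per item).

def feasibility_percentage(ratings, scale_max=4):
    """Return the percentage of the ideal score achieved by the ratings."""
    ideal = scale_max * len(ratings)
    return 100.0 * sum(ratings) / ideal

technical = [4, 3, 4, 3, 4, 3]   # hypothetical expert responses
print(round(feasibility_percentage(technical), 2))  # → 87.5
```

A score above an agreed cutoff (for example 61%) would then be read as "feasible" on the chosen rating scale.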

  16. From heat integration targets toward implementation – A TSA (total site analysis)-based design approach for heat recovery systems in industrial clusters

    International Nuclear Information System (INIS)

    Hackl, Roman; Harvey, Simon

    2015-01-01

    The European process industry is facing major challenges to decrease production costs. One strategy to achieve this is by increasing energy efficiency. Single chemical processes are often well-integrated and the tools to target and design such measures are well developed. Site-wide heat integration based on total site analysis tools can be used to identify opportunities to further increase energy efficiency. However, the methodology has to be developed further in order to enable identification of practical heat integration measures in a systematic way. Designing site-wide heat recovery systems across an industrial cluster is complex and involves aspects apart from thermal process and utility flows. This work presents a method for designing a roadmap of heat integration investments based on total site analysis. The method is applied to a chemical cluster in Sweden. The results of the case study show that application of the proposed method can achieve up to 42% of the previously targeted hot utility savings of 129 MW. A roadmap of heat integration systems is suggested, ranging from less complex systems that achieve a minor share of the heat recovery potential to sophisticated, strongly interdependent systems demanding large investments and a high level of collaboration. - Highlights: • Methodology focused on the practical implementation of site-wide heat recovery. • Algorithm to determine a roadmap of heat integration investments. • Case study: 42% hot utility savings potential at a pay-back period of 3.9y.
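The notion of a roadmap ordered from low-cost toward capital-intensive measures can be pictured by ranking candidate heat-recovery investments by simple payback time. This is not the paper's TSA algorithm; the measure names, savings, prices and operating hours below are all invented for illustration:

```python
# Hedged sketch (not the paper's method): order candidate heat-recovery
# measures by simple payback = investment / yearly energy-cost saving.
# All figures are invented.

measures = [
    {"name": "A", "saving_MW": 20, "invest_MEUR": 30},
    {"name": "B", "saving_MW": 15, "invest_MEUR": 45},
    {"name": "C", "saving_MW": 19, "invest_MEUR": 20},
]

def payback(m, eur_per_MWh=30.0, hours=8000):
    """Simple payback time in years (assumed energy price and uptime)."""
    yearly_saving = m["saving_MW"] * hours * eur_per_MWh / 1e6  # MEUR/year
    return m["invest_MEUR"] / yearly_saving

roadmap = sorted(measures, key=payback)   # cheapest savings first
for m in roadmap:
    print(m["name"], round(payback(m), 2))
```

The real method must additionally respect interdependencies between measures and the collaboration level the cluster can support, which a single scalar ranking cannot capture.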

  17. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T- test result showed that Prosopis africana seed trade is profitable and ...

  18. Thermodynamic analysis and optimization of IT-SOFC-based integrated coal gasification fuel cell power plants

    NARCIS (Netherlands)

    Romano, M.C.; Campanari, S.; Spallina, V.; Lozza, G.

    2011-01-01

    This work discusses the thermodynamic analysis of integrated gasification fuel cell plants, where a simple cycle gas turbine works in a hybrid cycle with a pressurized intermediate temperature–solid oxide fuel cell (SOFC), integrated with a coal gasification and syngas cleanup island and a bottoming

  19. Integrated Case Based and Rule Based Reasoning for Decision Support

    OpenAIRE

    Eshete, Azeb Bekele

    2009-01-01

    This project is a continuation of my specialization project which was focused on studying theoretical concepts related to case based reasoning method, rule based reasoning method and integration of them. The integration of rule-based and case-based reasoning methods has shown a substantial improvement with regards to performance over the individual methods. Verdande Technology As wants to try integrating the rule based reasoning method with an existing case based system. This project focu...

  20. Proportional-integral controller based small-signal analysis of hybrid distributed generation systems

    International Nuclear Information System (INIS)

    Ray, Prakash K.; Mohanty, Soumya R.; Kishor, Nand

    2011-01-01

    Research highlights: → We aim to minimize the frequency deviation in a system integrating energy resources such as offshore wind, photovoltaic (PV), fuel cell (FC) and diesel engine generator (DEG) units, along with energy storage elements such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). → An ultracapacitor (UC) as an alternative energy storage element and a proportional-integral (PI) controller are further considered in order to improve the frequency deviation profiles. → A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high voltage direct current (HVDC) link and a high voltage alternating current (HVAC) line. → Both the qualitative and quantitative analyses in the study reflect the improvement in frequency deviation profiles obtained by using the ultracapacitor (UC) as the energy storage element. -- Abstract: The wide-band variation in wind speed and the unpredictable solar radiation cause considerable fluctuations in the output power of offshore wind and photovoltaic systems, respectively, which leads to large deviations in the system frequency. In this context, to minimize the frequency deviation, this paper presents the integration of different energy resources such as offshore wind, photovoltaic (PV), fuel cell (FC) and diesel engine generator (DEG) units, along with energy storage elements such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). An ultracapacitor (UC) as an alternative energy storage element and a proportional-integral (PI) controller are further considered in order to improve the frequency deviation profiles. A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high-voltage direct current (HVDC) link and a high-voltage alternating current (HVAC) line. Frequency deviations for different isolated hybrid systems are presented graphically as well as in terms of
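The role of a PI controller in damping frequency deviation can be illustrated with a toy single-area model; the inertia, damping, gains and load disturbance below are assumed values, not parameters from the study:

```python
# Minimal sketch of a proportional-integral (PI) controller damping the
# frequency deviation of a single-area hybrid system, modeled as a
# first-order power-frequency response. All constants are assumptions.

def simulate(kp=2.0, ki=5.0, dt=0.01, steps=2000):
    M, D = 0.2, 0.03        # assumed inertia and damping constants
    df, integ = 0.0, 0.0    # frequency deviation and its integral
    d_load = 0.05           # step load disturbance (p.u.)
    for _ in range(steps):
        u = -(kp * df + ki * integ)       # PI control power
        ddf = (u - d_load - D * df) / M   # swing-equation-like dynamics
        df += ddf * dt
        integ += df * dt
    return df

print(abs(simulate()) < 1e-3)  # → True: PI drives the deviation to zero
```

Without the integral term the deviation would settle at a nonzero offset; the integral action removes the steady-state error caused by the step load.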

  1. Integrated Ecological River Health Assessments, Based on Water Chemistry, Physical Habitat Quality and Biological Integrity

    Directory of Open Access Journals (Sweden)

    Ji Yoon Kim

    2015-11-01

    Full Text Available This study evaluated integrative river ecosystem health using stressor-based models of physical habitat health, chemical water health, and biological health of fish and identified multiple-stressor indicators influencing the ecosystem health. Integrated health responses (IHRs), based on a star-plot approach, were calculated from the qualitative habitat evaluation index (QHEI), nutrient pollution index (NPI), and index of biological integrity (IBI) in four different longitudinal regions (Groups I–IV). For the calculation of IHR values, multi-metric QHEI, NPI, and IBI models were developed and their criteria for the diagnosis of health were determined. The longitudinal patterns of the river were analyzed by a self-organizing map (SOM) model and the key major stressors in the river were identified by principal component analysis (PCA). Our model scores of integrated health responses (IHRs) suggested that the mid-stream and downstream regions were impaired, and the key stressors were closely associated with nutrient enrichment (N and P) and organic matter pollution from domestic wastewater disposal plants and urban sewage. This modeling approach of IHRs may be used as an effective tool for evaluations of integrative ecological river health.
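The star-plot integration of the three sub-indices can be pictured as normalizing each index onto a common [0, 1] scale and averaging. The index ranges below are illustrative assumptions, not the study's calibrated criteria:

```python
# Hedged sketch of the star-plot style integration: each sub-index
# (QHEI, NPI, IBI) is normalized to [0, 1] and the integrated health
# response (IHR) is taken as their mean. Ranges are invented.

def normalize(value, worst, best):
    """Map a raw index value onto [0, 1], clipping out-of-range values."""
    return max(0.0, min(1.0, (value - worst) / (best - worst)))

def ihr(qhei, npi, ibi):
    scores = [
        normalize(qhei, 0, 100),   # habitat quality (higher = better)
        normalize(npi, 10, 0),     # nutrient pollution (lower = better)
        normalize(ibi, 0, 50),     # biological integrity (higher = better)
    ]
    return sum(scores) / len(scores)

print(round(ihr(qhei=80, npi=2, ibi=40), 3))  # → 0.8
```

The star plot itself is just these normalized scores drawn on radial axes; the IHR is the averaged (or area-based) summary of the polygon.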

  2. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  3. Integrated Curriculum and Subject-based Curriculum: Achievement and Attitudes

    Science.gov (United States)

    Casady, Victoria

    The research conducted for this mixed-method study, qualitative and quantitative, analyzed the results of an academic year-long study to determine whether the use of an integrated fourth-grade curriculum would benefit student achievement in the areas of English language arts, social studies, and science more than a subject-based traditional curriculum. The study was motivated by international, national, and state test scores, which show slowing or stagnant growth. Through pre- and post-assessments, student questionnaires, and administrative interviews, the researcher analyzed the phenomenological experiences of the students to determine whether the integrated curriculum was a beneficial restructuring of the curriculum. The research questions focused on the achievement and attitudes of the students and whether the curriculum they were taught impacted their achievement and attitudes over the course of one school year. The curricula for the study were organized to cover the current standards, with the integrated curriculum emphasizing connections between subject areas to help students relate what they are learning to the world beyond the classroom. The findings indicated that the integrated curriculum could increase achievement as well as students' attitudes toward specific content areas. The ANOVA for English language arts was not significant, although greater growth was recorded for the students in the integrated curriculum setting. The ANOVA for social studies (0.05) and the paired t-tests (0.001) for science both showed significant positive differences. The qualitative analysis revealed that the experiences of the students in the integrated curriculum setting were more positive. The evaluation of the data led the researcher to conclude that the integrated curriculum was a worthwhile endeavor for increasing achievement and attitudes.

  4. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistics gathered from simulation of the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example radar model characterizes the application of the approach, and the results are credible.

  5. Nucleic Acid-based Detection of Bacterial Pathogens Using Integrated Microfluidic Platform Systems

    Directory of Open Access Journals (Sweden)

    Carl A. Batt

    2009-05-01

    Full Text Available The advent of nucleic acid-based pathogen detection methods offers increased sensitivity and specificity over traditional microbiological techniques, driving the development of portable, integrated biosensors. The miniaturization and automation of integrated detection systems presents a significant advantage for rapid, portable field-based testing. In this review, we highlight current developments and directions in nucleic acid-based micro total analysis systems for the detection of bacterial pathogens. Recent progress in the miniaturization of microfluidic processing steps for cell capture, DNA extraction and purification, polymerase chain reaction, and product detection are detailed. Discussions include strategies and challenges for implementation of an integrated portable platform.

  6. An integrated reliability-based design optimization of offshore towers

    International Nuclear Information System (INIS)

    Karadeniz, Halil; Togan, Vedat; Vrouwenvelder, Ton

    2009-01-01

    Once the uncertainty in parameters such as material, loading and geometry is recognized, the reliability-based design optimization (RBDO) concept becomes more meaningful than conventional deterministic optimization for achieving an economical design; it combines a reliability analysis with an optimization algorithm. RBDO procedures include structural analysis, reliability analysis, and sensitivity analysis for both the optimization and the reliability parts. The efficiency of the RBDO system depends on these numerical algorithms. In this work, an integrated algorithmic system is proposed to implement the RBDO of offshore towers subjected to extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of the towers are: (a) a structural analysis program, SAPOS; (b) an optimization program, SQP; and (c) a reliability analysis program based on FORM. A demonstration is presented for an example tripod tower under reliability constraints based on limit states of critical stress, buckling and natural frequency.
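The reliability-analysis step in such an RBDO loop is FORM in the paper; as a hedged stand-in, a crude Monte Carlo estimate of the failure probability for a toy limit state g = R − S (resistance minus load) illustrates what that step computes. The distributions below are invented:

```python
# Illustrative stand-in for the reliability-analysis step of an RBDO loop:
# crude Monte Carlo estimate of P[g < 0] for g = R - S. The distribution
# parameters are invented; the paper itself uses FORM, not sampling.

import random

def failure_probability(n=200_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(10.0, 1.0)   # resistance
        S = rng.gauss(6.0, 1.5)    # load effect
        if R - S < 0.0:            # limit-state violation
            failures += 1
    return failures / n

pf = failure_probability()
print(0.008 < pf < 0.018)  # → True; exact Pf = Φ(-4/√3.25) ≈ 0.0133
```

FORM replaces this sampling with a search for the most probable failure point and the reliability index β, which is far cheaper when g involves an expensive structural analysis.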

  7. An integrated reliability-based design optimization of offshore towers

    Energy Technology Data Exchange (ETDEWEB)

    Karadeniz, Halil [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)], E-mail: h.karadeniz@tudelft.nl; Togan, Vedat [Department of Civil Engineering, Karadeniz Technical University, Trabzon (Turkey); Vrouwenvelder, Ton [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)

    2009-10-15

    Once the uncertainty in parameters such as material, loading and geometry is recognized, the reliability-based design optimization (RBDO) concept becomes more meaningful than conventional deterministic optimization for achieving an economical design; it combines a reliability analysis with an optimization algorithm. RBDO procedures include structural analysis, reliability analysis, and sensitivity analysis for both the optimization and the reliability parts. The efficiency of the RBDO system depends on these numerical algorithms. In this work, an integrated algorithmic system is proposed to implement the RBDO of offshore towers subjected to extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of the towers are: (a) a structural analysis program, SAPOS; (b) an optimization program, SQP; and (c) a reliability analysis program based on FORM. A demonstration is presented for an example tripod tower under reliability constraints based on limit states of critical stress, buckling and natural frequency.

  8. Modeling and Analysis of Hybrid Cellular/WLAN Systems with Integrated Service-Based Vertical Handoff Schemes

    Science.gov (United States)

    Xia, Weiwei; Shen, Lianfeng

    We propose two vertical handoff schemes for cellular network and wireless local area network (WLAN) integration: integrated service-based handoff (ISH) and integrated service-based handoff with queue capabilities (ISHQ). Compared with existing handoff schemes in integrated cellular/WLAN networks, the proposed schemes consider a more comprehensive set of system characteristics such as different features of voice and data services, dynamic information about the admitted calls, user mobility and vertical handoffs in two directions. The code division multiple access (CDMA) cellular network and IEEE 802.11e WLAN are taken into account in the proposed schemes. We model the integrated networks by using multi-dimensional Markov chains and the major performance measures are derived for voice and data services. The important system parameters such as thresholds to prioritize handoff voice calls and queue sizes are optimized. Numerical results demonstrate that the proposed ISHQ scheme can maximize the utilization of overall bandwidth resources with the best quality of service (QoS) provisioning for voice and data services.
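The flavor of the Markov-chain modeling can be conveyed by a one-dimensional birth-death chain for a single cell with guard channels reserved for handoff calls. The channel counts and rates below are invented, and the actual paper uses multi-dimensional chains capturing both networks and both service types:

```python
# Hedged sketch: one-dimensional birth-death Markov chain for voice calls
# in one cell, with 'guard' channels reserved for handoff calls.
# Rates and channel counts are invented for illustration.

def stationary(C=10, guard=2, lam_new=4.0, lam_ho=1.0, mu=1.0):
    # Unnormalized stationary probabilities p[n] of n busy channels,
    # from detailed balance: p[n+1] = p[n] * rate_in(n) / ((n+1) * mu).
    p = [1.0]
    for n in range(C):
        rate_in = lam_new + lam_ho if n < C - guard else lam_ho
        p.append(p[-1] * rate_in / ((n + 1) * mu))
    total = sum(p)
    p = [x / total for x in p]
    block_new = sum(p[C - guard:])   # new calls rejected above threshold
    block_ho = p[C]                  # handoffs rejected only when full
    return block_new, block_ho

bn, bh = stationary()
print(bh < bn)  # → True: guard channels prioritize handoff calls
```

The paper's optimization of thresholds and queue sizes amounts to tuning parameters like `guard` here, but over a much larger multi-dimensional state space.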

  9. Problems in mathematical analysis III integration

    CERN Document Server

    Kaczor, W J

    2003-01-01

    We learn by doing. We learn mathematics by doing problems. This is the third volume of Problems in Mathematical Analysis. The topic here is integration for real functions of one real variable. The first chapter is devoted to the Riemann and the Riemann-Stieltjes integrals. Chapter 2 deals with Lebesgue measure and integration. The authors include some famous, and some not so famous, integral inequalities related to Riemann integration. Many of the problems for Lebesgue integration concern convergence theorems and the interchange of limits and integrals. The book closes with a section on Fourier series, with a concentration on Fourier coefficients of functions from particular classes and on basic theorems for convergence of Fourier series. The book is primarily geared toward students in analysis, as a study aid, for problem-solving seminars, or for tutorials. It is also an excellent resource for instructors who wish to incorporate problems into their lectures. Solutions for the problems are provided in the book.

  10. Measure and integral an introduction to real analysis

    CERN Document Server

    Wheeden, Richard L

    2015-01-01

    Now considered a classic text on the topic, Measure and Integral: An Introduction to Real Analysis provides an introduction to real analysis by first developing the theory of measure and integration in the simple setting of Euclidean space, and then presenting a more general treatment based on abstract notions characterized by axioms and with less geometric content. Published nearly forty years after the first edition, this long-awaited Second Edition also: studies the Fourier transform of functions in the spaces L1, L2, and Lp, 1 ≤ p < ∞; shows the Hilbert transform to be a bounded operator on L2, as an application of the L2 theory of the Fourier transform in the one-dimensional case; covers fractional integration and some topics related to mean oscillation properties of functions, such as the classes of Hölder continuous functions and the space of functions of bounded mean oscillation; and derives a subrepresentation formula, which in higher dimensions plays a role roughly similar to the one played by the fundamental theorem of calculus in one dimension.

  11. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative

  12. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    Full Text Available As a new formulation in structural analysis, the Integrated Force Method (IFM) has been successfully applied to many structures in civil, mechanical, and aerospace engineering owing to its accurate computation of member forces. It is now being extended to the probabilistic domain. To assess the effect of uncertainty in system optimization and identification, the probabilistic sensitivity analysis of IFM was investigated in this study. A set of stochastic sensitivity analysis formulations for the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to existing programs, since the models of stochastic finite elements and stochastic design sensitivity are almost identical.
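The perturbation idea behind such stochastic sensitivity formulations, and its verification against direct Monte Carlo, can be shown on a one-degree-of-freedom toy problem (a spring with response u = F/k and random stiffness k; all values invented):

```python
# Toy comparison in the spirit of the abstract: first-order perturbation
# estimate of the response variance versus direct Monte Carlo sampling,
# for u = F / k with random stiffness k. All numbers are invented.

import random

F, k_mean, k_std = 100.0, 50.0, 2.0

# Perturbation: Var[u] ≈ (du/dk)^2 Var[k], with du/dk = -F / k^2 at the mean.
var_pert = (F / k_mean**2) ** 2 * k_std**2

rng = random.Random(0)
samples = [F / rng.gauss(k_mean, k_std) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var_mc = sum((u - mean) ** 2 for u in samples) / len(samples)

print(abs(var_mc - var_pert) / var_pert < 0.1)  # → True: close agreement
```

The first-order estimate is accurate here because the coefficient of variation of k is small (4%); for larger uncertainty the higher-order terms that the full formulation carries become important.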

  13. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
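Although the 3DCS model above is a full 3-D simulation, the underlying statistical idea can be sketched in one dimension: propagate part tolerances by Monte Carlo to an assembly-level quantity and check it against a requirement. All dimensions and tolerances below are invented:

```python
# Illustrative 1-D analogue of a dimensional variation study: Monte Carlo
# stack-up of independent part tolerances, checking an assembly gap
# against a requirement. Dimensions and tolerances are invented.

import random

def gap_samples(n=100_000, seed=42):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        housing = rng.gauss(100.0, 0.05)                  # nominal 100 mm
        parts = sum(rng.gauss(33.0, 0.03) for _ in range(3))
        out.append(housing - parts)                       # resulting gap
    return out

gaps = gap_samples()
mean = sum(gaps) / len(gaps)
within = sum(0.8 <= g <= 1.2 for g in gaps) / len(gaps)  # spec compliance
print(round(mean, 2), within > 0.99)
```

A dedicated tool adds the 3-D geometry, assembly sequence and correlated contributors, but the risk-assessment output is of exactly this kind: a predicted distribution of a critical dimension versus its allowed band.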

  14. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  15. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E.; Re, Matteo

    2014-01-01

    Objective In the context of “network medicine”, gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. Materials and methods We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. Results The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different “informativeness” embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Conclusions Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further
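The random walk with restart scoring evaluated above can be sketched on a toy gene network; the adjacency matrix and seed set are invented for illustration:

```python
# Minimal sketch of random walk with restart (RWR) gene scoring:
# p <- (1 - r) * W p + r * p0, with W the column-normalized adjacency
# and p0 the seed (known disease gene) distribution. Network is invented.

def rwr(adj, seeds, restart=0.3, iters=200):
    n = len(adj)
    deg = [sum(adj[i][j] for i in range(n)) for j in range(n)]  # column sums
    p0 = [1.0 / len(seeds) if i in seeds else 0.0 for i in range(n)]
    p = p0[:]
    for _ in range(iters):
        p = [
            (1 - restart) * sum(adj[i][j] * p[j] / deg[j]
                                for j in range(n) if deg[j])
            + restart * p0[i]
            for i in range(n)
        ]
    return p

# Toy network: gene 2 bridges the two seed genes 0 and 1.
adj = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]
scores = rwr(adj, seeds={0, 1})
print(scores[2] > scores[3])  # → True: the bridging gene ranks highest
```

Network integration then amounts to running such a walk on a (weighted) combination of several adjacency matrices rather than a single one, and kernelized score functions generalize the propagation step.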

  16. Integration Processes of Delay Differential Equation Based on Modified Laguerre Functions

    Directory of Open Access Journals (Sweden)

    Yeguo Sun

    2012-01-01

    Full Text Available We propose long-time convergent numerical integration processes for delay differential equations. We first construct an integration process based on modified Laguerre functions, and then establish its global convergence in a certain weighted Sobolev space. The proposed numerical integration processes can also be used for systems of delay differential equations. We also develop a technique for the refinement of modified Laguerre-Radau interpolations. Lastly, numerical results demonstrate the spectral accuracy of the proposed method and agree well with the analysis.
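
    For intuition about what such integration processes compute, a much cruder alternative to the paper's spectral Laguerre approach is the classical method of steps with forward Euler; this hypothetical sketch integrates a scalar delay differential equation y'(t) = f(t, y(t), y(t - tau)):

```python
import math

def euler_dde(f, history, tau, t_end, h):
    """Method of steps with forward Euler for y'(t) = f(t, y(t), y(t - tau)).
    `history(t)` supplies y on [-tau, 0]; returns y at t = 0, h, ..., t_end."""
    n = int(round(t_end / h))
    lag = int(round(tau / h))
    ys = [history(i * h - tau) for i in range(lag + 1)]  # samples of y on [-tau, 0]
    for k in range(n):
        y_now = ys[-1]
        y_lag = ys[len(ys) - 1 - lag]  # stored value at t - tau
        ys.append(y_now + h * f(k * h, y_now, y_lag))
    return ys[lag:]
```

    With tau = 0 this reduces to plain Euler; the spectral method in the paper converges far faster for smooth solutions, which is the point of the Laguerre construction.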

  17. Bending analysis of embedded nanoplates based on the integral formulation of Eringen's nonlocal theory using the finite element method

    Science.gov (United States)

    Ansari, R.; Torabi, J.; Norouzzadeh, A.

    2018-04-01

    Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of the structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on an elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of the elastic foundation and to compare the predictions of the integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of the proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.
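
    For reference, the two forms of Eringen's nonlocal constitutive relation that the abstract contrasts are, in standard notation (with α the attenuation kernel, e₀a the nonlocal length parameter, σ the local stress and t the nonlocal stress):

```latex
t_{ij}(\mathbf{x}) = \int_{V} \alpha\!\left(|\mathbf{x}' - \mathbf{x}|, \tau\right)\, \sigma_{ij}(\mathbf{x}')\, \mathrm{d}V(\mathbf{x}'),
\qquad
\left( 1 - (e_0 a)^2 \nabla^2 \right) t_{ij} = \sigma_{ij} .
```

    The differential form follows from the integral one only for a specific kernel choice, which is why the two forms can disagree and why the integral formulation resolves paradoxes such as the cantilever case.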

  18. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  19. Integrating eQTL data with GWAS summary statistics in pathway-based analysis with application to schizophrenia.

    Science.gov (United States)

    Wu, Chong; Pan, Wei

    2018-04-01

    Many genetic variants affect complex traits through gene expression, which can be exploited to boost statistical power and enhance interpretation in genome-wide association studies (GWASs) as demonstrated by the transcriptome-wide association study (TWAS) approach. Furthermore, due to polygenic inheritance, a complex trait is often affected by multiple genes with similar functions as annotated in gene pathways. Here, we extend TWAS from gene-based analysis to pathway-based analysis: we integrate public pathway collections, expression quantitative trait locus (eQTL) data and GWAS summary association statistics (or GWAS individual-level data) to identify gene pathways associated with complex traits. The basic idea is to weight the SNPs of the genes in a pathway based on their estimated cis-effects on gene expression, then adaptively test for association of the pathway with a GWAS trait by effectively aggregating possibly weak association signals across the genes in the pathway. The P values can be calculated analytically and are thus fast to compute. We applied our proposed test with the KEGG and GO pathways to two schizophrenia (SCZ) GWAS summary association data sets, denoted by SCZ1 and SCZ2 with about 20,000 and 150,000 subjects, respectively. Most of the significant pathways identified by analyzing the SCZ1 data were reproduced by the SCZ2 data. Importantly, we identified 15 novel pathways associated with SCZ, such as GABA receptor complex (GO:1902710), which could not be uncovered by the standard single SNP-based analysis or gene-based TWAS. The newly identified pathways may help us gain insights into the biological mechanism underlying SCZ. Our results showcase the power of incorporating gene expression information and gene functional annotations into pathway-based association testing for GWAS. © 2018 WILEY PERIODICALS, INC.

  20. A new method to identify the foot of continental slope based on an integrated profile analysis

    Science.gov (United States)

    Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin

    2017-06-01

    A new method is proposed to automatically identify the foot of the continental slope (FOS) based on integrated analysis of topographic profiles. Based on the extremum points of the second derivative and the Douglas-Peucker algorithm, it simplifies the topographic profiles, then calculates the second derivative of both the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. Meanwhile, multiple identification methods are proposed to determine the FOS points, including the gradient, water depth and second derivative values of data points, as well as the concavity and convexity, continuity and segmentation of the topographic profiles. The method can comprehensively and intelligently analyze the topographic profiles and their derived slopes, second derivatives and D-P profiles, and on that basis it can analyze the essential properties of every single data point in a profile. Furthermore, concave points of the curve are removed, and six FOS judgment criteria are implemented.
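
    The Douglas-Peucker simplification at the heart of the profile-processing step keeps only points that deviate from a chord by more than a tolerance; a minimal sketch (2-D points, recursive form, not the authors' seven-step variant):

```python
import math

def point_line_distance(pt, start, end):
    # Perpendicular distance from pt to the line through start and end.
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Simplify a polyline, keeping points farther than epsilon from the chord."""
    if len(points) < 3:
        return points
    # Find the point farthest from the chord between the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    # Recurse on both halves and join, avoiding the duplicated split point.
    left = douglas_peucker(points[:index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right
```

    On a depth profile, the retained vertices are exactly the large-scale slope breaks among which FOS candidates are then sought.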

  1. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  2. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  3. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Science.gov (United States)

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
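
    As an illustration of the core quantity in the two records above, a minimal plug-in estimator of lag-1 transfer entropy between two discretized series might look like this (a hypothetical helper, not the authors' implementation; real use needs the significance testing the paper stresses):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=3):
    """Estimate lag-1 transfer entropy from series x to series y (in bits)
    using a simple histogram (plug-in) estimator."""
    # Discretize both series into equal-width bins.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    # Joint and marginal counts over (y_future, y_past, x_past).
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    singles = Counter(yd[:-1])
    n = len(yd) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        # p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ], written in counts.
        te += (c / n) * np.log2(
            (c * singles[y0]) / (pairs_yy[(y1, y0)] * pairs_yx[(y0, x0)])
        )
    return te
```

    TE(x→y) is asymmetric: it is large when the past of x improves prediction of y beyond y's own past, which is what the weighted aggregation across variable pairs exploits.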

  4. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  5. Ontology-based Vaccine and Drug Adverse Event Representation and Theory-guided Systematic Causal Network Analysis toward Integrative Pharmacovigilance Research.

    Science.gov (United States)

    He, Yongqun

    2016-06-01

    Compared with controlled terminologies (e.g., MedDRA, CTCAE, and WHO-ART), the community-based Ontology of AEs (OAE) has many advantages in adverse event (AE) classifications. The OAE-derived Ontology of Vaccine AEs (OVAE) and Ontology of Drug Neuropathy AEs (ODNAE) serve as AE knowledge bases and support data integration and analysis. The Immune Response Gene Network Theory explains molecular mechanisms of vaccine-related AEs. The OneNet Theory of Life treats the whole process of a life of an organism as a single complex and dynamic network (i.e., OneNet). A new "OneNet effectiveness" tenet is proposed here to expand the OneNet theory. Derived from the OneNet theory, the author hypothesizes that one human uses one single genotype-rooted mechanism to respond to different vaccinations and drug treatments, and experimentally identified mechanisms are manifestations of the OneNet blueprint mechanism under specific conditions. The theories and ontologies interact together as semantic frameworks to support integrative pharmacovigilance research.

  6. Students' perceptions of vertical and horizontal integration in a discipline-based dental school.

    Science.gov (United States)

    Postma, T C; White, J G

    2017-05-01

    Integration is a key concern in discipline-based undergraduate dental curricula. Therefore, this study compared feedback on integration from students who participated in different instructional designs in a Comprehensive Patient Care course. The study was conducted at the University of Pretoria (2009-2011). Third-year cohorts (Cohorts A, B and C) participated in pre-clinical case-based learning, whilst fourth-year cohorts (Cohorts D and E) received didactic teaching in Comprehensive Patient Care. Cohorts A, D and E practised clinical Comprehensive Patient Care in a discipline-based clinic. Cohort B conducted their Comprehensive Patient Care patient examinations in a dedicated facility supervised by dedicated faculty responsible for teaching integration. Students had to indicate on visual analogue scales whether the way they were taught at the school helped them to integrate knowledge from the same (horizontal integration) and preceding (vertical integration) year of study. The end-points of the scales were defined as 'definitely' and 'not at all'. Analysis of variance (ANOVA) was employed to measure the differences between cohorts according to the year of study. Third-year case-based learning cohorts rated the horizontal integration close to 80/100 and vertical integration ranging from 64 to 71/100. In year four, Cohort B rated vertical and horizontal integration 9-15% higher (ANOVA, P < 0.05) than the didactically taught cohorts, and rated horizontal integration 11-18% higher (ANOVA, P < 0.05), suggesting that dedicated facilities and faculty support integration in the discipline-based undergraduate dental curriculum. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process that uses experienced people from various disciplines and focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  8. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  9. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
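
    The simplest of the reviewed approaches, pairwise correlation between metabolite and taxon abundances, can be sketched as a rank-correlation matrix (ties ignored for brevity; illustrative code, not from the review):

```python
import numpy as np

def spearman_matrix(X, Y):
    """Spearman correlations between each column of X (e.g. metabolites)
    and each column of Y (e.g. microbial taxa); rows are shared samples.
    Ties are broken arbitrarily for brevity."""
    def rank(a):
        order = a.argsort(axis=0)
        r = np.empty_like(a, dtype=float)
        # Assign rank i to the i-th smallest value in every column.
        r[order, np.arange(a.shape[1])] = np.arange(a.shape[0])[:, None]
        return r
    rx, ry = rank(X), rank(Y)
    # Standardize the ranks, then a dot product gives the correlations.
    rx = (rx - rx.mean(axis=0)) / rx.std(axis=0)
    ry = (ry - ry.mean(axis=0)) / ry.std(axis=0)
    return rx.T @ ry / X.shape[0]  # shape: (n_metabolites, n_taxa)
```

    Network-based modeling approaches go beyond such matrices by constraining the associations with known metabolic reactions, which is the main theme of the review.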

  10. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis ... of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  11. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    Science.gov (United States)

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).
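
    Enrichment analysis of the kind such platforms offer is, at its core, a hypergeometric tail probability; a minimal sketch (not MetaCore's actual algorithm, which adds curated gene sets and multiple-testing correction):

```python
from math import comb

def enrichment_pvalue(k, n, K, N):
    """Hypergeometric over-representation P-value: the probability of seeing
    at least k annotated genes in a list of n, when K of the N background
    genes carry the annotation."""
    total = comb(N, n)
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(n, K) + 1)
    ) / total
```

    Here k of the n genes in the user's list carry an annotation held by K of the N genes in the background; the smaller the P-value, the more over-represented the pathway or ontology term.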

  12. Analysis of silicon-based integrated photovoltaic-electrochemical hydrogen generation system under varying temperature and illumination

    Institute of Scientific and Technical Information of China (English)

    Vishwa Bhatt; Brijesh Tripathi; Pankaj Yadav; Manoj Kumar

    2017-01-01

    The last decade witnessed tremendous research and development in the area of photo-electrolytic hydrogen generation using chemically stable nanostructured photo-cathode/anode materials. Due to intimately coupled charge separation and photo-catalytic processes, it is very difficult to optimize the individual components of such a system, leading to a very low demonstrated solar-to-fuel efficiency (SFE) of less than 1%. Recently there has been growing interest in integrated photovoltaic-electrochemical (PV-EC) systems based on GaAs solar cells, with a demonstrated SFE of 24.5% under concentrated illumination. But the high cost of GaAs-based solar cells and the recent price drop of poly-crystalline silicon (pc-Si) solar cells motivated researchers to explore silicon-based integrated PV-EC systems. In this paper a theoretical framework is introduced to model a silicon-based integrated PV-EC device. The framework is used to analyze the coupling and kinetic losses of a silicon solar cell based integrated PV-EC water splitting system under varying temperature and illumination. The kinetic loss occurs in the range of 19.1%-27.9% and the coupling loss in the range of 5.45%-6.74% with respect to varying illumination in the range of 20-100 mW/cm². Similarly, varying temperature has a severe impact on the performance of the system, wherein the coupling loss occurs in the range of 0.84%-21.51% for temperature variation from 25 to 50 ℃.
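
    The coupling loss discussed above measures how far the directly coupled operating point sits from the PV maximum power point; a simple grid-based estimate (illustrative curves and function names, not the paper's model) is:

```python
import numpy as np

def coupling_loss(v, i_pv, i_ec):
    """Fractional coupling loss of a directly coupled PV-electrolyzer pair.

    v: common voltage grid (V); i_pv / i_ec: PV and electrolyzer
    currents (A) evaluated on that grid."""
    p_mpp = np.max(v * i_pv)            # best power the PV could deliver
    k = np.argmin(np.abs(i_pv - i_ec))  # operating point: I-V curve crossing
    p_op = v[k] * i_pv[k]               # power actually delivered
    return (p_mpp - p_op) / p_mpp
```

    When temperature shifts the electrolyzer onset voltage or the PV curve, the intersection moves away from the maximum power point and this ratio grows, which is the mechanism behind the 0.84%-21.51% range reported above.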

  13. Geospatial analysis based on GIS integrated with LADAR.

    Science.gov (United States)

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.

  14. Development of web-based integrity evaluation system for primary components in a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.M.; Kim, J.C.; Choi, J.B.; Kim, Y.J. [SAFE Research Center, Sungkyunkwan Univ., Suwon (Korea); Choi, S.N.; Jang, K.S.; Hong, S.Y. [Korea Electronic Power Research Inst., Daejeon (Korea)

    2004-07-01

    A nuclear power plant is composed of a number of primary components. Maintaining the integrity of these components is one of the most critical issues in the nuclear industry. In order to maintain the integrity of these primary components, a complicated procedure is required including periodical in-service inspection, failure assessment, fracture mechanics analysis, etc. Also, experts in different fields have to co-operate to resolve the integrity issues on the basis of inspection results. This integrity evaluation process usually takes a long time and is thus detrimental to plant productivity. Therefore, an effective safety evaluation system is essential to manage integrity issues on a nuclear power plant. In this paper, a web-based integrity evaluation system for primary components in a nuclear power plant is proposed. The proposed system, which is named WEBIES (web-based integrity evaluation system), has been developed in the form of a 3-tier system architecture. The system consists of three servers: an application program server, a user interface program server and a data warehouse server. The application program server includes the defect acceptance analysis module and the fracture mechanics analysis module, which are programmed on the basis of ASME Sec. XI, Appendix A. The data warehouse server provides data for the integrity evaluation including material properties, geometry information, inspection data and stress data. The user interface program server provides information to all co-workers in the field of integrity evaluation. The developed system provides engineering knowledge-based information and a concurrent and collaborative working environment through the internet, and thus, is expected to raise the efficiency of integrity evaluation procedures on primary components of a nuclear power plant. (orig.)

  15. Development of web-based integrity evaluation system for primary components in a nuclear power plant

    International Nuclear Information System (INIS)

    Lee, S.M.; Kim, J.C.; Choi, J.B.; Kim, Y.J.; Choi, S.N.; Jang, K.S.; Hong, S.Y.

    2004-01-01

    A nuclear power plant is composed of a number of primary components. Maintaining the integrity of these components is one of the most critical issues in the nuclear industry. In order to maintain the integrity of these primary components, a complicated procedure is required including periodical in-service inspection, failure assessment, fracture mechanics analysis, etc. Also, experts in different fields have to co-operate to resolve the integrity issues on the basis of inspection results. This integrity evaluation process usually takes a long time and is thus detrimental to plant productivity. Therefore, an effective safety evaluation system is essential to manage integrity issues on a nuclear power plant. In this paper, a web-based integrity evaluation system for primary components in a nuclear power plant is proposed. The proposed system, which is named WEBIES (web-based integrity evaluation system), has been developed in the form of a 3-tier system architecture. The system consists of three servers: an application program server, a user interface program server and a data warehouse server. The application program server includes the defect acceptance analysis module and the fracture mechanics analysis module, which are programmed on the basis of ASME Sec. XI, Appendix A. The data warehouse server provides data for the integrity evaluation including material properties, geometry information, inspection data and stress data. The user interface program server provides information to all co-workers in the field of integrity evaluation. The developed system provides engineering knowledge-based information and a concurrent and collaborative working environment through the internet, and thus, is expected to raise the efficiency of integrity evaluation procedures on primary components of a nuclear power plant. (orig.)

  16. Electromagnetic Field Analysis of an Electric Dipole Antenna Based on a Surface Integral Equation in Multilayered Dissipative Media

    Directory of Open Access Journals (Sweden)

    Yidong Xu

    2017-07-01

    Full Text Available In this paper, a novel method based on the Poggio–Miller–Chang–Harrington–Wu–Tsai (PMCHWT) integral equation is presented to study the electromagnetic fields excited by vertical or horizontal electric dipoles in the presence of a layered region which consists of K-layered dissipative media and the air above. To transform the continuous integral equation into a block tridiagonal matrix with the feature of convenient solution, the Rao–Wilton–Glisson (RWG) functions are introduced as expansion and testing functions. The electromagnetic fields excited by an electric dipole are calculated and compared with the available results, where the electric dipole antenna is buried in the non-planar air–sea–seabed, air–rock–earth–mine, and multilayered sphere structures. The analysis and computations demonstrate that the method exhibits high accuracy and solving performance in the near field propagation region.

  17. Influencing Factors and Development Trend Analysis of China Electric Grid Investment Demand Based on a Panel Co-Integration Model

    OpenAIRE

    Jinchao Li; Lin Chen; Yuwei Xiang; Jinying Li; Dong Peng

    2018-01-01

    Electric grid investment demand analysis is significant for reasonably arranging construction funds for the electric grid and reducing costs. This paper used the panel data of electric grid investment from 23 provinces of China between 2004 and 2016 as samples to analyze the relationships between electric grid investment demand and GDP, population scale, social electricity consumption, installed electrical capacity, and peak load based on co-integration tests. We find that GDP and peak load have pos...

  18. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Science.gov (United States)

    Carazzolle, Marcelo Falsarella; de Carvalho, Lucas Miguel; Slepicka, Hugo Henrique; Vidal, Ramon Oliveira; Pereira, Gonçalo Amarante Guimarães; Kobarg, Jörg; Meirelles, Gabriela Vaz

    2014-01-01

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates a XGMML file that can be imported into Cytoscape or be visualized directly on the web. We have developed IIS by the integration of diverse databases following the need of appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two

  19. Development of a three dimensional elastic plastic analysis system for the integrity evaluation of nuclear power plant components

    International Nuclear Information System (INIS)

    Huh, Nam Su; Im, Chang Ju; Kim, Young Jin; Pyo, Chang Ryul; Park, Chi Yong

    2000-01-01

    In order to evaluate the integrity of nuclear power plant components, analysis based on fracture mechanics is crucial. For this purpose, the finite element method is popularly used to obtain the J-integral. However, it is time consuming to design the finite element model of a cracked structure. Also, the J-integral should be verified by alternative methods, since it may differ depending on the calculation method. The objective of this paper is to develop a three-dimensional elastic-plastic J-integral analysis system, named EPAS. The EPAS program consists of an automatic mesh generator for a through-wall crack and a surface crack, a solver based on the ABAQUS program, and a J-integral calculation program which provides DI (Domain Integral) and EDI (Equivalent Domain Integral) based J-integral calculation. Using the EPAS program, an optimized finite element model for a cracked structure can be generated and the corresponding J-integral obtained subsequently.

  20. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  1. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  2. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods.

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E; Re, Matteo

    2014-06-01

    In the context of "network medicine", gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works have proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic study has focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different "informativeness" embedded in different functional networks, outperforms unweighted integration at the 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further biomedical investigation. Network integration is necessary to boost the performance of gene prioritization methods. Moreover, the methods based on kernelized score functions can further enhance disease gene ranking results, by adopting both
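The random-walk-with-restart step used above can be sketched in a few lines. This is a minimal illustration on a hypothetical 5-gene network with an assumed restart probability of 0.3; it is not the nine networks or the kernelized score functions from the paper:

```python
import numpy as np

# Toy sketch of random walk with restart (RWR) for gene prioritization.
# The 5-gene adjacency matrix and restart probability are illustrative
# assumptions, not data from the study.
def random_walk_with_restart(A, seeds, restart=0.3, tol=1e-10, max_iter=1000):
    """Return steady-state visit probabilities over the network nodes."""
    W = A / A.sum(axis=0, keepdims=True)        # column-normalize the adjacency matrix
    p0 = np.zeros(A.shape[0])
    p0[seeds] = 1.0 / len(seeds)                # restart vector concentrated on seed genes
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:      # converged to the stationary distribution
            break
        p = p_next
    return p

# 5-node toy network: genes 0-1-2 form a triangle; 3 and 4 hang off node 2.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
scores = random_walk_with_restart(A, seeds=[0])
ranking = np.argsort(-scores)                   # candidate genes ranked by proximity to the seed
```

Genes closer to the seed in the network topology receive higher scores, which is the guilt-by-association principle the abstract refers to.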

  3. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    Science.gov (United States)

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), an integrated computational platform comprising a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.

  4. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
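The minimax concave penalty used as the outer penalty above has a simple univariate thresholding rule that coordinate-descent algorithms apply one coordinate at a time. A minimal sketch, with illustrative `lam` and `gamma` values rather than anything from the study:

```python
# Univariate MCP thresholding rule: soft-thresholding is rescaled for
# small inputs, while large inputs pass through unshrunk (which gives
# MCP its near-unbiasedness for strong signals). The penalty parameter
# lam and concavity parameter gamma below are illustrative assumptions.
def mcp_threshold(z, lam, gamma):
    if abs(z) <= gamma * lam:
        soft = max(abs(z) - lam, 0.0)          # ordinary soft-thresholding
        return (soft / (1.0 - 1.0 / gamma)) * (1 if z > 0 else -1)
    return z                                    # beyond gamma*lam: no shrinkage

small = mcp_threshold(0.5, lam=1.0, gamma=3.0)  # killed to zero
mid = mcp_threshold(2.0, lam=1.0, gamma=3.0)    # shrunk, then rescaled
large = mcp_threshold(5.0, lam=1.0, gamma=3.0)  # unshrunk
```

A coordinate-descent solver would apply this rule to each coefficient's partial residual in turn until convergence.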

  5. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during OLAP analysis by proposing the forthcoming analysis step. We present a context-aware preference model that matches decision-makers' intuition, and we discuss a preference-based approach for generating personalized recommendations.

  6. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Rapid progress in intelligent sensing technology creates new interest in the development of analysis and design of non-conventional sampling schemes. The investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. Related works in adaptive sampling are summarized. The analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and the asymptotic integral sampling effectiveness, are derived. Simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
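The pure linear send-on-delta scheme that the integral criterion extends can be sketched as follows; the test signal and threshold below are illustrative assumptions, not the signals analyzed in the paper:

```python
import math

# Minimal sketch of send-on-delta (level-crossing) sampling: a sample is
# reported only when the signal has moved by at least `delta` since the
# last reported value, so event traffic tracks signal activity.
def send_on_delta(samples, delta):
    """Return (index, value) events for level crossings of size delta."""
    events = [(0, samples[0])]          # always report the initial state
    last = samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) >= delta:      # threshold crossing -> emit an event
            events.append((i, x))
            last = x
    return events

# A slowly varying sine: few events where the slope is small, more where it is steep.
signal = [math.sin(2 * math.pi * t / 100) for t in range(100)]
events = send_on_delta(signal, delta=0.2)
```

The mean event rate grows with the signal's slope, which is the quantity the paper's closed-form traffic formulas characterize.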

  7. Comparative analysis of the influence of creep of concrete composite beams of steel - concrete model based on Volterra integral equation

    Directory of Open Access Journals (Sweden)

    Partov Doncho

    2017-01-01

    The paper presents an analysis of the stress-strain behaviour and deflection changes due to creep in a statically determinate composite steel-concrete beam according to the EUROCODE 2, ACI209R-92 and Gardner & Lockman models. The mathematical model involves the equations of equilibrium, compatibility and constitutive relationships, i.e. an elastic law for the steel part and an integral-type creep law of Boltzmann - Volterra for the concrete part, considering the above mentioned models. On the basis of the theory of the viscoelastic body of Maslov-Arutyunian-Trost-Zerna-Bažant for determining the redistribution of stresses in the beam section between concrete plate and steel beam with respect to time 't', two independent Volterra integral equations of the second kind have been derived. A numerical method based on linear approximation of the singular kernel function in the integral equation is presented. An example with the proposed model is investigated.
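A Volterra integral equation of the second kind like the ones derived above can be solved numerically by approximating the integral on a grid. The sketch below uses the trapezoidal rule with a toy kernel for which the exact solution is known; it is not the creep kernel from the paper:

```python
import math

# Numerical solution of a Volterra integral equation of the second kind,
#     y(t) = f(t) + integral_0^t K(t, s) y(s) ds,
# by the trapezoidal rule. At each step the implicit diagonal term is
# solved for algebraically. The kernel K and forcing f are toy choices.
def solve_volterra(f, K, t_end, n):
    h = t_end / n
    t = [i * h for i in range(n + 1)]
    y = [f(t[0])]                                 # y(0) = f(0), the integral vanishes
    for m in range(1, n + 1):
        s = 0.5 * K(t[m], t[0]) * y[0]            # trapezoid endpoint weight
        s += sum(K(t[m], t[j]) * y[j] for j in range(1, m))
        y_m = (f(t[m]) + h * s) / (1 - 0.5 * h * K(t[m], t[m]))
        y.append(y_m)
    return t, y

# With K = 1 and f = 1 the equation is y(t) = 1 + integral_0^t y(s) ds,
# whose exact solution is y(t) = exp(t).
t, y = solve_volterra(lambda t: 1.0, lambda t, s: 1.0, t_end=1.0, n=100)
```

The trapezoidal scheme is second-order accurate, so with 100 steps the endpoint value agrees with exp(1) to roughly four digits.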

  8. STINGRAY: system for integrated genomic resources and analysis.

    Science.gov (United States)

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

    The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system tends to be faster with Sanger data, since large NGS datasets could potentially slow down MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  9. Integral data analysis for resonance parameters determination

    International Nuclear Information System (INIS)

    Larson, N.M.; Leal, L.C.; Derrien, H.

    1997-09-01

    Neutron time-of-flight experiments have long been used to determine resonance parameters. Those resonance parameters have then been used in calculations of integral quantities such as Maxwellian averages or resonance integrals, and the results of those calculations in turn have been used as a criterion for acceptability of the resonance analysis. However, the calculations were inadequate because covariances on the parameter values were not included. In this report an effort to correct that deficiency is documented: the R-matrix analysis code SAMMY has been modified (1) to include integral quantities of importance directly within the resonance parameter analysis, and (2) to determine the best fit to both differential (microscopic) and integral (macroscopic) data simultaneously. This modification was implemented because it is expected to have an impact on the intermediate-energy range that is important for criticality safety applications.

  10. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
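The isotherms being globally fitted in such a protocol ultimately derive from a binding model. As a hedged illustration of the simplest case, the sketch below computes a 1:1 binding isotherm and per-injection heats; the Kd, enthalpy and titration schedule are assumed values, not parameters from the protocol:

```python
import math

# 1:1 binding isotherm underlying a single ITC titration: for total
# macromolecule concentration M_t and total ligand L_t, the bound-complex
# concentration is the physical root of the quadratic mass-balance equation
#     [MB]^2 - (M_t + L_t + Kd) [MB] + M_t L_t = 0.
def bound_concentration(M_t, L_t, Kd):
    b = M_t + L_t + Kd
    return (b - math.sqrt(b * b - 4.0 * M_t * L_t)) / 2.0

# The heat of injection i is proportional to the change in bound complex
# produced by that injection (volume/dilution effects neglected here).
def injection_heats(M_t, Kd, dH, ligand_totals):
    bound = [bound_concentration(M_t, L, Kd) for L in ligand_totals]
    return [dH * (bound[i] - bound[i - 1]) for i in range(1, len(bound))]

# Illustrative schedule: 20 injections of 5 uM ligand into 10 uM macromolecule.
L_totals = [i * 5e-6 for i in range(21)]
q = injection_heats(M_t=1e-5, Kd=1e-6, dH=-50.0, ligand_totals=L_totals)
```

Plotting `q` against the molar ratio gives the familiar sigmoidal isotherm; global fitting, as in the protocol, adjusts Kd and dH (and shared parameters across experiments) to match the integrated peak heats.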

  11. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    Science.gov (United States)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    A computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis is described. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic and geological maps were used as a database to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.

  12. Integrated optical 3D digital imaging based on DSP scheme

    Science.gov (United States)

    Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.

    2008-03-01

    We present a scheme of integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently without PC support. This scheme relies on a parallel hardware structure built from a DSP and a field programmable gate array (FPGA) to realize 3-D imaging. In this integrated scheme, phase measurement profilometry is adopted. To realize pipeline processing of fringe projection, image acquisition and fringe pattern analysis, we present a multi-threaded application program developed under the DSP/BIOS RTOS (real-time operating system). The RTOS provides a preemptive kernel and a powerful configuration tool, with which real-time scheduling and synchronization are achieved. To accelerate automatic fringe analysis and phase unwrapping, we make use of software optimization techniques. The proposed scheme can reach a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and can implement fast 3-D imaging. Experimental results are also presented to show the validity of the proposed scheme.
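The fringe-analysis core of phase measurement profilometry reduces, in the common four-step variant, to an arctangent of intensity differences. The sketch below uses a synthetic single-pixel signal; note the four-step algorithm is an assumption here, since the abstract does not specify which phase-shifting variant is used:

```python
import math

# Four-step phase-shifting: with four fringe patterns shifted by pi/2,
# I_k = A + B*cos(phi + k*pi/2), the wrapped phase at each pixel follows
# from an arctangent of intensity differences (the DC term A cancels).
def four_step_phase(I1, I2, I3, I4):
    """Wrapped phase (radians) from four pi/2-shifted intensity samples."""
    return math.atan2(I4 - I2, I1 - I3)   # I4-I2 = 2B sin(phi), I1-I3 = 2B cos(phi)

# Synthesize the four intensity samples for one pixel (A, B illustrative).
def intensities(phi, A=2.0, B=1.0):
    return [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]

phi_true = 0.7
I1, I2, I3, I4 = intensities(phi_true)
phi_est = four_step_phase(I1, I2, I3, I4)
```

In a full pipeline this per-pixel step is followed by phase unwrapping, the stage the abstract says is accelerated by software optimization on the DSP.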

  13. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    Science.gov (United States)

    Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-12-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud makes it possible to deploy software using the Cern Virtual Machine (CernVM) and CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage by local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD based storage element (SE). A key feature of the solution is that Ceph serves as a backend for the Cinder Block Storage service of OpenStack and, at the same time, as a storage backend for XRootD, with redundancy and availability of data preserved by Ceph settings. For faster and easier OpenStack deployment, the Packstack solution was applied, which is based on the Puppet configuration management system. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and by small organizations, which usually lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.

  14. Towards risk-based structural integrity methods for PWRs

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Lloyd, R.B.

    1992-01-01

    This paper describes the development of risk-based structural integrity assurance methods and their application to Pressurized Water Reactor (PWR) plant. In-service inspection is introduced as a way of reducing the failure probability of high risk sites and the latter are identified using reliability analysis; the extent and interval of inspection can also be optimized. The methodology is illustrated by reference to the aspect of reliability of weldments in PWR systems. (author)

  15. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

    The purpose of the article is to provide a research tool for an initial assessment of whether a company’s integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, and its empirical verification. In particular, the research addresses whether the reports meet the goal of improving the quality of information available and covering all factors that influence the organization’s ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out for reports prepared by selected companies operating in the utilities sector. Answering the questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. As a result of the analysis of selected integrated reports, it was found that the various elements of the report are presented with different levels of accuracy in different reports. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without links to other parts of the report, e.g. risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  16. A numerical integration-based yield estimation method for integrated circuits

    International Nuclear Information System (INIS)

    Liang Tao; Jia Xinzhang

    2011-01-01

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)
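The sampling side of such a method can be illustrated with plain (non-orthogonal-array) Latin hypercube sampling over a toy two-parameter acceptability region; the spec function below is a hypothetical stand-in for the circuit performances in the paper, and no Box-Cox step is included:

```python
import random

# Toy yield estimation with Latin hypercube sampling (LHS): each dimension
# is stratified into n equal intervals, one sample per interval, and the
# strata are randomly paired across dimensions. The "circuit" here is a
# hypothetical 2-parameter spec, not the OTA-C filter from the paper.
def latin_hypercube(n, dims, rng):
    """n stratified points in the unit hypercube [0, 1)^dims."""
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)                       # random pairing of strata
        cols.append([(k + rng.random()) / n for k in perm])
    return list(zip(*cols))

def passes_spec(x, y):
    # Hypothetical acceptability region of the disturbance space:
    # a disc of radius 0.4 centered in the unit square (area ~0.5027).
    return (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.16

def estimate_yield(samples):
    return sum(passes_spec(x, y) for x, y in samples) / len(samples)

rng = random.Random(42)
lhs_yield = estimate_yield(latin_hypercube(2000, 2, rng))
```

Stratification reduces the variance of the yield estimate relative to plain Monte Carlo at the same sample count, which is the motivation for the OA-MLHS scheme in the abstract.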

  17. A numerical integration-based yield estimation method for integrated circuits

    Energy Technology Data Exchange (ETDEWEB)

    Liang Tao; Jia Xinzhang, E-mail: tliang@yahoo.cn [Key Laboratory of Ministry of Education for Wide Bandgap Semiconductor Materials and Devices, School of Microelectronics, Xidian University, Xi'an 710071 (China)

    2011-04-15

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)

  18. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  19. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. Summaries of conclusions, analytical processes, and analytical results. Analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico in support of the WIPP Technical Assessment Team (TAT) activities to determine, to the extent feasible, the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, and draws conclusions based on those results: (1) Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003, Pacific Northwest National Laboratory, December 2014; (2) Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory, December 2014; (3) Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory, January 2015. This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  20. A Numerical Study of Quantization-Based Integrators

    Directory of Open Access Journals (Sweden)

    Barros Fernando

    2014-01-01

    Adaptive step size solvers are nowadays considered fundamental to achieving efficient ODE integration. While, traditionally, ODE solvers have been designed based on discrete time machines, new approaches based on discrete event systems have been proposed. Quantization provides an efficient integration technique based on signal threshold crossing, leading to independent and modular solvers communicating through discrete events. These solvers can benefit from the large body of knowledge on discrete event simulation techniques, such as parallelization, to obtain efficient numerical integration. In this paper we introduce new solvers based on quantization and adaptive sampling techniques. Preliminary numerical results comparing these solvers are presented.
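The quantization idea above can be sketched with a first-order quantized-state (QSS1-style) integrator, where steps are driven by quantum crossings instead of a fixed time step. This is a simplified scalar sketch; the paper's solvers and adaptive sampling schemes are more elaborate:

```python
import math

# QSS1-style integration of dx/dt = f(x): the derivative is held constant
# until the state drifts one quantum away from its last reported value,
# so step times are determined by threshold crossings (events), not by a
# prescribed time grid. The ODE and quantum below are illustrative.
def qss1(f, x0, quantum, t_end):
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while t < t_end:
        dx = f(x)                      # derivative frozen between events
        if dx == 0.0:
            break                      # no drift: no further events
        dt = quantum / abs(dx)         # time until the state moves one quantum
        remaining = t_end - t
        if dt >= remaining:
            dt = remaining
            t = t_end                  # land exactly on the horizon
        else:
            t += dt
        x += dx * dt
        times.append(t)
        states.append(x)
    return times, states

# Decaying exponential dx/dt = -x, exact solution x(t) = exp(-t).
times, states = qss1(lambda x: -x, x0=1.0, quantum=0.01, t_end=1.0)
```

Note how steps lengthen automatically as the solution flattens, the event-driven analogue of adaptive step size control.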

  1. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. 
Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools.

  2. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631

  3. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index—LSI) approaches is presented. Factors related to the occurrence of landslides—such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA)—were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis using an independent validation dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at the regional scale.
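The reported 76% prediction ability corresponds to the area under the ROC curve on the validation landslides. A rank-based AUC can be computed directly; this is a generic sketch with invented scores, not the study's data:

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC as the probability that a random positive case (landslide)
    gets a higher susceptibility score than a random negative one
    (Mann-Whitney rank formulation; assumes no tied scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# hypothetical LSI scores: 2 observed landslide cells, 2 stable cells
scores = np.array([0.9, 0.4, 0.6, 0.1])
labels = np.array([1, 1, 0, 0])
auc = roc_auc(scores, labels)
```

An AUC of 0.5 means the map is no better than chance; 1.0 means every landslide cell outscores every stable cell.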

  4. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allows for the discovery of highly significant pathways from a set of disease vs. healthy samples in NGS and GWAS. Knowledge of the activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathways, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis within the context of existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
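Criterion 4, quantitative measurement of enrichment, is conventionally scored with a hypergeometric tail probability. A stdlib-only sketch of that statistic (illustrative; not claimed to be IPAD's exact implementation):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """One-sided hypergeometric P(X >= k): of N genes in the background,
    K belong to the pathway; a query list of n genes overlaps it in k.
    A small p-value means the overlap exceeds chance, i.e. the pathway
    is enriched in the query list."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

For example, an overlap of 10 genes out of a 50-gene list against a 100-gene pathway in a 20,000-gene background is far less probable by chance than an overlap of 2, so it receives a much smaller p-value.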

  5. J-integral evaluation and stability analysis in the unstable ductile fracture

    International Nuclear Information System (INIS)

    Miyoshi, Toshiro; Yoshida, Yuichiro; Shiratori, Masaki.

    1984-01-01

Unstable ductile fracture is an important structural-integrity problem for line pipes, nuclear reactor piping and similar components, and research interest centers on the fracture mechanics parameters that govern the onset of stable and of unstable crack growth. Parameters proposed to date include the tearing modulus (T-modulus) based on the J-integral, the crack tip opening angle (CTOA), the crack opening angle (COA) averaged over the grown portion of the crack, and the plastic work coefficient. Research on the effectiveness and interrelation of these parameters divides into generation-phase and application-phase studies, which have reported that the T-modulus, CTOA and COA all take nearly constant values during crack growth, except in the initial transition period. Deciding which parameter is most appropriate requires more detailed analysis. In this study, the unstable ductile fracture of a center-cracked test piece and a small tensile test piece was analyzed by the finite element method, examining the evolution of the J-integral with crack growth, the J-integral resistance value under the assumption of constant COA, the form of the unstable fracture initiation point, and its compliance dependence. The method of analysis, the evaluation of the J-integral, the J-integral resistance value, the unstable fracture initiation point and the stability diagram are described. (Kako, I.)

  6. Research and Application of Construction of Operation Integration for Smart Power Distribution and Consumption Based on “Integration of Marketing with Distribution”

    Directory of Open Access Journals (Sweden)

    Zhenbao Feng

    2014-05-01

Full Text Available The “information integrated platform of the marketing and distribution integration system” researched and developed in this article is an advanced application platform for concurrently designing and developing marketing and power distribution automation, built through the integration and analysis of existing data on the data platform of the Jiaozuo Power Supply Corporation. It uses data mining and data bus technology for the uniform analysis of comprehensive marketing and distribution data, and conducts real-time monitoring of power utilization information for marketing together with early-warning maintenance of the power distribution business, according to an electric business model. This realizes the integration of the marketing and distribution business, achieves the target of integrated marketing and distribution operation, improves the operational level of the business, reduces distribution grid maintenance costs, increases distribution grid electricity sales, and provides a reliable practical basis for the operation and maintenance of Jiaozuo power marketing and distribution.

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree capabilities used for event tree rules, recovery rules, and end state partitioning.
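The cut set generation mentioned above can be illustrated with a minimal top-down (MOCUS-style) expansion over AND/OR gates. This is a toy sketch with invented gate names, not the SAPHIRE/IRRAS algorithm:

```python
def cut_sets(node, gates):
    """Minimal cut sets of a coherent fault tree.

    `gates` maps a gate name to ("AND" | "OR", children); any name not
    in `gates` is a basic event. OR gates contribute alternatives,
    AND gates combine one choice per child; supersets are dropped
    at the end so only minimal cut sets remain."""
    if node not in gates:
        return [{node}]
    op, children = gates[node]
    child_sets = [cut_sets(c, gates) for c in children]
    if op == "OR":
        sets = [s for cs in child_sets for s in cs]
    else:  # AND: cross-product union of child cut sets
        sets = [set()]
        for cs in child_sets:
            sets = [a | b for a in sets for b in cs]
    return [s for s in sets if not any(t < s for t in sets)]

# invented toy tree: TOP = AND(G1, G2), G1 = OR(A, B), G2 = OR(A, C)
gates = {"TOP": ("AND", ["G1", "G2"]),
         "G1": ("OR", ["A", "B"]),
         "G2": ("OR", ["A", "C"])}
mcs = cut_sets("TOP", gates)
```

Here the shared basic event A forms a single-event cut set on its own, while B and C only cause the top event together.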

  8. A fully integrated optical detector with a-Si:H based color photodiodes

    Energy Technology Data Exchange (ETDEWEB)

    Watty, Krystian; Merfort, Christian; Seibel, Konstantin; Schoeler, Lars; Boehm, Markus [Institute for Microsystem Technologies (IMT), University of Siegen, Hoelderlinstr. 3, 57076 Siegen (Germany)

    2010-03-15

The fabrication of an electrophoresis separation microchip with a monolithically integrated excitation light source and variospectral photodiodes for absorption detection is presented in this paper. Microchip based separation techniques are essential elements in the development of fully integrated micro-total analysis systems ({mu}-TAS). An integrated microfluidic device, like an application specific lab-on-microchip (ALM) (Seibel et al., in: MRS Spring Meeting, San Francisco, USA, 2005 1), includes all components necessary to perform a chemical analysis on chip, and it can be used as a stand-alone unit directly at the point of sampling. Variospectral diodes based on hydrogenated amorphous silicon (a-Si:H) technology allow for advanced optical detection schemes, because the spectral sensitivity of the devices can be tailored to fit the emission of specific fluorescent markers. Important features of a-Si:H variospectral photodiodes are a high dynamic range, a bias-tunable spectral sensitivity and a very good linearity for the separation of mixed color signals. Principle of ALM device. (Abstract Copyright [2010], Wiley Periodicals, Inc.)

  9. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. 
This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
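Steps 6-8 amount to nonparametric Monte Carlo propagation, where the number of runs for a given percentile and confidence level is commonly taken from Wilks' formula. The sketch below is illustrative only, not code from the IUA package; the toy model and distributions are invented:

```python
import random

def wilks_runs(gamma=0.95, beta=0.95):
    """Smallest n with 1 - gamma**n >= beta: with n runs, the sample
    maximum bounds the gamma-quantile of the output with confidence
    beta (first-order, one-sided Wilks formula)."""
    n = 1
    while 1 - gamma ** n < beta:
        n += 1
    return n

def upper_tolerance_limit(model, param_dists, rng, gamma=0.95, beta=0.95):
    """Sample each uncertain parameter from its PDF, run the model
    repeatedly, and return the sample maximum as the one-sided bound."""
    runs = wilks_runs(gamma, beta)
    outs = [model(**{p: draw(rng) for p, draw in param_dists.items()})
            for _ in range(runs)]
    return max(outs)

# toy 'code': the output depends on two uniformly uncertain parameters
rng = random.Random(0)
dists = {"a": lambda r: r.uniform(0.0, 1.0),
         "b": lambda r: r.uniform(0.0, 1.0)}
bound = upper_tolerance_limit(lambda a, b: a + b, dists, rng)
```

The well-known 59-run recipe for a 95th-percentile / 95%-confidence band falls out of `wilks_runs()` directly.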

  10. Semantic web for integrated network analysis in biomedicine.

    Science.gov (United States)

    Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y

    2009-03-01

    The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing the Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy analysis and herb-drug interactions analysis.
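The core idea of semantic graph mining, combining graph traversal with ontology typing, can be caricatured by restricting each hop to nodes of the class the ontology expects. All names below are invented; real systems use RDF/OWL stores and reasoners rather than Python dicts:

```python
def typed_reachable(edges, node_type, start, pattern):
    """Follow directed edges from `start`, one hop per entry in
    `pattern`, keeping only targets whose ontology class matches.
    A toy stand-in for ontology-constrained graph mining."""
    frontier = {start}
    for wanted in pattern:
        frontier = {dst for src, dst in edges
                    if src in frontier and node_type[dst] == wanted}
    return frontier

# invented mini-network: gene -> pathway -> disease
edges = [("BRCA1", "DNA_repair"), ("BRCA1", "metabolism"),
         ("DNA_repair", "breast_cancer")]
node_type = {"BRCA1": "gene", "DNA_repair": "pathway",
             "metabolism": "pathway", "breast_cancer": "disease"}
hits = typed_reachable(edges, node_type, "BRCA1", ["pathway", "disease"])
```

The type pattern plays the role of the ontology constraint: only gene-pathway-disease chains survive, which is the flavor of query behind disease-causal gene analysis.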

  11. Integrating computer aided radiography and plantar pressure measurements for complex gait analysis

    International Nuclear Information System (INIS)

    Gefen, A.; Megido-Ravid, M.; Itzchak, Y.; Arcan, M.

    1998-01-01

Radiographic Fluoroscopy (DRF) and Contact Pressure Display (CPD). The CPD method uses a birefringent integrated optical sandwich for contact stress analysis, e.g. plantar pressure distribution. The DRF method displays and electronically records skeletal motion using X-ray radiation, providing the exact bone and joint positions during gait. By integrating the two techniques, the contribution of each segment to the HFS behavior may be studied by applying image processing and analysis techniques. The combined resulting data may be used not only to detect and diagnose gait pathologies but also as a base for the development of advanced numerical models of the foot structure

  12. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  13. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for the dynamic operational characteristics. This allows the integration of existing drive systems with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  14. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Research report, 20161001 - 20161030, submitted to the National Institute of Aerospace (NIA); award W911NF-16-2-0229. Cedric Justin, Youngjun

  15. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  16. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  17. Integrated Genome-Based Studies of Shewanella Ecophysiology

    Energy Technology Data Exchange (ETDEWEB)

    Andrei L. Osterman, Ph.D.

    2012-12-17

Integration of bioinformatics and experimental techniques was applied to the mapping and characterization of the key components (pathways, enzymes, transporters, regulators) of the core metabolic machinery in Shewanella oneidensis and related species, with the main focus on metabolic and regulatory pathways involved in the utilization of various carbon and energy sources. Among the main accomplishments, reflected in ten joint publications with other participants of the Shewanella Federation, are: (i) a systems-level reconstruction of carbohydrate utilization pathways in the genus Shewanella (19 species). This analysis yielded the reconstruction of 18 sugar utilization pathways, including 10 novel pathway variants, and the prediction of >60 novel protein families of enzymes, transporters and regulators involved in these pathways. Selected functional predictions were verified by focused biochemical and genetic experiments, and observed growth phenotypes were consistent with the bioinformatic predictions, providing strong validation of the technology; and (ii) a global genomic reconstruction of transcriptional regulons in 16 Shewanella genomes. The inferred regulatory network includes 82 transcription factors, 8 riboswitches and 6 translational attenuators. Of those, 45 regulons were inferred directly from genome context analysis, whereas others were propagated from previously characterized regulons in other species. Selected regulatory predictions were experimentally tested. Integration of this analysis with microarray data revealed overall consistency and provided an additional layer of interactions between regulons. All the results were captured in the new database RegPrecise, a joint development with the LBNL team. A more detailed analysis of the individual subsystems, pathways and regulons in Shewanella spp. included bioinformatics-based prediction and experimental characterization of: (i) the N-acetylglucosamine catabolic pathway; (ii) the lactate utilization machinery; (iii) novel Nrt

  18. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  19. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  20. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  1. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  2. Vertical integration and market power: A model-based analysis of restructuring in the Korean electricity market

    International Nuclear Information System (INIS)

    Bunn, Derek W.; Martoccia, Maria; Ochoa, Patricia; Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom

    2010-01-01

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration.

  3. Vertical integration and market power. A model-based analysis of restructuring in the Korean electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Bunn, Derek W.; Martoccia, Maria; Ochoa, Patricia [London Business School, London (United Kingdom); Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom [Korean Electric Power Corporation, Seoul (Korea)

    2010-07-15

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration. (author)

  4. Vertical integration and market power: A model-based analysis of restructuring in the Korean electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Bunn, Derek W., E-mail: dbunn@london.ed [London Business School, London (United Kingdom); Martoccia, Maria; Ochoa, Patricia [London Business School, London (United Kingdom); Kim, Haein; Ahn, Nam-Sung; Yoon, Yong-Beom [Korean Electric Power Corporation, Seoul (Korea, Republic of)

    2010-07-15

    An agent-based simulation model is developed using computational learning to investigate the impact of vertical integration between electricity generators and retailers on market power in a competitive wholesale market setting. It is observed that if partial vertical integration creates some market foreclosure, whether this leads to an increase or decrease in market power is situation specific. A detailed application to the Korean market structure reveals this to be the case. We find that in various cases, whilst vertical integration generally reduces spot prices, it can increase or decrease the market power of other market generators, depending upon the market share and the technology segment of the market, which is integrated, as well as the market concentrations before and after the integration.

  5. Integrative Analysis of Metabolic Models – from Structure to Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de [Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Gatersleben (Germany); Schreiber, Falk [Monash University, Melbourne, VIC (Australia); Martin-Luther-University Halle-Wittenberg, Halle (Germany)

    2015-01-26

The characterization of biological systems with respect to their behavior and functionality, based on versatile biochemical interactions, is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.
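One standard structural analysis of a metabolic model reduces to linear algebra: steady state requires S·v = 0, so the admissible flux distributions span the null space of the stoichiometric matrix S. The sketch below uses a generic three-reaction toy network, not code from the framework described:

```python
import numpy as np

# toy pathway: R1 imports metabolite A, R2 converts A -> B, R3 exports B
# rows = metabolites (A, B), columns = reactions (R1, R2, R3)
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

# null space via SVD: rows of Vt beyond rank(S) span {v : S v = 0}
_, sv, Vt = np.linalg.svd(S)
rank = int(np.sum(sv > 1e-10))
null_basis = Vt[rank:].T            # each column is a steady-state flux mode

# for a linear chain, the only mode is uniform flux: v1 = v2 = v3
mode = null_basis[:, 0]
```

Qualitative analyses like this need only the network structure; the dynamic formalisms the abstract mentions then add rate laws on top of the same S.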

  6. Integrated oncogeriatric approach: a systematic review of the literature using concept analysis.

    Science.gov (United States)

    Tremblay, Dominique; Charlebois, Kathleen; Terret, Catherine; Joannette, Sonia; Latreille, Jean

    2012-01-01

    The purpose of this study was to provide a more precise definition of an integrated oncogeriatric approach (IOGA) through concept analysis. The literature from January 2005 to April 2011 was reviewed, integrating three broad terms: geriatric oncology, multidisciplinarity and integrated care delivery models. Citation selection was based on: (1) elderly cancer patients as the study population; (2) disease management; and (3) case studies, intervention studies, assessments and evaluations. Inclusion and exclusion criteria were refined in the course of the literature search. The interventions considered were initiatives in geriatric oncology relating to oncology services, social support services and primary care services; the population was elderly cancer patients aged 70 years or older. Rodgers' concept analysis method was used for this study. The analysis was carried out as a thematic analysis based on the elements of the Chronic Care Model. The search identified 618 citations. After in-depth appraisal of 327 potential citations, 62 articles that met our inclusion criteria were included in the analysis. Three main IOGA attributes were identified, which constitute IOGA's core aspects: geriatric assessment (GA), comorbidity burden and treatment outcomes. The IOGA concept comprises two broad antecedents: coordinated healthcare delivery and primary supportive care services. Regarding the consequents of an integrated approach in geriatric oncology, the studies reviewed remain inconclusive. Our study highlights the pioneering character of the multidimensional IOGA concept, for which the relationship between clinical and organisational attributes, on the one hand, and contextual antecedents, on the other, is not well understood. We have yet to ascertain IOGA's consequents. Implications of key findings: there is clearly a need for a whole-system approach to change that will provide direction for multilevel (clinical, organisational, strategic) interventions to support

  7. Common integration sites of published datasets identified using a graph-based framework

    Directory of Open Access Journals (Sweden)

    Alessandro Vasciaveo

    2016-01-01

    With next-generation sequencing, the genomic data available for the characterization of integration sites (IS) has dramatically increased. At present, in a single experiment, several thousand viral integration genome targets can be investigated to define genomic hot spots. In a previous article, we reworked the formal CIS analysis, replacing its rigid fixed-window demarcation with a more flexible definition grounded on graphs. Here, we present a selection of supporting data related to the graph-based framework (GBF) from our previous article, in which a collection of common integration sites (CIS) was identified in six published datasets. In this work, we focus on two previously discussed datasets, ISRTCGD and ISHIV. Moreover, we show in more detail the workflow design that produced the datasets.
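As a rough illustration of the graph idea (not the authors' GBF implementation), integration sites can be linked when they fall within a chosen genomic distance, with each connected component reported as a candidate CIS; the sites and the 30 kb threshold below are invented.

```python
from collections import defaultdict

def call_cis(sites, max_gap=30_000):
    """sites: list of (chrom, position); returns candidate CIS as site lists.

    Sites on the same chromosome are graph-connected when separated by at
    most max_gap; connected components with >= 2 sites are reported.
    """
    by_chrom = defaultdict(list)
    for chrom, pos in sites:
        by_chrom[chrom].append(pos)
    components = []
    for chrom, positions in by_chrom.items():
        positions.sort()
        current = [(chrom, positions[0])]
        for prev, pos in zip(positions, positions[1:]):
            if pos - prev <= max_gap:      # edge in the graph
                current.append((chrom, pos))
            else:                          # gap closes the component
                components.append(current)
                current = [(chrom, pos)]
        components.append(current)
    return [c for c in components if len(c) >= 2]

hits = [("chr1", 100), ("chr1", 10_000), ("chr1", 500_000), ("chr2", 42)]
print(call_cis(hits))  # [[('chr1', 100), ('chr1', 10000)]]
```

Because sorted neighbors are checked pairwise, this is the linear-time special case of connected-component search that a distance-threshold graph admits.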

  8. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on the application of the symplectic integrator to numerical fluid analysis. For this purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example of an HPD application, namely the behavior of an incompressible inviscid fluid, is solved. In order to improve the spatial accuracy of HPD, it is combined with CIVA, a highly accurate interpolation method, but the combined method suffers from the problem that the invariants of the system are not conserved in long-time computations. To solve this problem, symplectic time integrators are introduced, and their effectiveness is confirmed by numerical analyses. (author)
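The property the authors rely on can be seen on a one-line Hamiltonian: a symplectic leapfrog step keeps the energy error of H = p²/2 + q²/2 bounded over long runs, where a non-symplectic scheme would drift. A minimal sketch, not the HPD/CIVA method itself:

```python
def leapfrog(q, p, dt, steps, dHdq=lambda q: q):
    """Velocity-Verlet / leapfrog steps for H = p^2/2 + V(q), with dV/dq = dHdq."""
    for _ in range(steps):
        p -= 0.5 * dt * dHdq(q)   # half kick
        q += dt * p               # drift
        p -= 0.5 * dt * dHdq(q)   # half kick
    return q, p

q0, p0 = 1.0, 0.0
E0 = 0.5 * (p0 * p0 + q0 * q0)
q1, p1 = leapfrog(q0, p0, dt=0.05, steps=20_000)  # 1000 time units
E1 = 0.5 * (p1 * p1 + q1 * q1)
print(abs(E1 - E0))  # stays bounded; no secular energy drift
```

An explicit Euler step with the same dt and horizon would inflate the energy by orders of magnitude, which is exactly the invariant-loss problem the abstract describes.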

  9. Probabilistic Steady-State Operation and Interaction Analysis of Integrated Electricity, Gas and Heating Systems

    Directory of Open Access Journals (Sweden)

    Lun Yang

    2018-04-01

    The existing studies on probabilistic steady-state analysis of integrated energy systems (IES) are limited to integrated electricity and gas networks or integrated electricity and heating networks. This paper proposes a probabilistic steady-state analysis of integrated electricity, gas and heating networks (EGH-IES). Four typical operation modes of an EGH-IES are presented first. The probabilistic energy flow problem of the EGH-IES, considering its operation modes and correlated uncertainties in wind/solar power and electricity/gas/heat loads, is then formulated and solved by the Monte Carlo method based on Latin hypercube sampling and the Nataf transformation. Numerical simulations are conducted on a sample EGH-IES working in the “electricity/gas following heat” mode to verify the probabilistic analysis proposed in this paper and to study the effects of uncertainties and correlations on the operation of the EGH-IES, especially uncertainty transmission among the subnetworks.
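The sampling machinery named in the abstract can be sketched in a few lines: Latin hypercube samples are pushed through a Gaussian copula, the core of the Nataf transformation, to impose a target correlation before mapping to physical marginals. The marginals and the 0.8 correlation below are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, d = 2000, 2

# 1) Latin hypercube on [0,1]^d: one sample per stratum in each dimension
strata = np.tile(np.arange(n), (d, 1))
u = (rng.permuted(strata, axis=1).T + rng.random((n, d))) / n

# 2) Nataf / Gaussian-copula step: impose correlation in standard-normal space
z = norm.ppf(u)                           # independent standard normals
R = np.array([[1.0, 0.8], [0.8, 1.0]])    # target correlation (illustrative)
zc = z @ np.linalg.cholesky(R).T
u_dep = norm.cdf(zc)                      # dependent uniforms

# 3) map to physical marginals: wind speed ~ Weibull(k=2), load ~ Normal
wind = 8.0 * (-np.log(1.0 - u_dep[:, 0])) ** 0.5
load = norm.ppf(u_dep[:, 1], loc=100.0, scale=10.0)

sample_corr = float(np.corrcoef(zc.T)[0, 1])  # empirical, near the 0.8 target
print(sample_corr)
```

In a full Nataf transformation the correlation matrix in normal space is additionally corrected for the marginal distortion; that correction is omitted here for brevity.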

  10. A web-based system architecture for ontology-based data integration in the domain of IT benchmarking

    Science.gov (United States)

    Pfaff, Matthias; Krcmar, Helmut

    2018-03-01

    In the domain of IT benchmarking (ITBM), a variety of data and information are collected. Although these data serve as the basis for business analyses, no unified semantic representation of such data yet exists. Consequently, data analysis across different distributed data sets and different benchmarks is almost impossible. This paper presents a system architecture and prototypical implementation for an integrated data management of distributed databases based on a domain-specific ontology. To preserve the semantic meaning of the data, the ITBM ontology is linked to data sources and functions as the central concept for database access. Thus, additional databases can be integrated by linking them to this domain-specific ontology and are directly available for further business analyses. Moreover, the web-based system supports the process of mapping ontology concepts to external databases by introducing a semi-automatic mapping recommender and by visualizing possible mapping candidates. The system also provides a natural language interface to easily query linked databases. The expected result of this ontology-based approach of knowledge representation and data access is an increase in knowledge and data sharing in this domain, which will enhance existing business analysis methods.
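A semi-automatic mapping recommender of the kind described can be approximated with plain lexical similarity: rank database columns against each ontology concept and offer the top candidates for user confirmation. The concept and column names below are invented, not taken from the ITBM ontology.

```python
from difflib import SequenceMatcher

concepts = ["ServerCost", "StorageCapacity", "IncidentCount"]
columns = ["srv_cost_eur", "storage_capacity_tb", "num_incidents", "site_id"]

def recommend(concept, columns, top=2):
    """Rank candidate columns for a concept by string similarity."""
    score = lambda a, b: SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return sorted(columns, key=lambda c: score(concept, c), reverse=True)[:top]

for c in concepts:
    print(c, "->", recommend(c, columns))
```

A production recommender would combine such lexical scores with datatype and instance-level evidence, but the confirm-or-reject interaction is the same.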

  11. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.
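One of the secondary objectives, supporting pointing error budgets, typically amounts to rolling up independent disturbance contributions root-sum-square. A minimal sketch with invented budget lines, not values from the paper:

```python
import math

# Independent LOS jitter contributors (µrad RMS; illustrative budget lines)
budget = {"reaction wheels": 1.2, "solar array drive": 0.6, "sensor noise": 0.4}

# Uncorrelated contributors combine root-sum-square
los_jitter = math.sqrt(sum(v * v for v in budget.values()))
print(round(los_jitter, 2))  # 1.4 µrad total
```

The RSS rule holds only for statistically independent contributors; correlated disturbances must be combined through the full integrated model, which is the framework's purpose.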

  12. A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.

    Science.gov (United States)

    Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima

    2017-04-01

    According to the Centers for Disease Control, there are 6.8 million children living with asthma in the United States. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges, we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and the PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
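The mediation layer's core move, rewriting a query over the harmonized schema into per-source queries via logical schema mappings, can be sketched outside Spark in a few lines. Table and field names here are invented for illustration.

```python
# Logical schema mappings: harmonized attribute -> (source table, source column)
mappings = {
    "heart_rate": ("wearable_stream", "hr_bpm"),
    "pm25": ("air_quality", "pm2_5_ugm3"),
    "timestamp": ("wearable_stream", "ts_utc"),
}

def rewrite(select):
    """Rewrite a harmonized SELECT list into one query per source table."""
    plans = {}
    for attr in select:
        table, column = mappings[attr]
        plans.setdefault(table, []).append(f"{column} AS {attr}")
    return {t: f"SELECT {', '.join(cols)} FROM {t}" for t, cols in plans.items()}

print(rewrite(["heart_rate", "timestamp", "pm25"]))
```

The real mediator additionally plans joins across the per-source results and pushes the rewritten fragments down into Spark's batch and streaming engines.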

  13. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Part 7: Cloud-Based Support; Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realising the value of VE. This paper describes a cloud-based data integration framework that can be used to support VE to discover, explore and respond to more emerging ...

  14. Game analysis of product-service integration

    Directory of Open Access Journals (Sweden)

    Heping Zhong

    2014-10-01

    Purpose: This paper aims to define the value creation mechanism and income distribution strategies of product-service integration in order to promote product-service integration within a firm. Design/methodology/approach: This paper quantitatively investigates the coordination mechanism of product-service integration using game theory, and uses the Shapley value and equal growth rate methods to further discuss income distribution strategies of product-service integration. Findings: Product-service integration increases the total income of a firm, and the added income decreases as the unit price demand variation coefficient of products and services increases, as the marginal cost of products increases, and as the marginal cost of services increases. Moreover, the findings suggest that income distribution strategies based on either the Shapley value method or the equal growth rate method can make both the product department and the service department of a firm better off and realize a Pareto improvement. The choice of distribution strategy for coordinating the actions between departments depends on which department plays the dominant role in the firm. Generally speaking, for a firm at the center of a market, when the product department is the main contributor to firm income, the service department will choose the income distribution strategy based on the Shapley value method; when the service department is the main contributor to firm income, it will choose the strategy based on the equal growth rate method. Research limitations/implications: This paper makes some strict assumptions, such as complete information, risk neutrality and linear cost functions, and the discussion is limited to the simple relationship between the product department and the service department. Practical implications: Product
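For two departments, the Shapley value split referred to above reduces to each department's stand-alone income plus half of the integration surplus. A sketch with made-up income figures; the general permutation form is used so it extends beyond two players:

```python
from itertools import permutations

# Characteristic function: income of each coalition (illustrative numbers).
# Integration creates a surplus of 120 - 60 - 40 = 20.
v = {frozenset(): 0, frozenset({"P"}): 60, frozenset({"S"}): 40,
     frozenset({"P", "S"}): 120}

def shapley(players, v):
    """Average marginal contribution of each player over all join orders."""
    phi = dict.fromkeys(players, 0.0)
    orders = list(permutations(players))
    for order in orders:
        seen = set()
        for p in order:
            phi[p] += v[frozenset(seen | {p})] - v[frozenset(seen)]
            seen.add(p)
    return {p: x / len(orders) for p, x in phi.items()}

print(shapley(["P", "S"], v))  # {'P': 70.0, 'S': 50.0}
```

Each department gets its stand-alone income (60 and 40) plus half of the surplus of 20, so both are strictly better off than without integration, which is the Pareto improvement the abstract mentions.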

  15. Multimedia Based E-learning : Design and Integration of Multimedia Content in E-learning

    Directory of Open Access Journals (Sweden)

    Abdulaziz Omar Alsadhan

    2014-05-01

    The advancement in multimedia and information technologies has also impacted the way education is imparted. This advancement has led to the rapid adoption of e-learning systems and has enabled greater integration of multimedia content into them. This paper presents a model for the development of e-learning systems based on multimedia content. The model is called “Multimedia based e-learning” and is loosely based on the waterfall software development model. It consists of three distinct phases: multimedia content modelling, multimedia content development, and multimedia content integration. These three phases are further subdivided into seven activities: analysis, design, technical requirements, content development, content production and integration, implementation, and evaluation. The model defines a general framework that can be applied to the development of e-learning systems across all disciplines and subjects.

  16. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost-benefit analysis of plant-wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost-benefit analysis is then a planning tool which helps the utility to develop a focused long-term implementation strategy that will yield significant near-term benefits. This paper presents a flexible cost-benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper are a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure as applied to a utility. A case study is presented involving a specific utility where this procedure was applied, and that utility's uses of the cost-benefit analysis are also described.

  17. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    Science.gov (United States)

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. Its utility was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using STREAM-N and phosphorus using INCA-P. Both models were first run for baseline conditions and then the effectiveness of changes in land management was simulated. Costs were based on farm income foregone and capital and operational expenditures. The cost and effectiveness data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; it can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
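The optimization step can be sketched as a small linear program: choose adoption levels of mitigation measures that meet the N and P reduction targets at minimum cost. The measures, costs and effectiveness coefficients below are invented; the paper derives its numbers from STREAM-N/INCA-P runs and farm budgets, and solves in Excel rather than SciPy.

```python
from scipy.optimize import linprog

# Columns: buffer strips, reduced fertiliser, constructed wetland (hypothetical)
cost = [120.0, 80.0, 200.0]     # cost per unit adopted
n_removed = [2.0, 3.0, 1.5]     # kg N reduced per unit
p_removed = [0.3, 0.1, 0.5]     # kg P reduced per unit

res = linprog(
    c=cost,                                          # minimise total cost
    A_ub=[[-n for n in n_removed],                   # -N·x <= -30  (N >= 30)
          [-p for p in p_removed]],                  # -P·x <= -4   (P >= 4)
    b_ub=[-30.0, -4.0],
    bounds=[(0, 20)] * 3,                            # adoption limits
    method="highs",
)
print(res.status, [round(x, 2) for x in res.x])
```

As in the paper, the binding constraint differs per pollutant, so the optimal mix is not simply the cheapest measure per kilogram of a single nutrient.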

  18. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables, and (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  19. Integration of thermodynamic insights and MINLP optimisation for the synthesis, design and analysis of process flowsheets

    DEFF Research Database (Denmark)

    Hostrup, Martin; Gani, Rafiqul; Kravanja, Zdravko

    1999-01-01

    This paper presents an integrated approach to the solution of process synthesis, design and analysis problems. Integration is achieved by combining two different techniques, synthesis based on thermodynamic insights and structural optimization together with a simulation engine and a properties pr...

  20. Requirement analysis and architecture of data communication system for integral reactor

    International Nuclear Information System (INIS)

    Jeong, K. I.; Kwon, H. J.; Park, J. H.; Park, H. Y.; Koo, I. S.

    2005-05-01

    When digitalizing the Instrumentation and Control (I and C) systems in Nuclear Power Plants (NPP), a communication network is required for exchanging the digitalized data between I and C equipment in an NPP. A requirements analysis and an analysis of design elements and techniques are required for the design of a communication network. Through the requirements analysis of code and regulation documents such as NUREG/CR-6082, section 7.9 of NUREG-0800, IEEE Standard 7-4.3.2 and IEEE Standard 603, the extracted requirements can be used as a design basis and design concept for the detailed design of a communication network in the I and C system of an integral reactor. Design elements and techniques such as the physical topology, protocol, transmission media and interconnection devices should be considered when designing a communication network. Each design element and technique should be analyzed and evaluated as a portion of the integrated communication network design. In this report, the basic design requirements related to the design of the communication network are investigated using the code and regulation documents, and an analysis of the design elements and techniques is performed. Based on this investigation and analysis, an overall architecture comprising the safety communication network and the non-safety communication network is proposed for an integral reactor.

  1. An expert system for integrated structural analysis and design optimization for aerospace structures

    Science.gov (United States)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first, in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach improves the expressiveness of knowledge representation (especially for structural analysis and design applications), provides the ability to build very large and practical expert systems, and provides an efficient way of storing knowledge. Functional specifications for the expert systems were then developed.
The ORL/AI shell was then used to develop a variety of expert system modules for modeling, finite element analysis, and

  2. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decision-makers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economic-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  3. Integrating fire management analysis into land management planning

    Science.gov (United States)

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...

  4. Evaluating airline energy efficiency: An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure

    International Nuclear Information System (INIS)

    Xu, Xin; Cui, Qiang

    2017-01-01

    This paper focuses on evaluating airline energy efficiency, which is first divided into four stages: Operations Stage, Fleet Maintenance Stage, Services Stage and Sales Stage. The new four-stage network structure of airline energy efficiency is a modification of existing models. A new approach, integrating the Network Epsilon-based Measure and the Network Slacks-based Measure, is applied to assess the overall energy efficiency and divisional efficiency of 19 international airlines from 2008 to 2014. The influencing factors of airline energy efficiency are analyzed through regression analysis. The results indicate the following: 1. The integrated model can identify the benchmarking airlines in the overall system and in each stage. 2. Most airlines' energy efficiencies remain steady over the period, apart from some sharp fluctuations. The efficiency decreases are mainly concentrated in 2008–2011, reflecting the financial crisis in the USA. 3. The average age of the fleet is positively correlated with overall energy efficiency, and each divisional efficiency has different significant influencing factors. - Highlights: • An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure is developed. • 19 airlines' energy efficiencies are evaluated. • Garuda Indonesia has the highest overall energy efficiency.

  5. Urban water metabolism efficiency assessment: integrated analysis of available and virtual water.

    Science.gov (United States)

    Huang, Chu-Long; Vause, Jonathan; Ma, Hwong-Wen; Yu, Chang-Ping

    2013-05-01

    Resolving the complex environmental problems of water pollution and shortage which occur during urbanization requires the systematic assessment of urban water metabolism efficiency (WME). While previous research has tended to focus on either available or virtual water metabolism, here we argue that the systematic problems arising during urbanization require an integrated assessment of available and virtual WME, using an indicator system based on material flow analysis (MFA) results. Future research should focus on the following areas: 1) analysis of available and virtual water flow patterns and processes through urban districts in different urbanization phases in years with varying amounts of rainfall, and their environmental effects; 2) based on the optimization of social, economic and environmental benefits, establishment of an indicator system for urban WME assessment using MFA results; 3) integrated assessment of available and virtual WME in districts with different urbanization levels, to facilitate study of the interactions between the natural and social water cycles; 4) analysis of mechanisms driving differences in WME between districts with different urbanization levels, and the selection of dominant social and economic driving indicators, especially those impacting water resource consumption. Combinations of these driving indicators could then be used to design efficient water resource metabolism solutions, and integrated management policies for reduced water consumption. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Integrating neural network technology and noise analysis

    International Nuclear Information System (INIS)

    Uhrig, R.E.; Oak Ridge National Lab., TN

    1995-01-01

    The integrated use of neural network and noise analysis technologies offers advantages not available through the use of either technology alone. The application of neural network technology to noise analysis offers an opportunity to expand the scope of problems where noise analysis is useful, and there are unique ways in which the integration of these technologies can be used productively. The two-sensor technique, in which the responses of two sensors to an unknown driving source are related, is used to demonstrate such integration. The relationship between the power spectral densities (PSDs) of accelerometer signals is derived theoretically using noise analysis to demonstrate its uniqueness. This relationship is modeled from experimental data using a neural network when the system is working properly, and the actual PSD of one sensor is compared with the PSD of that sensor predicted by the neural network using the PSD of the other sensor as an input. A significant deviation between the actual and predicted PSDs indicates that the system is changing (i.e., failing). Experiments carried out on check valves and bearings illustrate the usefulness of the methodology developed. (Author)
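The two-sensor scheme can be sketched with a frequency-wise ratio standing in for the paper's neural network (an admitted simplification): learn the PSD relationship while the system is healthy, then flag a failure when the observed PSD departs from the prediction. Signals and transfer paths below are synthetic.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)

def sensor_pair(gain2=1.0, n=16384, fs=1000.0):
    """Two sensors observing one unknown driving source through different paths."""
    drive = rng.standard_normal(n)                        # unknown source
    s1 = np.convolve(drive, [1.0, 0.5], "same")           # path to sensor 1
    s2 = gain2 * np.convolve(drive, [0.8, -0.3], "same")  # path to sensor 2
    _, p1 = welch(s1, fs=fs, nperseg=1024)
    _, p2 = welch(s2, fs=fs, nperseg=1024)
    return p1, p2

p1_h, p2_h = sensor_pair()              # healthy reference data
ratio = p2_h / p1_h                     # learned PSD relationship
p1_t, p2_t = sensor_pair(gain2=2.0)     # degraded: transmission path changed
deviation = float(np.mean(np.abs(p2_t - ratio * p1_t) / (ratio * p1_t)))
print(deviation > 0.5)  # large relative deviation flags the change
```

Replacing the per-bin ratio with a trained network, as the paper does, lets the learned relationship capture nonlinear and cross-frequency structure that a simple ratio cannot.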

  7. Driving pattern analysis of Nordic region based on the national travel surveys for electric vehicle integration

    DEFF Research Database (Denmark)

    Liu, Zhaoxi; Wu, Qiuwei; Christensen, Linda

    2015-01-01

    EVs show great potential to cope with the intermittency of renewable energy sources (RES) and to provide the demand-side flexibility required by the smart grid. On the other hand, EVs will increase electricity consumption, and large-scale integration of EVs will probably have substantial impacts on the power system. This paper presents a methodology to transform the driving behavior of persons into that of cars in order to analyze the driving pattern of electric vehicles (EVs) based on the National Travel Surveys. In the proposed methodology, a statistical process is used to obtain the driving behavior of cars by grouping the survey respondents according to the driving license number and car number and mapping the households with similar characteristics. The proposed methodology was used to carry out a driving pattern analysis in the Nordic region. The detailed driving requirements and the charging

  8. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such a failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe, as it may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under the TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility.
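The consistency condition at the heart of the TFM can be phrased as a one-line check: the logic frame's belief about each monitored variable must match the physical frame. The state names below are invented for illustration.

```python
def consistent(logic_frame, physical_frame):
    """True when the logic frame's beliefs match the physical state."""
    return all(logic_frame.get(k) == v for k, v in physical_frame.items())

physical = {"valve": "open", "pressure": "high"}
logic = {"valve": "closed", "pressure": "high"}   # stale belief about the valve

print(consistent(logic, physical))  # False -> decision-based error possible
```

FFTA/FETA then explore what decisions the controller would take from the (wrong) logic frame while the physical frame evolves on its own, which is exactly the competing-process hazard the abstract describes.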

  9. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated as to whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, presenting the functional definition, architecture and effectiveness of the DERAT through test results.

  10. Analysis of e-learning implementation readiness based on integrated elr model

    Science.gov (United States)

    Adiyarta, K.; Napitupulu, D.; Rahim, R.; Abdullah, D.; Setiawan, MI

    2018-04-01

    E-learning nowadays has become a requirement for institutions to support their learning activities. To adopt e-learning, an institution requires a sound strategy and substantial resources for optimal application. Unfortunately, not all institutions that have used e-learning have achieved the desired results or expectations. This study aims to identify the level of readiness for e-learning implementation in institution X. The degree of institutional readiness will determine the success of future e-learning utilization. In addition, institutional readiness measurement is needed to evaluate the effectiveness of strategies in e-learning development. The research method used is a survey, with a questionnaire designed based on the integration of 8 best-practice ELR (e-learning readiness) models. The results showed that of the 13 factors of the integrated ELR model being measured, 3 readiness factors fall into the category of not ready and needing a lot of work: human resources (2.57), technology skills (2.38) and content (2.41). Overall, e-learning implementation in the institution is in the category of not ready but needing some work (3.27). Therefore, the institution should consider which ELR factors are still not ready and need improvement in the future.
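
    The factor scores above can be mapped to readiness bands with a simple rule. A minimal sketch, assuming the common 1-5 Likert scale and the 3.4 readiness cut-off with four bands, as used in several ELR models (the exact thresholds of the integrated model are an assumption here):

```python
# Readiness banding for ELR factor scores. Assumed 1-5 Likert scale and
# the widely used cut-offs (2.6 / 3.4 / 4.2); the integrated model's own
# thresholds may differ.

def readiness_category(mean_score: float) -> str:
    """Map a factor's mean Likert score to an ELR readiness band."""
    if mean_score < 2.6:
        return "not ready, needs a lot of work"
    if mean_score < 3.4:
        return "not ready, needs some work"
    if mean_score < 4.2:
        return "ready, but needs a few improvements"
    return "ready, go ahead"

# scores reported in the abstract
factor_scores = {"human resources": 2.57, "technology skills": 2.38, "content": 2.41}
for factor, score in factor_scores.items():
    print(f"{factor}: {score} -> {readiness_category(score)}")
```

    With these assumed bands, the reported factor scores of 2.57, 2.38 and 2.41 land in "not ready, needs a lot of work" and the aggregate 3.27 in "not ready, needs some work", matching the categories in the abstract.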

  11. Environmental science applications with Rapid Integrated Mapping and analysis System (RIMS)

    Science.gov (United States)

    Shiklomanov, A.; Prusevich, A.; Gordov, E.; Okladnikov, I.; Titov, A.

    2016-11-01

    The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.

  12. Energy and carbon emissions analysis and prediction of complex petrochemical systems based on an improved extreme learning machine integrated interpretative structural model

    International Nuclear Information System (INIS)

    Han, Yongming; Zhu, Qunxiong; Geng, Zhiqiang; Xu, Yuan

    2017-01-01

    Highlights: • The ELM integrated ISM (ISM-ELM) method is proposed. • The proposed method is more efficient and accurate than the ELM on the UCI data sets. • Energy and carbon emissions analysis and prediction of petrochemical industries based on ISM-ELM is obtained. • The proposed method is valid in improving energy efficiency and reducing carbon emissions of ethylene plants. - Abstract: Energy saving and carbon emissions reduction in the petrochemical industry are affected by many factors. Thus, it is difficult to analyze and optimize the energy use of complex petrochemical systems accurately. This paper proposes an energy and carbon emissions analysis and prediction approach based on an improved extreme learning machine (ELM) integrated with an interpretative structural model (ISM), denoted ISM-ELM. The ISM based on the partial correlation coefficient is utilized to analyze key parameters that affect the energy and carbon emissions of the complex petrochemical system, and can denoise and reduce the dimensions of the data to decrease the training time and errors of the ELM prediction model. Meanwhile, in terms of model accuracy and training time, the robustness and effectiveness of the ISM-ELM model are better than those of the ELM, as verified on standard data sets from the University of California Irvine (UCI) repository. Moreover, a multi-input and single-output (MISO) model of the energy and carbon emissions of complex ethylene systems is established based on the ISM-ELM. Finally, detailed analyses and simulations using real ethylene plant data demonstrate the effectiveness of the ISM-ELM and can guide the improvement direction of energy saving and carbon emissions reduction in complex petrochemical systems.
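
    The ELM component at the core of the approach trains in a single least-squares step: hidden-layer weights are drawn at random and only the output weights are solved for. A generic sketch of that idea on a toy function (the paper's ISM preprocessing, network sizes and plant data are not reproduced here):

```python
# Minimal extreme learning machine (ELM) regressor: random hidden layer,
# output weights solved in one least-squares step. Toy illustration only.
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer outputs
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# toy usage: learn y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
model = elm_fit(X, y)
err = np.max(np.abs(elm_predict(model, X) - y))
```

    The one-shot `lstsq` solve is what makes ELM training fast compared with iterative backpropagation; the ISM step in the paper additionally prunes and denoises the input dimensions before this fit.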

  13. atBioNet – an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Background: Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results: atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied proteins/genes list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion: atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http

  14. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied proteins/genes list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
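
    The pathway enrichment step described above is conventionally a hypergeometric test on the overlap between a functional module and a pathway gene set. A sketch of that scoring (gene names and the background size are illustrative; this is the standard test, not atBioNet's exact implementation):

```python
# Module-vs-pathway enrichment: score the overlap between a module and a
# pathway gene set with a one-sided hypergeometric p-value. Gene names
# below are made up for illustration.
from math import comb

def hypergeom_pvalue(N, K, n, k):
    """P(overlap >= k) when drawing n genes from N, of which K are pathway genes."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

N = 20000                    # genes in the background
pathway = set("ABCDEFGHIJ")  # 10 pathway genes (illustrative)
module = set("ABCDEXYZ")     # 8-gene module; 5 genes overlap the pathway
k = len(pathway & module)
p = hypergeom_pvalue(N, len(pathway), len(module), k)
```

    A tiny p-value here means a 5-of-8 overlap against a 20,000-gene background is far beyond chance, which is what flags the pathway as enriched for the module.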

  15. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified in three respects: (a) the patient, (b) the healthcare professional, and (c) the healthcare organization. This concept analysis helps to better understand the characteristics of team-based care in clinical practice and promotes the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  16. Integral equation based stability analysis of short wavelength drift modes in tokamaks

    International Nuclear Information System (INIS)

    Hirose, A.; Elia, M.

    2003-01-01

    Linear stability of electron skin-size drift modes in collisionless tokamak discharges has been investigated in terms of electromagnetic, kinetic integral equations in which neither ions nor electrons are assumed to be adiabatic. A slab-like ion temperature gradient mode persists in such a short wavelength regime. However, toroidicity has a strong stabilizing influence on this mode. In the electron branch, the toroidicity induced skin-size drift mode previously predicted in terms of local kinetic analysis has been recovered. The mode is driven by positive magnetic shear and strongly stabilized for negative shear. The corresponding mixing length anomalous thermal diffusivity exhibits favourable isotope dependence. (author)

  17. Russian and Foreign Experience of Integration of Agent-Based Models and Geographic Information Systems

    Directory of Open Access Journals (Sweden)

    Konstantin Anatol’evich Gulin

    2016-11-01

    The article provides an overview of the mechanisms of integration of agent-based models and GIS technology developed by Russian and foreign researchers. The basic framework of the article is based on a critical analysis of domestic and foreign literature (monographs, scientific articles). The study is based on the application of universal scientific research methods: the system approach, analysis and synthesis, classification, systematization and grouping, generalization and comparison. The article presents the theoretical and methodological bases of the integration of agent-based models and geographic information systems. The concept and essence of agent-based models are explained; their main advantages (compared to other modeling methods) are identified. The paper characterizes the operating environment of agents as a key concept in the theory of agent-based modeling. It is shown that geographic information systems have a wide range of information resources for calculations, searching, and modeling of the real world in various aspects, acting as an effective tool for displaying the agents' operating environment and making it possible to bring the model as close as possible to real conditions. The authors also focus on the wide range of possibilities for various researches in different spatial and temporal contexts. A comparative analysis of platforms supporting the integration of agent-based models and geographic information systems has been carried out. The authors give examples of complex socio-economic models: the model of a creative city and a humanitarian assistance model. In the absence of standards for describing research results, the authors focus on such elements of the models as the characteristics of the agents and their operating environment, the agents' behavior, and the rules of interaction between the agents and the external environment. The paper describes the possibilities and prospects of implementing these models.

  18. Case for integral core-disruptive accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.; Bell, C.R.

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included.

  19. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
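
    The heat cascade calculation underlying pinch analysis can be sketched in a few lines: stream temperatures are shifted by dTmin/2, net heat is balanced over each temperature interval, and the largest cascade deficit sets the minimum hot-utility target. The stream data below are a classic textbook-style illustration, not the SMR plant's (an assumption of this sketch):

```python
# Problem-table / heat-cascade step at the core of pinch analysis.
# streams: (T_supply, T_target, CP); hot streams cool (Ts > Tt),
# cold streams heat (Ts < Tt). Temperatures in C, CP in MW/K.

def pinch_targets(streams, dt_min=10.0):
    shifted = []
    for ts, tt, cp in streams:
        shift = -dt_min / 2 if ts > tt else dt_min / 2   # hot shifted down, cold up
        shifted.append((ts + shift, tt + shift, cp))
    temps = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(temps, temps[1:]):
        net = 0.0
        for ts, tt, cp in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if top >= hi and bot <= lo:                  # stream spans the interval
                net += cp * (hi - lo) * (1 if ts > tt else -1)
        cascade.append(cascade[-1] + net)
    q_hot = max(0.0, -min(cascade))   # minimum hot utility: largest cascade deficit
    q_cold = cascade[-1] + q_hot      # minimum cold utility
    return q_hot, q_cold

# classic four-stream example (illustrative, not the SMR flowsheet)
streams = [(250, 40, 0.15), (200, 80, 0.25), (20, 180, 0.20), (140, 230, 0.30)]
q_hot, q_cold = pinch_targets(streams, dt_min=10.0)
```

    For this textbook stream set the cascade gives minimum utility targets of 7.5 MW hot and 10 MW cold; in the paper the same targeting logic is what exposes how much of the SMR's high-temperature demand must come from combustion rather than recovery.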

  20. Integrated Reliability and Risk Analysis System (IRRAS), Version 2.5: Reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1991-03-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.5 and is the subject of this Reference Manual. Version 2.5 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 7 refs., 348 figs

  1. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  2. Identifying novel glioma associated pathways based on systems biology level meta-analysis.

    Science.gov (United States)

    Hu, Yangfan; Li, Jinquan; Yan, Wenying; Chen, Jiajia; Li, Yin; Hu, Guang; Shen, Bairong

    2013-01-01

    With recent advances in microarray technology, including genomics, proteomics, and metabolomics, integrating these "-omics" data to analyze complex diseases poses a great challenge. Glioma is an extremely aggressive and lethal form of brain tumor, and thus the study of the molecular mechanisms underlying glioma remains very important. To date, most studies focus on detecting the differentially expressed genes in glioma. However, meta-analysis for pathway analysis based on multiple microarray datasets has not been systematically pursued. In this study, we therefore developed a systems biology based approach by integrating three types of omics data to identify common pathways in glioma. First, a meta-analysis was performed to study the overlap of signatures at different levels based on the microarray gene expression data of glioma. Among these gene expression datasets, 12 pathways were found in the GeneGO database that were shared by four stages. Then, microRNA expression profiles and ChIP-seq data were integrated for further pathway enrichment analysis. As a result, we suggest that 5 of these pathways could serve as putative pathways in glioma. Among them, the pathway of TGF-beta-dependent induction of EMT via SMAD is of particular importance. Our results demonstrate that meta-analysis at the systems biology level provides a more useful approach to studying the molecular mechanisms of complex disease. The integration of different types of omics data, including gene expression microarrays, microRNA and ChIP-seq data, suggests some common pathways correlated with glioma. These findings will offer useful potential candidates for targeted therapeutic intervention in glioma.

  3. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for the scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with an X-window system. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module that utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that includes the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists mostly of LP generation and multicycle analysis. (author)

  4. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

    Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem, in order to: 1) demonstrate that it eliminates the present costly process of evaluating time integrator for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology for examination of the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)
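
    The step-size/stiffness interaction at the heart of this evaluation is easiest to see on the linearized test equation y' = λy: an explicit integrator's amplification factor grows past 1 once the step crosses a stiffness-dependent limit, while a stiffly-stable implicit method stays bounded for any step. A generic sketch with forward vs. backward Euler (an illustration of the local stability idea, not the paper's evaluation procedure):

```python
# Amplification factors on the linear test equation y' = lam * y.
# Forward Euler: y_{n+1} = (1 + h*lam) y_n  -> stable only if |1 + h*lam| <= 1,
# i.e. h <= 2/|lam| for real lam < 0.
# Backward Euler: y_{n+1} = y_n / (1 - h*lam) -> |factor| < 1 for any h when lam < 0.

def forward_euler_amp(h, lam):
    return abs(1 + h * lam)

def backward_euler_amp(h, lam):
    return abs(1 / (1 - h * lam))

lam = -100.0                        # a stiff, strongly damped mode
stable_h, unstable_h = 0.01, 0.05   # below and above the 2/|lam| = 0.02 limit
```

    In a nonlinear structure λ plays the role of a local eigenvalue of the tangent stiffness, so the admissible step changes as the response evolves, which is exactly the interaction the evaluation technique quantifies.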

  5. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

    Highlights: • Solar energy integrated with the natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analyses of the two processes are conducted. • The proposed process can cut down greenhouse gas emissions. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as the raw material has a short processing route and well-developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar energy power generation produces electricity for the reforming unit and system consumption in the solar energy integrated natural-gas-to-methanol system (SGTM). Performance analyses of the conventional natural-gas-to-methanol process and of solar energy integrated with the natural-gas-to-methanol process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that solar energy integrated with the natural-gas-to-methanol process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing the consumption of natural gas by an amount equal to 9.2% of the total consumed natural gas. However, it is not economical at the current technology readiness level compared with the conventional natural-gas-to-methanol process.

  6. Fostering integrity in postgraduate research: an evidence-based policy and support framework.

    Science.gov (United States)

    Mahmud, Saadia; Bretag, Tracey

    2014-01-01

    Postgraduate research students have a unique position in the debate on integrity in research as students and novice researchers. To assess how far policies for integrity in postgraduate research meet the needs of students as "research trainees," we reviewed online policies for integrity in postgraduate research at nine Australian universities against the Australian Code for Responsible Conduct of Research (the Code) and the five core elements of exemplary academic integrity policy identified by Bretag et al. (2011), i.e., access, approach, responsibility, detail, and support. We found inconsistency with the Code in the definition of research misconduct and a lack of adequate detail and support. Based on our analysis, previous research, and the literature, we propose a framework for policy and support for postgraduate research that encompasses a consistent and educative approach to integrity, maintained across the university at all levels of scholarship and for all stakeholders.

  7. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to determine its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. A transient flow phenomenon occurs in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high-temperature conditions. The peak pressure head is consistent with that under room temperature and the pressure fluctuation period is longer than that under room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with an increase in the valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.

  8. Environmental sustainable decision making – The need and obstacles for integration of LCA into decision analysis

    DEFF Research Database (Denmark)

    Dong, Yan; Miraglia, Simona; Manzo, Stefano

    2018-01-01

    systems, revealing potential problem shifting between life cycle stages. Through the integration with traditional risk based decision analysis, LCA may thus facilitate a better informed decision process. In this study we explore how environmental impacts are taken into account in different fields......Decision analysis is often used to help decision makers choose among alternatives, based on the expected utility associated to each alternative as function of its consequences and potential impacts. Environmental impacts are not always among the prioritized concerns of traditional decision making...... of interest for decision makers to identify the need, potential and obstacles for integrating LCA into conventional approaches to decision problems. Three application areas are used as examples: transportation planning, flood management, and food production and consumption. The analysis of these cases shows...

  9. An integrated acquisition, display, and analysis system

    International Nuclear Information System (INIS)

    Ahmad, T.; Huckins, R.J.

    1987-01-01

    The design goal of the ND9900/Genuie was to integrate a high performance data acquisition and display subsystem with a state-of-the-art 32-bit supermicrocomputer. This was achieved by integrating a Digital Equipment Corporation MicroVAX II CPU board with acquisition and display controllers via the Q-bus. The result is a tightly coupled processing and analysis system for Pulse Height Analysis and other applications. The system architecture supports distributed processing, so that acquisition and display functions are semi-autonomous, making the VAX concurrently available for applications programs.

  10. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models

    NARCIS (Netherlands)

    van Elburg, R.A.J.; van Ooyen, A.

    2009-01-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on
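
    The event-based idea behind the Carnevale-Hines scheme is that between synaptic events the subthreshold state has a closed form, so it can be advanced exactly in one jump rather than by many small time steps. A sketch for a leaky membrane driven by a single exponentially decaying excitatory current, dV/dt = -V/τ_m + I and dI/dt = -I/τ_s (parameters illustrative; the published scheme additionally handles the double-exponential inhibitory current and threshold-crossing events):

```python
# Exact event-to-event update for a leaky integrate-and-fire membrane
# with one exponential synaptic current (tau_m != tau_s assumed).
from math import exp

def advance(v0, i0, dt, tau_m=20.0, tau_s=5.0):
    """Exact state (v, i) after dt ms with no synaptic events in between."""
    a = i0 / (1.0 / tau_m - 1.0 / tau_s)        # particular-solution amplitude
    v = (v0 - a) * exp(-dt / tau_m) + a * exp(-dt / tau_s)
    return v, i0 * exp(-dt / tau_s)

# one 10 ms jump equals two 5 ms jumps: the update is the exact flow, so
# irregular inter-event intervals cost no accuracy
v1, i1 = advance(0.0, 1.0, 10.0)
va, ia = advance(*advance(0.0, 1.0, 5.0), 5.0)
```

    Because the update is exact, the integration error is independent of the spacing of events; the constraints the abstract refers to concern the synaptic time-constant assumptions the closed form requires, not step-size accuracy.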

  11. Generalization of the Event-Based Carnevale-Hines Integration Scheme for Integrate-and-Fire Models

    NARCIS (Netherlands)

    van Elburg, Ronald A. J.; van Ooyen, Arjen

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  12. Computerized integrated data base production system (COMPINDAS)

    International Nuclear Information System (INIS)

    Marek, D.; Buerk, K.

    1990-05-01

    Based on many years of experience, and with the main objective of guaranteeing long-term database quality and efficient input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, and control and update procedures; support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis of ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines, including masks. (author). 7 figs

  13. Computerized integrated data base production system (COMPINDAS)

    Energy Technology Data Exchange (ETDEWEB)

    Marek, D; Buerk, K [Fachinformationszentrum Karlsruhe, Gesellschaft fuer Wissenschaftlich-Technische Information mbH, Eggenstein-Leopoldshafen (Germany)

    1990-05-01

    Based on many years of experience, and with the main objective in mind to guarantee long-term database quality and efficiency of input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, and control and update procedures; support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis of ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines including masks. (author). 7 figs.

  14. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    Sevillano, M.G.; Garrido, I.; Garrido, A.J.

    2011-01-01

    The exponential growth in energy consumption has led to a renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, which has given rise to an unprecedented interest in solving the different control problems existing in nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool in order to ease the development of suitable controllers for Tokamaks. As a demonstrative case study to show the feasibility and effectiveness of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents an original and innovative work in the Tokamak control area and it provides new possibilities for the development and application of advanced control schemes to the standardized and widely extended ASTRA transport code for Tokamaks. -- Highlights: → The paper presents a useful tool for rapid prototyping of different solutions to deal with the control problems arising in Tokamaks. → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software. → This allows testing and combining diverse control schemes in a unified way considering the ASTRA as the plant of the system. → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.
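
An anti-windup PID loop of the kind mentioned can be illustrated with back-calculation: when the actuator saturates, the difference between the saturated and raw outputs bleeds the integrator so it cannot wind up. This is a generic sketch, not the controller from the paper (gains, limits and step values are made up):

```python
def pid_aw_step(err, integral, prev_err, kp, ki, kd, kt, dt, u_min, u_max):
    """One step of a PID controller with back-calculation anti-windup.
    kt is the anti-windup gain; kt = 0 disables the correction."""
    deriv = (err - prev_err) / dt
    u_raw = kp * err + ki * integral + kd * deriv
    u_sat = min(max(u_raw, u_min), u_max)
    # Bleed the integrator by the saturation excess so it cannot wind up.
    integral += (err + kt * (u_sat - u_raw)) * dt
    return u_sat, integral, err

# A large error against a tight actuator limit: the integrator grows far
# more slowly with anti-windup (kt = 1) than without (kt = 0).
u, i_aw, _ = pid_aw_step(10.0, 0.0, 10.0, 1.0, 1.0, 0.0, 1.0, 0.1, -1.0, 1.0)
```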

  15. Facilitating Integration of Electron Beam Lithography Devices with Interactive Videodisc, Computer-Based Simulation and Job Aids.

    Science.gov (United States)

    Von Der Linn, Robert Christopher

    A needs assessment of the Grumman E-Beam Systems Group identified the requirement for additional skill mastery for the engineers who assemble, integrate, and maintain devices used to manufacture integrated circuits. Further analysis of the tasks involved led to the decision to develop interactive videodisc, computer-based job aids to enable…

  16. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment, which includes both database and functional capabilities. Consideration and analysis of the data types and required data manipulation capabilities, as well as operational requirements, resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially providing the necessary flexibility for different data types and extensibility potential. A detailed design of a data model was produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  17. Decision making based on data analysis and optimization algorithm applied for cogeneration systems integration into a grid

    Science.gov (United States)

    Asmar, Joseph Al; Lahoud, Chawki; Brouche, Marwan

    2018-05-01

    Cogeneration and trigeneration systems can contribute to the reduction of primary energy consumption and greenhouse gas emissions in residential and tertiary sectors, by reducing fossil fuel demand and grid losses with respect to conventional systems. Cogeneration systems are characterized by very high energy efficiency (80 to 90%) and lower emissions compared to conventional energy production. The integration of these systems into the energy network must simultaneously take into account their economic and environmental challenges. In this paper, a decision-making strategy is introduced in two parts: the first is based on a multi-objective optimization tool with data analysis, and the second is based on an optimization algorithm. The power dispatching of the Lebanese electricity grid is then simulated and considered as a case study in order to prove the compatibility of the cogeneration power calculated by our decision-making technique. In addition, the thermal energy produced by the cogeneration systems whose capacity is selected by our technique is compatible with the thermal demand for district heating.

  18. Simulation, integration, and economic analysis of gas-to-liquid processes

    International Nuclear Information System (INIS)

    Bao, Buping; El-Halwagi, Mahmoud M.; Elbashir, Nimir O.

    2010-01-01

    Gas-to-liquid (GTL) involves the chemical conversion of natural gas into synthetic crude that can be upgraded and separated into different useful hydrocarbon fractions including liquid transportation fuels. Such technology can also be used to convert other abundant natural resources such as coal and biomass to fuels and value-added chemicals (referred to as coal-to-liquid (CTL) and biomass-to-liquid (BTL)). A leading GTL technology is the Fischer-Tropsch (FT) process. The objective of this work is to provide a techno-economic analysis of the GTL process and to identify optimization and integration opportunities for cost saving and reduction of energy usage while accounting for the environmental impact. First, a base-case flowsheet is synthesized to include the key processing steps of the plant. Then, a computer-aided process simulation is carried out to determine the key mass and energy flows, performance criteria, and equipment specifications. Next, energy and mass integration studies are performed to address the following items: (a) heating and cooling utilities, (b) combined heat and power (process cogeneration), (c) management of process water, (d) optimization of tail gas allocation, and (e) recovery of catalyst-supporting hydrocarbon solvents. These integration studies are conducted and the results are documented in terms of conserving energy and mass resources as well as their economic impact. Finally, an economic analysis is undertaken to determine the plant capacity needed to achieve the break-even point and to estimate the return on investment for the base-case study. (author)
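
The break-even and return-on-investment figures mentioned reduce to simple relations, which can be sketched as follows (all numbers are placeholders for illustration, not results from the study):

```python
def breakeven_capacity(fixed_cost, unit_price, unit_var_cost):
    """Annual throughput at which revenue covers fixed plus variable cost."""
    return fixed_cost / (unit_price - unit_var_cost)

def roi(annual_revenue, annual_cost, capital_invested):
    """Simple (undiscounted) annual return on investment."""
    return (annual_revenue - annual_cost) / capital_invested

# E.g. $100M/yr fixed cost, $60/bbl product price, $35/bbl variable cost:
capacity = breakeven_capacity(100e6, 60.0, 35.0)   # barrels per year
```

A full techno-economic analysis would of course discount cash flows over the plant lifetime; this only shows the arithmetic behind the two headline numbers.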

  19. PHIDIAS- Pathogen Host Interaction Data Integration and Analysis

    Indian Academy of Sciences (India)

    PHIDIAS- Pathogen Host Interaction Data Integration and Analysis- allows searching of integrated genome sequences, conserved domains and gene expressions data related to pathogen host interactions in high priority agents for public health and security ...

  20. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  1. Advanced image based methods for structural integrity monitoring: Review and prospects

    Science.gov (United States)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics brought about a paradigm change in phenomena sensing. Hence, several widely applicable optical approaches are playing a significant role in support of experimentation. The current review manuscript describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
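
At the core of DIC is matching a subset of the reference image against the deformed image, typically by maximizing a zero-normalized cross-correlation (ZNCC) score. A minimal one-dimensional sketch of that criterion (real DIC uses 2-D subsets, subpixel interpolation and shape functions):

```python
def zncc(f, g):
    """Zero-normalized cross-correlation of two equal-length patches;
    +1 means a perfect match up to brightness/contrast changes."""
    n = len(f)
    mf, mg = sum(f) / n, sum(g) / n
    num = sum((a - mf) * (b - mg) for a, b in zip(f, g))
    den = (sum((a - mf) ** 2 for a in f) * sum((b - mg) ** 2 for b in g)) ** 0.5
    return num / den

# Invariance to affine intensity change, the reason ZNCC is favoured:
score = zncc([1.0, 2.0, 3.0, 4.0], [12.0, 14.0, 16.0, 18.0])  # 1.0
```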

  2. An integrated one-chip-sensor system for microRNA quantitative analysis based on digital droplet polymerase chain reaction

    Science.gov (United States)

    Tsukuda, Masahiko; Wiederkehr, Rodrigo Sergio; Cai, Qing; Majeed, Bivragh; Fiorini, Paolo; Stakenborg, Tim; Matsuno, Toshinobu

    2016-04-01

    A silicon microfluidic chip was developed for microRNA (miRNA) quantitative analysis. It performs sequentially reverse transcription and polymerase chain reaction in a digital droplet format. Individual processes take place on different cavities, and reagent and sample mixing is carried out on a chip, prior to entering each compartment. The droplets are generated on a T-junction channel before the polymerase chain reaction step. Also, a miniaturized fluorescence detector was developed, based on an optical pick-up head of digital versatile disc (DVD) and a micro-photomultiplier tube. The chip integrated in the detection system was tested using synthetic miRNA with known concentrations, ranging from 300 to 3,000 templates/µL. Results proved the functionality of the system.
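
Absolute quantification in digital droplet PCR follows from Poisson statistics: if a fraction p of droplets is positive, the mean number of templates per droplet is λ = −ln(1 − p). A sketch of that standard calculation (droplet volume and counts are placeholders, not values from this chip):

```python
import math

def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul):
    """Template concentration from digital droplet counts.
    lambda = -ln(1 - p) is the Poisson mean number of copies per droplet."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / droplet_volume_ul

# 500 of 1000 droplets positive, 0.85 nL (8.5e-4 uL) droplets:
conc = ddpcr_copies_per_ul(500, 1000, 8.5e-4)
```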

  3. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
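
The key idea of the record, choosing the transformation by maximum profile likelihood rather than trial-and-error, can be sketched for the one-parameter Box-Cox family; a full implementation would embed this selection inside the PCA fit, as the paper does (this sketch only handles a single variable):

```python
import math

def boxcox(x, lam):
    """Box-Cox transform; the lam -> 0 limit is the log transform."""
    if abs(lam) < 1e-12:
        return [math.log(v) for v in x]
    return [(v ** lam - 1.0) / lam for v in x]

def profile_loglik(x, lam):
    """Profile log-likelihood of lam under a normal model (constants dropped)."""
    n = len(x)
    y = boxcox(x, lam)
    mu = sum(y) / n
    var = sum((v - mu) ** 2 for v in y) / n
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in x)

def best_lambda(x, grid):
    """Grid-search the transformation parameter before (or within) PCA."""
    return max(grid, key=lambda lam: profile_loglik(x, lam))
```

For log-normally distributed data this procedure recovers a lambda near zero, i.e. the log transform, without any prior knowledge.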

  4. IMP: Integrated method for power analysis

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    An integrated, easy to use, economical package of microcomputer programs has been developed which can be used by small hydro developers to evaluate potential sites for small scale hydroelectric plants in British Columbia. The programs enable evaluation of sites located far from the nearest stream gauging station, for which streamflow data are not available. For each of the province's 6 hydrologic regions, a streamflow record for one small watershed is provided in the data base. The program can then be used to generate synthetic streamflow records and to compare results obtained by the modelling procedure with the actual data. The program can also be used to explore the significance of modelling parameters and to develop a detailed appreciation for the accuracy which can be obtained under various circumstances. The components of the program are an atmospheric model of precipitation; a watershed model that will generate a continuous series of streamflow data, based on information from the atmospheric model; a flood frequency analysis system that uses site-specific topographic data plus information from the atmospheric model to generate a flood frequency curve; a hydroelectric power simulation program which determines daily energy output for a run-of-river or reservoir storage site based on selected generation facilities and the time series generated in the watershed model; and a graphic analysis package that provides direct visualization of data and modelling results. This report contains a description of the programs, a user guide, the theory behind the model, the modelling methodology, and results from a workshop that reviewed the program package. 32 refs., 16 figs., 18 tabs.
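
The hydroelectric power simulation component described above determines daily energy output from streamflow; the core relation is P = ρgQHη. A minimal sketch with illustrative numbers (not taken from the IMP package):

```python
RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def daily_energy_kwh(flow_m3s, head_m, efficiency):
    """Daily run-of-river energy: P = rho * g * Q * H * eta, over 24 h."""
    power_w = RHO_WATER * G * flow_m3s * head_m * efficiency
    return power_w * 24.0 / 1000.0

# 2 m^3/s through 15 m of head at 85% overall efficiency:
energy = daily_energy_kwh(2.0, 15.0, 0.85)   # kWh for the day
```

Run over a synthetic streamflow series, this step yields the daily energy time series the package uses to compare candidate generation facilities.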

  5. Abel integral equations analysis and applications

    CERN Document Server

    Gorenflo, Rudolf

    1991-01-01

    In many fields of application of mathematics, progress is crucially dependent on the good flow of information between (i) theoretical mathematicians looking for applications, (ii) mathematicians working in applications in need of theory, and (iii) scientists and engineers applying mathematical models and methods. The intention of this book is to stimulate this flow of information. In the first three chapters (accessible to third year students of mathematics and physics and to mathematically interested engineers) applications of Abel integral equations are surveyed broadly including determination of potentials, stereology, seismic travel times, spectroscopy, optical fibres. In subsequent chapters (requiring some background in functional analysis) mapping properties of Abel integral operators and their relation to other integral transforms in various function spaces are investigated, questions of existence and uniqueness of solutions of linear and nonlinear Abel integral equations are treated, and for equatio...

  6. Development of integrated cask body and base plate

    International Nuclear Information System (INIS)

    Sasaki, T.; Koyama, Y.; Yoshida, T.; Wada, T.

    2015-01-01

    The average occupancy of stored spent fuel in nuclear power plants has reached 70 percent, and it is anticipated that demand for metal casks for the storage and transportation of spent fuel will rise after operations resume. The main part of a metal cask consists of the main body, neutron shield and external cylinder. We have developed the manufacturing technology of the Integrated Cask Body and Base Plate by integrating the cask body and base plate as a monolithic forging, with the goals of cost reduction, shortening of the manufacturing period and further reliability improvement. Here, we report the manufacturing technology, code compliance and obtained properties of the Integrated Cask Body and Base Plate. (author)

  7. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Full Text Available Abstract Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for statistical analysis of genetic data compared with other fields of statistics. It is often a tremendous task for end-users to tailor them for particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R http://www.r-project.org, a free, flexible and platform-independent environment for statistical modelling and graphics is explored as an integrated system for genetic data analysis. An overview of some packages currently available for analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available codes, it is a feasible, although challenging, task to develop it into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.
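
As an example of the kind of computation such genetic-analysis packages wrap, a Hardy-Weinberg equilibrium chi-square statistic can be computed directly from genotype counts (shown here in Python purely for illustration; the record's tooling is R):

```python
def hwe_chisq(n_aa, n_ab, n_bb):
    """Chi-square statistic (1 df) testing Hardy-Weinberg equilibrium
    from genotype counts AA, Aa, aa at a biallelic locus."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2.0 * n)      # allele frequency of A
    q = 1.0 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Counts exactly at equilibrium give a statistic of zero:
stat = hwe_chisq(25, 50, 25)
```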

  8. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Full Text Available Introduction: Integrated care could not only fix up fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how ‘integrated care’ has developed. There is a need for a systematic bibliometric analysis studying the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy ‘(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]’ without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis by the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering by SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) the documents appeared in 1646 different journals, of which 28 were core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains including: the definition/models of integrated care, interdisciplinary patient care team, disease management for chronically ill patients, types of health care organizations and policy, information system integration and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provides researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.
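
The keyword clustering step of such a bibliometric analysis starts from a co-occurrence matrix; building one is straightforward (a sketch with toy records rather than the 9090 retrieved articles):

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(records):
    """Count how often each keyword pair appears in the same article;
    the resulting matrix is what hierarchical clustering operates on."""
    counts = Counter()
    for keywords in records:
        for a, b in combinations(sorted(set(keywords)), 2):
            counts[(a, b)] += 1
    return counts

records = [
    ["integrated care", "disease management"],
    ["integrated care", "disease management", "health policy"],
]
pairs = keyword_cooccurrence(records)
```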

  9. Integrated Rudder/Fin Concise Control Based on Frequency Domain Analysis

    OpenAIRE

    W. Guan; Z. J. Su; G. Q. Zhang

    2013-01-01

    This paper describes a concise robust controller design for an integrated rudder and fin control system using the closed loop gain shaping algorithm (CGSA) strategy. Compared with the arbitrary selection of weighting functions in integrated rudder and fin H∞ mixed sensitivity control design procedures, the CGSA method provides a relatively more straightforward and concise design. Simulations showed that the overall performance of each CGSA rudder and fin control loop and the in...

  10. Integrated data base for spent fuel and radwaste: inventories

    International Nuclear Information System (INIS)

    Notz, K.J.; Carter, W.L.; Kibbey, A.H.

    1982-01-01

    The Integrated Data Base (IDB) program provides and maintains current, integrated data on spent reactor fuel and radwaste, including historical data, current inventories, projected inventories, and material characteristics. The IDB program collects, organizes, integrates, and - where necessary - reconciles inventory and projection (I/P) and characteristics information to provide a coherent, self-consistent data base on spent fuel and radwaste

  11. PROSPECTS OF THE REGIONAL INTEGRATION POLICY BASED ON CLUSTER FORMATION

    Directory of Open Access Journals (Sweden)

    Elena Tsepilova

    2018-01-01

    Full Text Available The purpose of this article is to develop the theoretical foundations of regional integration policy and to determine its prospects on the basis of cluster formation. The authors use such research methods as systematization, comparative and complex analysis, synthesis, and the statistical method. Within the framework of the research, the concept of regional integration policy is specified, and its integration core – the cluster – is identified. The authors work out an algorithm of regional clustering intended to ensure the growth of the economy and tax revenues. Measures have been proposed to optimize the organizational mechanism of interaction between the participants of the territorial cluster and the authorities, to ensure the effective functioning of clusters, including taxation clusters. Based on the results of studying the existing methods for assessing the effectiveness of cluster policy, the authors propose their own approach to evaluating the consequences of implementing the regional integration policy, according to which a list of quantitative and qualitative indicators is defined. The present article systematizes the experience and results of the cluster policy of certain European countries, which made it possible to determine the prospects and synergetic effect of developing clusters as an integration foundation of regional policy in the Russian Federation. The authors carry out an analysis of the activity of cluster formations using the example of the Rostov region – a leader in the formation of conditions for cluster policy development in the Southern Federal District, where 11 clusters and cluster initiatives are developing. As a result, the authors propose measures to support the already existing clusters and to create new ones.

  12. Fundamentals of Inertial Navigation, Satellite-based Positioning and their Integration

    CERN Document Server

    Noureldin, Aboelmagd; Georgy, Jacques

    2013-01-01

    Fundamentals of Inertial Navigation, Satellite-based Positioning and their Integration is an introduction to the field of Integrated Navigation Systems. It serves as an excellent reference for working engineers as well as textbook for beginners and students new to the area. The book is easy to read and understand with minimum background knowledge. The authors explain the derivations in great detail. The intermediate steps are thoroughly explained so that a beginner can easily follow the material. The book shows a step-by-step implementation of navigation algorithms and provides all the necessary details. It provides detailed illustrations for an easy comprehension. The book also demonstrates real field experiments and in-vehicle road test results with professional discussions and analysis. This work is unique in discussing the different INS/GPS integration schemes in an easy to understand and straightforward way. Those schemes include loosely vs tightly coupled, open loop vs closed loop, and many more.
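
The loosely coupled INS/GPS schemes the book covers ultimately rest on recursive estimation; a scalar Kalman filter fusing an INS-propagated position with a GPS fix illustrates the principle (a toy 1-D sketch with made-up numbers, not the book's algorithms):

```python
def predict(x, p, velocity, dt, q):
    """INS-style propagation: dead-reckon the position, inflate the
    estimate variance p by the process noise q."""
    return x + velocity * dt, p + q

def update(x, p, z, r):
    """GPS-style measurement update with observation noise variance r."""
    k = p / (p + r)              # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Dead-reckon for one second, then correct with a GPS position fix:
x, p = predict(0.0, 1.0, 10.0, 1.0, 0.5)
x, p = update(x, p, 9.0, 0.5)
```

In a closed-loop integration the correction would additionally be fed back to the INS mechanization; an open-loop scheme leaves the INS untouched and only corrects the output.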

  13. Fluidic Logic Used in a Systems Approach to Enable Integrated Single-cell Functional Analysis

    Directory of Open Access Journals (Sweden)

    Naveen Ramalingam

    2016-09-01

    Full Text Available The study of single cells has evolved over the past several years to include expression and genomic analysis of an increasing number of single cells. Several studies have demonstrated widespread variation and heterogeneity within cell populations of similar phenotype. While the characterization of these populations will likely set the foundation for our understanding of genomic- and expression-based diversity, it will not be able to link the functional differences of a single cell to its underlying genomic structure and activity. Currently, it is difficult to perturb single cells in a controlled environment, monitor and measure the response due to perturbation, and link these response measurements to downstream genomic and transcriptomic analysis. In order to address this challenge, we developed a platform to integrate and miniaturize many of the experimental steps required to study single-cell function. The heart of this platform is an elastomer-based Integrated Fluidic Circuit (IFC) that uses fluidic logic to select and sequester specific single cells based on a phenotypic trait for downstream experimentation. Experiments with sequestered cells that have been performed include on-chip culture, exposure to a variety of stimulants, and post-exposure image-based response analysis, followed by preparation of the mRNA transcriptome for massively parallel sequencing analysis. The flexible system embodies experimental design and execution that enable routine functional studies of single cells.

  14. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support.

    Science.gov (United States)

    Gibert, Karina; García-Alonso, Carlos; Salvador-Carulla, Luis

    2010-09-30

    Decision support in health systems is a highly difficult task, due to the inherent complexity of the processes and structures involved. This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) benchmarking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases, comparisons with classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by the Intraclass Correlation Coefficient in case "1" and kappa in both cases. EbCA is a new methodology composed of six steps: 1) data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledge (IK) might be elicited); 5) incorporation of elicited IK in PKB, repeated until a satisfactory solution is reached; 6) post-processing of results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess the technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for decision making in both cases. This paper presents EbCA and shows the convenience of

  15. Integrating clinicians, knowledge and data: expert-based cooperative analysis in healthcare decision support

    Directory of Open Access Journals (Sweden)

    García-Alonso Carlos

    2010-09-01

    Full Text Available Abstract Background Decision support in health systems is a highly difficult task, due to the inherent complexity of the processes and structures involved. Method This paper introduces a new hybrid methodology, Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) benchmarking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases, comparisons with classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panel results. Overall agreement was tested by the Intraclass Correlation Coefficient in case "1" and kappa in both cases. Results EbCA is a new methodology composed of six steps: 1) data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledge (IK) might be elicited); 5) incorporation of elicited IK in PKB, repeated until a satisfactory solution is reached; 6) post-processing of results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess the technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for the decision making in both cases. Discussion This
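
The records report overall agreement with expert panels via Cohen's kappa; the statistic itself is a short computation (a generic sketch with made-up counts, not the paper's data):

```python
def cohen_kappa(table):
    """Cohen's kappa from a square agreement table; rows are rater 1's
    categories, columns rater 2's. 1 = perfect, 0 = chance agreement."""
    k = len(table)
    n = float(sum(sum(row) for row in table))
    p_obs = sum(table[i][i] for i in range(k)) / n
    p_exp = sum(
        sum(table[i]) * sum(row[i] for row in table) for i in range(k)
    ) / (n * n)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Two raters classifying 50 cases into two categories:
kappa = cohen_kappa([[20, 5], [10, 15]])   # 0.4
```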

  16. Leaky Integrate-and-Fire Neuron Circuit Based on Floating-Gate Integrator

    Science.gov (United States)

    Kornijcuk, Vladimir; Lim, Hyungkwang; Seok, Jun Yeong; Kim, Guhyun; Kim, Seong Keun; Kim, Inho; Choi, Byung Joon; Jeong, Doo Seok

    2016-01-01

    The artificial spiking neural network (SNN) is promising and has been brought to the notice of the theoretical neuroscience and neuromorphic engineering research communities. In this light, we propose a new type of artificial spiking neuron based on leaky integrate-and-fire (LIF) behavior. A distinctive feature of the proposed FG-LIF neuron is the use of a floating-gate (FG) integrator rather than a capacitor-based one. The relaxation time of the charge on the FG relies mainly on the tunnel barrier profile, e.g., barrier height and thickness (rather than the area). This opens up the possibility of large-scale integration of neurons. The circuit simulation results offered biologically plausible spiking activity; the circuit was subject to possible types of noise, e.g., thermal noise and burst noise. The simulation results indicated remarkable distributional features of interspike intervals that are fitted to Gamma distribution functions, similar to biological neurons in the neocortex. PMID:27242416
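    The leaky integrate-and-fire dynamics underlying this neuron can be sketched with a generic discrete-time LIF model. This is a minimal illustration, assuming textbook membrane parameters; it is not the FG-LIF circuit itself, and the values below are illustrative rather than taken from the record.

```python
# Generic leaky integrate-and-fire (LIF) neuron, forward-Euler integration.
# All parameter values are illustrative textbook numbers (mV, ms).

def simulate_lif(drive_mv=30.0, t_total_ms=1000.0, dt_ms=0.01,
                 tau_ms=20.0, v_rest=-70.0, v_thresh=-50.0, v_reset=-70.0):
    """Return spike times (ms) for a constant input drive R*I (in mV)."""
    v = v_rest
    spikes = []
    for step in range(int(t_total_ms / dt_ms)):
        # Membrane leaks toward v_rest and is driven by the input current.
        v += dt_ms * (-(v - v_rest) + drive_mv) / tau_ms
        if v >= v_thresh:
            spikes.append(step * dt_ms)
            v = v_reset  # fire and reset

    return spikes

spikes = simulate_lif()
isis = [b - a for a, b in zip(spikes, spikes[1:])]
mean_isi = sum(isis) / len(isis)
# Analytic interspike interval for this model:
#   tau * ln(RI / (RI - (v_thresh - v_rest))) = 20 * ln(30/10) ~ 21.97 ms
```

    With noise added to the drive, the interspike intervals spread out into the Gamma-like distributions the abstract describes; the deterministic version above fires perfectly regularly.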

  17. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001

  18. Opto-electronic DNA chip-based integrated card for clinical diagnostics.

    Science.gov (United States)

    Marchand, Gilles; Broyer, Patrick; Lanet, Véronique; Delattre, Cyril; Foucault, Frédéric; Menou, Lionel; Calvas, Bernard; Roller, Denis; Ginot, Frédéric; Campagnolo, Raymond; Mallard, Frédéric

    2008-02-01

    Clinical diagnostics is one of the most promising applications for microfluidic lab-on-a-chip or lab-on-card systems. DNA chips, which provide multiparametric data, are privileged tools for genomic analysis. However, automation of molecular biology protocols and use of these DNA chips in fully integrated systems remain a great challenge. Simplicity of chip and/or card/instrument interfaces is amongst the most critical issues to be addressed. Indeed, current detection systems for DNA chip reading are often complex, expensive, bulky and even limited in terms of sensitivity or accuracy. Furthermore, for liquid handling in the lab-on-cards, many devices use complex and bulky systems, either to directly manipulate fluids, or to ensure pneumatic or mechanical control of integrated valves. All these drawbacks prevent or limit the use of DNA-chip-based integrated systems for point-of-care testing or as a routine diagnostics tool. We present here a DNA-chip-based protocol integration on a plastic card for clinical diagnostics applications including: (1) an opto-electronic DNA chip, (2) fluid handling using electrically activated embedded pyrotechnic microvalves with closing/opening functions. We demonstrate both fluidic and electric packaging of the optoelectronic DNA chip without major alteration of its electronic and biological functionalities, and fluid control using novel electrically activable pyrotechnic microvalves. Finally, we suggest a complete design of a card dedicated to automation of a complex biological protocol with a fully electrical fluid handling and DNA chip reading.

  19. An integrated knowledge-based and optimization tool for the sustainable selection of wastewater treatment process concepts

    DEFF Research Database (Denmark)

    Castillo, A.; Cheali, Peam; Gómez, V.

    2016-01-01

    The increasing demands on wastewater treatment plants (WWTPs) have led to an interest in improving the alternative treatment selection process. In this study, an integrated framework including an intelligent knowledge-based system and superstructure-based optimization has been developed and applied...... to a real case study. Hence, a multi-criteria analysis together with mathematical models is applied to generate a ranked short-list of feasible treatments for three different scenarios. Finally, the uncertainty analysis performed allows for increasing the quality and robustness of the decisions considering...... benefit and synergy is achieved when both tools are integrated because expert knowledge and expertise are considered together with mathematical models to select the most appropriate treatment alternative...

  20. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  1. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Full Text Available Abstract Background Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description We have developed an Integrated Genotyping System (IGS to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion IGS can model complex phenotypes and contain genotypes from whole genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  2. Applying Groebner bases to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.; Smirnov, Vladimir A.

    2006-01-01

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential
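    The reduction problem can be made concrete on the simplest family: one-loop massless propagator integrals. The integration-by-parts (IBP) identity below is the standard textbook relation for this family, quoted as an illustrative sketch of what such reduction algorithms automate, not from the record itself.

```latex
% One-loop massless propagator family, with Q^2 = -q^2:
%   F(a_1,a_2) = \int \frac{d^dk}{(-k^2)^{a_1}\,\bigl(-(k-q)^2\bigr)^{a_2}}
% IBP: insert k^\mu under a total derivative in momentum space.
\begin{aligned}
0 &= \int d^dk\,\frac{\partial}{\partial k^\mu}
     \left[\frac{k^\mu}{(-k^2)^{a_1}\,\bigl(-(k-q)^2\bigr)^{a_2}}\right] \\
  &\Longrightarrow\quad
  (d-2a_1-a_2)\,F(a_1,a_2)
  = a_2\Bigl[F(a_1-1,\,a_2+1) \;-\; Q^2\,F(a_1,\,a_2+1)\Bigr].
\end{aligned}
```

    Integrals with a non-positive index are scaleless and vanish, so repeated application of this shift relation expresses every F(a_1, a_2) as a multiple of the single master integral F(1,1); Groebner-type bases of such shift-operator relations systematize this elimination for multi-loop families where many overlapping relations must be combined.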

  3. Applying Groebner bases to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexander V. [Mechanical and Mathematical Department and Scientific Research Computer Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, Vladimir A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-01-15

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  4. An analysis of 3D particle path integration algorithms

    International Nuclear Information System (INIS)

    Darmofal, D.L.; Haimes, R.

    1996-01-01

    Several techniques for the numerical integration of particle paths in steady and unsteady vector (velocity) fields are analyzed. Most of the analysis applies to unsteady vector fields, however, some results apply to steady vector field integration. Multistep, multistage, and some hybrid schemes are considered. It is shown that due to initialization errors, many unsteady particle path integration schemes are limited to third-order accuracy in time. Multistage schemes require at least three times more internal data storage than multistep schemes of equal order. However, for timesteps within the stability bounds, multistage schemes are generally more accurate. A linearized analysis shows that the stability of these integration algorithms is determined by the eigenvalues of the local velocity tensor. Thus, the accuracy and stability of the methods are interpreted with concepts typically used in critical point theory. This paper shows how integration schemes can lead to erroneous classification of critical points when the timestep is finite and fixed. For steady velocity fields, we demonstrate that timesteps outside of the relative stability region can lead to similar integration errors. From this analysis, guidelines for accurate timestep sizing are suggested for both steady and unsteady flows. In particular, using simulation data for the unsteady flow around a tapered cylinder, we show that accurate particle path integration requires timesteps which are at most on the order of the physical timescale of the flow
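    The stability point can be seen on a minimal steady field. Near a center-type critical point (here solid-body rotation, where the velocity tensor has purely imaginary eigenvalues), the exact particle path is a circle; a sketch comparing forward Euler with a classical fourth-order Runge-Kutta stage scheme shows how the choice of scheme and timestep changes the apparent character of the critical point. The field and timestep are illustrative, not the paper's test cases.

```python
import math

def velocity(x, y):
    # Steady solid-body rotation: a "center" critical point at the origin;
    # the local velocity tensor has eigenvalues +/- i.
    return -y, x

def step_euler(x, y, dt):
    u, v = velocity(x, y)
    return x + dt * u, y + dt * v

def step_rk4(x, y, dt):
    # Classical 4-stage Runge-Kutta (a multistage scheme).
    k1 = velocity(x, y)
    k2 = velocity(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1])
    k3 = velocity(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1])
    k4 = velocity(x + dt * k3[0], y + dt * k3[1])
    return (x + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            y + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def final_radius(stepper, dt=0.1, t_end=2 * math.pi):
    x, y = 1.0, 0.0  # start on the unit circle; exact path keeps radius 1
    for _ in range(int(round(t_end / dt))):
        x, y = stepper(x, y, dt)
    return math.hypot(x, y)

r_euler = final_radius(step_euler)  # spirals outward: center looks unstable
r_rk4 = final_radius(step_rk4)      # stays essentially on the orbit
```

    With this timestep the Euler particle drifts well off the orbit (misclassifying the center as an unstable focus), while RK4 preserves the radius to a few parts in ten million, which is the scheme/timestep sensitivity the paper quantifies.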

  5. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    Full Text Available The article investigates the integration activity of business structures in the regions of Russia. A wide variety of approaches to the study of the problems and prospects of economic integration, and the current dispute on the role of integration processes in regional economic development, have made it necessary to clarify the concepts of “integration” and “integration activity” in order to establish objective conditions for analysing the integration activity of business structures in the Russian regions. The monitoring of the current legal system of the Russian Federation carried out in the area of statistics and the compilation of statistical databases on mergers and acquisitions has shown the absence of a formal executive authority dealing with the compilation and collection of information on integration activity at the regional level. In this connection, the information and analytical base was compiled from the data of Russian information and analytical agencies. As the research tools, the methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the structure and dynamics of integration activity in the subjects of the Russian Federation based on the statistical data for the period from 2003 to 2012 has revealed the increasing heterogeneity of the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing

  6. Development of a 3-D flow analysis computer program for integral reactor

    International Nuclear Information System (INIS)

    Youn, H. Y.; Lee, K. H.; Kim, H. K.; Whang, Y. D.; Kim, H. C.

    2003-01-01

    A 3-D computational fluid dynamics program, TASS-3D, is being developed for the flow analysis of a primary coolant system consisting of complex geometries, such as that of SMART. A pre/post processor is also being developed to reduce pre/post processing work such as computational grid generation, set-up of the analysis conditions, and analysis of the calculated results. The TASS-3D solver employs a non-orthogonal coordinate system and an FVM based on the non-staggered grid system. The program includes various models to simulate the physical phenomena expected to occur in the integral reactor and will be coupled with the core dynamics code, core T/H code and the secondary system code modules. Currently, the application of TASS-3D is limited to the single phase of liquid, but the code will be further developed to include 2-phase phenomena expected for normal operation and the various transients of the integral reactor in the next stage

  7. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In the conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among process, safety systems, and operator actions. As independent TH codes do not have models of operator actions and full safety systems, they cannot literally simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulations will be a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response due to better communication, as all the accident elements are in the same model. The advantages of this framework also include: realistic modeling in dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)

  8. Integrative sparse principal component analysis of gene expression data.

    Science.gov (United States)

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
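    The building block of any SPCA method, estimating a sparse loading vector, can be sketched with rank-1 alternating updates plus L1 soft-thresholding. This is a generic sparse-PCA device in the spirit of the methods iSPCA builds on, not the authors' group/contrasted-penalty implementation; the penalty level `lam` and the toy data are illustrative.

```python
import math

def soft_threshold(x, lam):
    # Proximal map of the L1 penalty: shrinks each entry toward zero.
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in x]

def sparse_pc1(X, lam=0.5, n_iter=50):
    """Rank-1 sparse PCA via alternating score/loading updates (toy version)."""
    n, p = len(X), len(X[0])
    # Initialize the score vector u with the normalized first column of X.
    u = [row[0] for row in X]
    norm = math.sqrt(sum(w * w for w in u)) or 1.0
    u = [w / norm for w in u]
    v = [0.0] * p
    for _ in range(n_iter):
        # Loading update: soft-threshold X^T u, then normalize.
        v = soft_threshold([sum(X[i][j] * u[i] for i in range(n))
                            for j in range(p)], lam)
        vnorm = math.sqrt(sum(w * w for w in v)) or 1.0
        v = [w / vnorm for w in v]
        # Score update: u = X v, normalized.
        u = [sum(X[i][j] * v[j] for j in range(p)) for i in range(n)]
        unorm = math.sqrt(sum(w * w for w in u)) or 1.0
        u = [w / unorm for w in u]
    return v

# Noise-free toy data: a rank-1 signal carried by the first two variables only.
scores = [1.0, 2.0, -1.0, -2.0]
loading = [3.0, 2.0, 0.0, 0.0]
X = [[s * l for l in loading] for s in scores]
v = sparse_pc1(X)  # sparse loading: exact zeros on the last two variables
```

    In the integrative setting, one such loading is estimated per dataset, with a group penalty tying the sparsity patterns together, which is the step the contrasted penalties of iSPCA refine.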

  9. Design and analysis of heat exchanger networks for integrated Ca-looping systems

    International Nuclear Information System (INIS)

    Lara, Yolanda; Lisbona, Pilar; Martínez, Ana; Romeo, Luis M.

    2013-01-01

    Highlights: • Heat integration is essential to minimize energy penalties in calcium looping cycles. • A design and analysis of four heat exchanger networks is stated. • New design with higher power, lower costs and lower destroyed exergy than the base case. - Abstract: One of the main challenges of carbon capture and storage technologies deals with the energy penalty associated with CO2 separation and compression processes. Thus, heat integration plays an essential role in the improvement of these systems’ efficiencies. CO2 capture systems based on the Ca-looping process present a great potential for residual heat integration with a new supercritical power plant. The pinch methodology is applied in this study to define the minimum energy requirements of the process and to design four configurations for the required heat exchanger network. The Second Law of Thermodynamics represents a powerful tool for reducing the energy demand, since identifying the exergy losses of the system serves to locate inefficiencies. In parallel, an economic analysis is required to assess the cost reduction achieved by each configuration. This work presents a combination of the pinch methodology with economic and exergetic analyses to select the most appropriate configuration of the heat exchanger network. The lower costs and minor destroyed exergy obtained for the best proposed network result in a 0.91% increase in global energy efficiency
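    The pinch targeting step the authors apply can be sketched with the classic problem-table (heat-cascade) algorithm. The four streams below are a standard textbook example with a known answer, not the Ca-looping plant's stream data.

```python
# Problem-table algorithm for pinch analysis (textbook four-stream example).
# Each stream: (supply T in C, target T in C, CP = heat capacity flowrate, kW/K).
hot_streams = [(170.0, 60.0, 3.0), (150.0, 30.0, 1.5)]
cold_streams = [(20.0, 135.0, 2.0), (80.0, 140.0, 4.0)]
dt_min = 10.0  # minimum approach temperature

def min_utilities(hot, cold, dt_min):
    """Return (minimum hot utility, minimum cold utility) in kW."""
    # Shift hot streams down and cold streams up by dt_min/2; cold CPs negative.
    shifted = ([(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot] +
               [(ts + dt_min / 2, tt + dt_min / 2, -cp) for ts, tt, cp in cold])
    bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        # Net CP surplus (hot minus cold) of streams spanning this interval.
        net_cp = sum(cp for ts, tt, cp in shifted
                     if max(ts, tt) >= hi and min(ts, tt) <= lo)
        cascade.append(cascade[-1] + net_cp * (hi - lo))
    q_hot = max(0.0, -min(cascade))  # heat added at the top to fix deficits
    q_cold = cascade[-1] + q_hot     # heat rejected at the bottom
    return q_hot, q_cold

q_hot, q_cold = min_utilities(hot_streams, cold_streams, dt_min)
# Known targets for this textbook example: q_hot = 20 kW, q_cold = 60 kW.
```

    The interval where the corrected cascade touches zero is the pinch; the paper's four network configurations are alternative ways of satisfying these same utility targets.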

  10. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Owen, D; Anderson, C; Mayo, C; El Naqa, I; Ten Haken, R; Cao, Y; Balter, J; Matuszak, M [University of Michigan, Ann Arbor, MI (United States)

    2016-06-15

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response
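    The voxel-by-voxel dose-intensity correlation the plug-in performs can be sketched as follows, assuming the dose grid and the functional (e.g., perfusion SPECT) image have already been resampled to a common voxel grid. The metric names, arrays, and thresholds are synthetic illustrations, not the tool's API.

```python
def function_weighted_dvh(dose, intensity, thresholds):
    """For each dose threshold, the fraction of the total functional signal
    (intensity-weighted volume) receiving at least that dose."""
    total = sum(intensity)
    return [sum(w for d, w in zip(dose, intensity) if d >= t) / total
            for t in thresholds]

def mean_functional_dose(dose, intensity):
    # Intensity-weighted mean dose over the structure.
    return sum(d * w for d, w in zip(dose, intensity)) / sum(intensity)

# Synthetic flattened voxel data: dose in Gy, intensity in perfusion counts.
dose = [0.0, 10.0, 20.0, 30.0]
intensity = [1.0, 1.0, 2.0, 2.0]
fdvh = function_weighted_dvh(dose, intensity, [5.0, 15.0, 25.0])
mfd = mean_functional_dose(dose, intensity)
```

    Metrics of this form, evaluated inside the optimizer's objective, are what let functional avoidance be traded off against target coverage during planning.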

  11. A surface-integral-equation approach to the propagation of waves in EBG-based devices

    NARCIS (Netherlands)

    Lancellotti, V.; Tijhuis, A.G.

    2012-01-01

    We combine surface integral equations with domain decomposition to formulate and (numerically) solve the problem of electromagnetic (EM) wave propagation inside finite-sized structures. The approach is of interest for (but not limited to) the analysis of devices based on the phenomenon of

  12. FACILITATING INTEGRATED SPATIO-TEMPORAL VISUALIZATION AND ANALYSIS OF HETEROGENEOUS ARCHAEOLOGICAL AND PALAEOENVIRONMENTAL RESEARCH DATA

    Directory of Open Access Journals (Sweden)

    C. Willmes

    2012-07-01

    Full Text Available In the context of the Collaborative Research Centre 806 "Our way to Europe" (CRC806, a research database is developed for integrating data from the disciplines of archaeology, the geosciences and the cultural sciences to facilitate integrated access to heterogeneous data sources. A practice-oriented data integration concept and its implementation is presented in this contribution. The data integration approach is based on the application of Semantic Web Technology and is applied to the domains of archaeological and palaeoenvironmental data. The aim is to provide integrated spatio-temporal access to an existing wealth of data to facilitate research on the integrated data basis. For the web portal of the CRC806 research database (CRC806-Database, a number of interfaces and applications have been evaluated, developed and implemented for exposing the data to interactive analysis and visualizations.

  13. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.
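    The idea of taking forces as primary unknowns, solving equilibrium equations and compatibility conditions simultaneously, can be illustrated on a one-dimensional toy problem: two springs between rigid walls with a load P at the shared node. The numbers are illustrative; this is a sketch of the principle, not the paper's finite element formulation.

```python
# Integrated-force-method flavor on a toy problem: unknowns are the spring
# FORCES (tension positive), found from one equilibrium equation plus one
# compatibility condition, with displacement recovered afterwards.
k1, k2, P = 2.0, 1.0, 3.0  # illustrative stiffnesses (N/m) and load (N)

# Equilibrium of the loaded node:                F1 - F2 = P
# Compatibility (both springs share the node
# displacement u, so F1/k1 = u and F2/k2 = -u):  F1/k1 + F2/k2 = 0
a11, a12, b1 = 1.0, -1.0, P
a21, a22, b2 = 1.0 / k1, 1.0 / k2, 0.0

# Solve the 2x2 system by Cramer's rule.
det = a11 * a22 - a12 * a21
F1 = (b1 * a22 - a12 * b2) / det
F2 = (a11 * b2 - b1 * a21) / det
u = F1 / k1  # displacement is a by-product, not a primary unknown
```

    The stiffness method would instead solve (k1 + k2) u = P for the displacement first and differentiate back to forces; the force method delivers the stresses (here F1 = 2 N tension, F2 = 1 N compression) directly.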

  14. Integrability of dynamical systems algebra and analysis

    CERN Document Server

    Zhang, Xiang

    2017-01-01

    This is the first book to systematically state the fundamental theory of integrability of ordinary differential equations and its development, with emphasis on the Darboux theory of integrability and local integrability together with their applications. It summarizes the classical results of Darboux integrability and its modern development, together with the related Darboux polynomials and their applications in the reduction of Liouville and elementary integrability, in the center-focus problem, in the weakened Hilbert 16th problem on algebraic limit cycles, and in the global dynamical analysis of some realistic models in fields such as physics, mechanics and biology. Although it can be used as a textbook for graduate students in dynamical systems, it is intended as supplementary reading for graduate students from mathematics, physics, mechanics and engineering in courses related to the qualitative theory, bifurcation theory and the theory of integrability of dynamical systems.

  15. Promoting collaboration skills on reflection concept through multimedia-based integrated instruction

    Science.gov (United States)

    Hermawan, Hermawan; Siahaan, Parsaoran; Suhendi, Endi; Samsudin, Achmad

    2017-05-01

    Multimedia-Based Integrated Instruction (MBI2) has been developed to promote collaboration skills and to turn learning of reflection concepts into a more real and meaningful experience. The initial design of MBI2 takes the form of computer multimedia that allows users to explore the concept of reflection of light as a whole, through the conceptual and practical aspects that have been developed. MBI2 has been developed to promote one of the 21st-century skills, namely collaboration, in junior high school students so that they can compete in their future lives. The ability to collaborate is divided into five aspects, namely contribution, time management, problem-solving, working with others and research techniques. The research method utilized in this study is exploratory, together with the 4D instructional development model (define, design, develop and disseminate). Based on the data analysis, it can be concluded that the multimedia-based integrated instruction (MBI2) developed on the concept of reflection through the 4D model was effective in enhancing the collaboration skills of junior high school students.

  16. Derivative-Based Trapezoid Rule for the Riemann-Stieltjes Integral

    Directory of Open Access Journals (Sweden)

    Weijing Zhao

    2014-01-01

    Full Text Available The derivative-based trapezoid rule for the Riemann-Stieltjes integral is presented, which uses two derivative values at the endpoints. This kind of quadrature rule obtains an increase of two orders of precision over the trapezoid rule for the Riemann-Stieltjes integral, and the error term is investigated. Finally, the rationality of the generalization of the derivative-based trapezoid rule for the Riemann-Stieltjes integral is demonstrated.
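    For the ordinary Riemann case (integrator g(x) = x), the derivative-based trapezoid idea reduces to the classical corrected trapezoid rule, which the sketch below implements; the Riemann-Stieltjes form derived in the paper is the generalization of this to arbitrary integrators.

```python
def corrected_trapezoid(f, df, a, b):
    """Trapezoid rule plus an endpoint-derivative correction; exact for
    cubic polynomials, i.e., two orders better than the plain trapezoid."""
    h = b - a
    return h * (f(a) + f(b)) / 2.0 + h * h * (df(a) - df(b)) / 12.0

# Example: the integral of x^3 on [0, 1] is exactly 1/4.
approx = corrected_trapezoid(lambda x: x ** 3, lambda x: 3 * x ** 2, 0.0, 1.0)
plain = (1.0 - 0.0) * (0.0 + 1.0) / 2.0  # plain trapezoid gives 0.5
```

    Using only two extra function evaluations (the endpoint derivatives), the corrected rule recovers the exact value where the plain rule is off by 0.25, which is the two-order precision gain the abstract describes.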

  17. Novel developments in mobile sensing based on the integration of microfluidic devices and smartphones.

    Science.gov (United States)

    Yang, Ke; Peretz-Soroka, Hagit; Liu, Yong; Lin, Francis

    2016-03-21

    Portable electronic devices and wireless communication systems enable a broad range of applications such as environmental and food safety monitoring, personalized medicine and healthcare management. Particularly, hybrid smartphone and microfluidic devices provide an integrated solution for the new generation of mobile sensing applications. Such mobile sensing based on microfluidic devices (broadly defined) and smartphones (MS(2)) offers a mobile laboratory for performing a wide range of bio-chemical detection and analysis functions such as water and food quality analysis, routine health tests and disease diagnosis. MS(2) offers significant advantages over traditional platforms in terms of test speed and control, low cost, mobility, ease-of-operation and data management. These improvements put MS(2) in a promising position in the fields of interdisciplinary basic and applied research. In particular, MS(2) enables applications to remote in-field testing, homecare, and healthcare in low-resource areas. The marriage of smartphones and microfluidic devices offers a powerful on-chip operating platform to enable various bio-chemical tests, remote sensing, data analysis and management in a mobile fashion. The implications of such integration are beyond telecommunication and microfluidic-related research and technology development. In this review, we will first provide the general background of microfluidic-based sensing, smartphone-based sensing, and their integration. Then, we will focus on several key application areas of MS(2) by systematically reviewing the important literature in each area. We will conclude by discussing our perspectives on the opportunities, issues and future directions of this emerging novel field.

  18. Novel Developments of Mobile Sensing Based on the Integration of Microfluidic Devices and Smartphone

    Science.gov (United States)

    Yang, Ke; Peretz-Soroka, Hagit; Liu, Yong; Lin, Francis

    2016-01-01

    Portable electronic devices and wireless communication systems enable a broad range of applications such as environmental and food safety monitoring, personalized medicine and healthcare management. Particularly, hybrid smartphone and microfluidic devices provide an integrated solution for the new generation of mobile sensing applications. Such mobile sensing based on microfluidic devices (broadly defined) and smartphones (MS2) offers a mobile laboratory for performing a wide range of bio-chemical detection and analysis functions such as water and food quality analysis, routine health tests and disease diagnosis. MS2 offers significant advantages over traditional platforms in terms of test speed and control, low cost, mobility, ease-of-operation and data management. These improvements put MS2 in a promising position in the fields of interdisciplinary basic and applied research. In particular, MS2 enables applications to remote in-field testing, homecare, and healthcare in low-resource areas. The marriage of smartphones and microfluidic devices offers a powerful on-chip operating platform to enable various bio-chemical tests, remote sensing, data analysis and management in a mobile fashion. The implications of such integration are beyond telecommunication and microfluidic-related research and technology development. In this review, we will first provide the general background of microfluidic-based sensing, smartphone-based sensing, and their integration. Then, we will focus on several key application areas of MS2 by systematically reviewing the important literature in each area. We will conclude by discussing our perspectives on the opportunities, issues and future directions of this emerging novel field. PMID:26899264

  19. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    DEFF Research Database (Denmark)

    Lythcke-Jørgensen, Christoffer Ernst; Haglind, Fredrik; Clausen, Lasse Røngaard

    2014-01-01

    Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy intensive nature. The objective of this study is to investigate potential irreversibilities from such integration, and what impact it has on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible ... district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the expected operation pattern of such polygeneration system is taken ...

  20. Photometric method for determination of acidity constants through integral spectra analysis

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-01

    An express method for the determination of acidity constants of organic acids, based on the analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method makes it possible to obtain pKa using only simple and low-cost instrumentation. The optical part of the experimental setup has been optimized by excluding the monochromator device. Thus it only takes 10-15 min to obtain one pKa value with an absolute error of less than 0.15 pH units. Application limitations and reliability of the method have been tested for a series of organic acids of various nature.
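The core of the method, fitting the integral transmittance vs. pH curve to extract pKa, can be sketched as a least-squares fit of a Henderson-Hasselbalch-type sigmoid. The model function and the noiseless synthetic data below are illustrative assumptions, not the authors' exact calibration procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

def transmittance(pH, T_acid, T_base, pKa):
    # Fraction of the deprotonated form from the Henderson-Hasselbalch relation
    f = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return T_acid + (T_base - T_acid) * f

# Synthetic integral-transmittance titration curve (illustrative values)
pH = np.linspace(2.0, 8.0, 25)
data = transmittance(pH, 0.30, 0.85, 4.76)

popt, _ = curve_fit(transmittance, pH, data, p0=(0.2, 0.9, 5.0))
print(f"fitted pKa = {popt[2]:.2f}")  # → fitted pKa = 4.76
```

With a decent initial guess the fit recovers pKa directly from the photometric curve, which is the essence of the express method.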

  1. Performance analysis of IMS based LTE and WIMAX integration architectures

    Directory of Open Access Journals (Sweden)

    A. Bagubali

    2016-12-01

    In the current networking field, much research is ongoing on the integration of different wireless technologies, with the aim of providing uninterrupted connectivity to the user anywhere, with high data rates to meet increased demand. Moreover, the number of objects such as smart devices, industrial machines and smart homes connected by wireless interfaces is increasing dramatically with the evolution of cloud computing and Internet of Things technology. This paper begins with the challenges involved in such integrations and then explains the role of different couplings and different architectures. It also proposes further improvements to the LTE and WiMAX integration architectures to provide seamless vertical handover, flexible quality of service for voice, video and multimedia services over IP networks, and mobility management with the help of IMS networks. Various parameters, such as handover delay, signalling cost and packet loss, are evaluated, and the performance of the interworking architecture is analysed from the simulation results. Finally, it concludes that the cross-layer scenario performs better than the non-cross-layer scenario.

  2. Bending of Euler-Bernoulli nanobeams based on the strain-driven and stress-driven nonlocal integral models: a numerical approach

    Science.gov (United States)

    Oskouie, M. Faraji; Ansari, R.; Rouhi, H.

    2018-04-01

    Eringen's nonlocal elasticity theory is extensively employed for the analysis of nanostructures because it is able to capture nanoscale effects. Previous studies have revealed that using the differential form of the strain-driven version of this theory leads to paradoxical results in some cases, such as the bending analysis of cantilevers, and recourse must be made to the integral version. In this article, a novel numerical approach is developed for the bending analysis of Euler-Bernoulli nanobeams in the context of strain- and stress-driven integral nonlocal models. This approach solves the integral governing equation directly, bypassing the difficulties associated with converting it into a differential equation. First, the governing equation is derived based on both strain-driven and stress-driven nonlocal models by means of the minimum total potential energy principle. In each case, the governing equation is obtained in both strong and weak forms. To solve the derived equations numerically, matrix differential and integral operators are constructed based upon the finite difference technique and the trapezoidal integration rule. It is shown that the proposed numerical approach can be efficiently applied to the strain-driven nonlocal model to resolve the aforementioned paradoxes, and that it solves the problem based on the strain-driven model without the inconsistencies in applying this model that have been reported in the literature.
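The matrix operators mentioned in the abstract can be illustrated in minimal form: a central finite-difference matrix for d²/dx² and a row vector of trapezoidal quadrature weights, verified on a known function. The grid size and test function are arbitrary choices for this sketch, not taken from the article:

```python
import numpy as np

n, L = 101, 1.0
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

# Second-order central-difference matrix operator for d^2/dx^2 (interior rows)
D2 = np.zeros((n, n))
for i in range(1, n - 1):
    D2[i, i - 1: i + 2] = np.array([1.0, -2.0, 1.0]) / h**2

# Trapezoidal-rule integration operator (row vector of quadrature weights)
w = np.full(n, h)
w[0] = w[-1] = h / 2.0

# Check both operators on f(x) = sin(pi x), where f'' = -pi^2 f
# and the integral over [0, 1] equals 2/pi
f = np.sin(np.pi * x)
print(np.allclose((D2 @ f)[1:-1], -(np.pi ** 2) * f[1:-1], atol=1e-2))  # True
print(abs(w @ f - 2.0 / np.pi) < 1e-3)                                  # True
```

In the paper's setting, such operators discretize the integral governing equation so the nonlocal problem can be solved directly as a linear system.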

  3. Integrating Data Clustering and Visualization for the Analysis of 3D Gene Expression Data

    Energy Technology Data Exchange (ETDEWEB)

    Institute for Data Analysis and Visualization (IDAV) and the Department of Computer Science, University of California, Davis, One Shields Avenue, Davis, CA 95616, USA; International Research Training Group "Visualization of Large and Unstructured Data Sets," University of Kaiserslautern, Germany; Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA; Genomics Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA; Life Sciences Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720, USA; Computer Science Division, University of California, Berkeley, CA, USA; Computer Science Department, University of California, Irvine, CA, USA; All authors are with the Berkeley Drosophila Transcription Network Project, Lawrence Berkeley National Laboratory; Rubel, Oliver; Weber, Gunther H.; Huang, Min-Yu; Bethel, E. Wes; Biggin, Mark D.; Fowlkes, Charless C.; Hendriks, Cris L. Luengo; Keranen, Soile V. E.; Eisen, Michael B.; Knowles, David W.; Malik, Jitendra; Hagen, Hans; Hamann, Bernd

    2008-05-12

    The recent development of methods for extracting precise measurements of spatial gene expression patterns from three-dimensional (3D) image data opens the way for new analyses of the complex gene regulatory networks controlling animal development. We present an integrated visualization and analysis framework that supports user-guided data clustering to aid exploration of these new complex datasets. The interplay of data visualization and clustering-based data classification leads to improved visualization and enables a more detailed analysis than previously possible. We discuss (i) integration of data clustering and visualization into one framework; (ii) application of data clustering to 3D gene expression data; (iii) evaluation of the number of clusters k in the context of 3D gene expression clustering; and (iv) improvement of overall analysis quality via dedicated post-processing of clustering results based on visualization. We discuss the use of this framework to objectively define spatial pattern boundaries and temporal profiles of genes and to analyze how mRNA patterns are controlled by their regulatory transcription factors.
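The clustering step and the evaluation of the number of clusters k mentioned in item (iii) can be sketched with a toy k-means on synthetic expression vectors: the within-cluster sum of squares (WCSS) drops sharply once k reaches the true number of groups. The data and the plain Lloyd-iteration implementation below are illustrative, not the framework's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for per-cell expression vectors: two well-separated groups
data = np.vstack([rng.normal(0.0, 0.3, (50, 3)),
                  rng.normal(3.0, 0.3, (50, 3))])

def kmeans_wcss(X, k, iters=50):
    # Plain Lloyd iterations; returns the within-cluster sum of squares
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return ((X - centers[labels]) ** 2).sum()

# Evaluate candidate cluster counts k, as in the abstract's step (iii)
scores = {k: kmeans_wcss(data, k) for k in (1, 2, 3)}
print(scores[1] > 5 * scores[2])  # sharp WCSS drop at the true k = 2
```

In the actual framework this quantitative step is complemented by visualization-guided post-processing of the clustering results.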

  4. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    Science.gov (United States)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities that include, for examples, a structural health monitoring (SHM), sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information are not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as bridge engineering model and traffic videos, in an integrated manner. For the scalability and flexibility, cloud computing services and distributed database systems are employed. The information stored can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.

  5. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

    AFIT-ENV-MS-15-M-195: Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery. Approved for public release; distribution unlimited.

  6. KaBOB: ontology-based semantic integration of biomedical databases.

    Science.gov (United States)

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrate it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats.
KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for
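One of the processes described, aggregating sets of identifiers that denote the same biomedical concept across data sources, can be sketched as a union-find over cross-references. The identifiers and cross-reference pairs below are illustrative examples, not KaBOB's actual data or code:

```python
# Minimal union-find to aggregate identifiers denoting the same concept
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Hypothetical cross-references between source-database records
xrefs = [("UniProt:P04637", "HGNC:11998"),
         ("HGNC:11998", "NCBIGene:7157"),
         ("UniProt:P00533", "HGNC:3236")]
for a, b in xrefs:
    union(a, b)

# Collect the aggregated identifier sets (one per biomedical concept)
groups = {}
for ident in parent:
    groups.setdefault(find(ident), set()).add(ident)
print(sorted(len(g) for g in groups.values()))  # → [2, 3]
```

Each resulting set stands in for a single biomedical concept node, against which source-specific records can then be linked.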

  7. Integrated assessment of river health based on the conditions of water quality, aquatic life and physical habitat

    Institute of Scientific and Technical Information of China (English)

    MENG Wei; ZHANG Nan; ZHANG Yuan; ZHENG Binghui

    2009-01-01

    The health conditions of the Liao River were assessed at 25 sampling sites in April 2005, using a water quality index, a biotic index and a physical habitat quality index. Cluster analysis (CA) of the water quality indices reveals that the heavily polluted sites of the Liao River are located at the estuary and along the mainstream. The aquatic species surveyed were attached algae and benthic invertebrates. The results show that the diversity and biomass of attached algae and the benthic index of biotic integrity (B-IBI) degrade as the chemical and physical quality of the water bodies deteriorates. Physiochemical parameters (BOD5, CODCr, TN, TP, NH3-N, DO, petroleum hydrocarbon and conductivity) were statistically analyzed with principal component analysis and correlation analysis. The statistical results were incorporated into the integrated water quality assessment index; by combining this with fecal coliform count, attached algae diversity, B-IBI and the physical habitat quality score, a comprehensive integrated assessment system for river ecological health was established. Based on this systematic assessment, the assessed sites are categorized into 9 "healthy" and "sub-healthy" sites and 8 "sub-sick" and "sick" sites.
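The integrated scoring idea, combining normalized sub-indices into a single river-health class, can be sketched as a weighted aggregation. The weights and class thresholds below are illustrative assumptions, not the values used in the study:

```python
# Hedged sketch: combine normalized sub-indices (0-1, higher = healthier)
# into one score with illustrative weights, then map it to a health class.
def river_health(water_quality, biotic_integrity, habitat_quality,
                 weights=(0.4, 0.4, 0.2)):
    score = (weights[0] * water_quality +
             weights[1] * biotic_integrity +
             weights[2] * habitat_quality)
    for threshold, label in ((0.75, "healthy"), (0.5, "sub-healthy"),
                             (0.25, "sub-sick")):
        if score >= threshold:
            return round(score, 3), label
    return round(score, 3), "sick"

print(river_health(0.9, 0.8, 0.7))  # → (0.82, 'healthy')
```

A site-by-site application of such a rule is what yields the "healthy"/"sub-healthy"/"sub-sick"/"sick" categories reported in the abstract.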

  8. HTGR-Integrated Coal To Liquids Production Analysis

    International Nuclear Information System (INIS)

    Gandrik, Anastasia M.; Wood, Rick A.

    2010-01-01

    As part of the DOE's Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to 'shift' the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700 °C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: (1) 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility.
When compared to conventional CTL production, nuclear integration decreases coal consumption by 66

  9. HTGR-INTEGRATED COAL TO LIQUIDS PRODUCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Anastasia M Gandrik; Rick A Wood

    2010-10-01

    As part of the DOE’s Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to “shift” the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: • 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal

  10. Strategic Analysis of Technology Integration at Allstream

    OpenAIRE

    Brown, Jeff

    2011-01-01

    Innovation has been defined as the combination of invention and commercialization. Invention without commercialization is rarely, if ever, profitable. For the purposes of this paper the definition of innovation will be further expanded into the concept of technology integration. Successful technology integration not only includes new technology introduction, but also the operationalization of the new technology within each business unit of the enterprise. This paper conducts an analysis of Al...

  11. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    Science.gov (United States)

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

    Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide the concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components: independence, a sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected to the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation: Understanding the perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis to ensure rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself to

  12. Integrated analysis on static/dynamic aeroelasticity of curved panels based on a modified local piston theory

    Science.gov (United States)

    Yang, Zhichun; Zhou, Jian; Gu, Yingsong

    2014-10-01

    A flow-field-modified local piston theory, applied to the integrated analysis of the static/dynamic aeroelastic behavior of curved panels, is proposed in this paper. The local flow field parameters used in the modification are obtained by a CFD technique, which has the advantage of simulating the steady flow field accurately. This modified local piston theory for aerodynamic loading is applied to the analysis of static aeroelastic deformation and flutter stability of curved panels in hypersonic flow. In addition, comparisons are made between results obtained using the present method and the curvature-modified method. They show that when the curvature of the panel is relatively small, the static aeroelastic deformations and flutter stability boundaries obtained by the two methods differ little, while for panels with larger curvatures, the static aeroelastic deformation obtained by the present method is larger and the flutter stability boundary is smaller than those obtained by the curvature-modified method, and the discrepancy increases with panel curvature. Therefore, from the standpoint of hypersonic flight vehicle safety, the existing curvature-modified method is non-conservative compared to the proposed flow-field-modified method, and the proposed flow-field-modified local piston theory for curved panels enlarges the application range of piston theory.

  13. Multi-response optimization of surface integrity characteristics of EDM process using grey-fuzzy logic-based hybrid approach

    Directory of Open Access Journals (Sweden)

    Shailesh Dewangan

    2015-09-01

    Surface integrity remains one of the major areas of concern in electric discharge machining (EDM). In the current study, a grey-fuzzy logic-based hybrid optimization technique is utilized to determine the optimal settings of EDM process parameters with the aim of improving surface integrity after EDM of AISI P20 tool steel. The experiment is designed using response surface methodology (RSM), considering discharge current (Ip), pulse-on time (Ton), tool-work time (Tw) and tool-lift time (Tup) as process parameters. Surface integrity characteristics such as white layer thickness (WLT), surface crack density (SCD) and surface roughness (SR) are considered in the current research work. Grey relational analysis (GRA) combined with fuzzy logic is used to determine the grey fuzzy reasoning grade (GFRG). The optimal solution based on this analysis is found to be Ip = 1 A, Ton = 10 μs, Tw = 0.2 s, and Tup = 0.0 s. Analysis of variance (ANOVA) results clearly indicate that Ton is the most contributing parameter, followed by Ip, for the multiple performance characteristics of surface integrity.
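The grey relational analysis step can be sketched as follows: responses are normalized (smaller-the-better for WLT, SCD and SR), deviations from the ideal sequence are computed, and grey relational coefficients are averaged into a grade per experiment. The response values below are made up for illustration; the fuzzy-reasoning stage that turns coefficients into the GFRG is omitted:

```python
import numpy as np

# Illustrative response matrix: rows = experiments, cols = WLT, SCD, SR
# (all smaller-the-better); the values are invented for this sketch.
Y = np.array([[12.0, 0.021, 3.5],
              [15.0, 0.034, 4.1],
              [ 9.0, 0.018, 2.9],
              [11.0, 0.025, 3.2]])

# Smaller-the-better normalization to [0, 1]
norm = (Y.max(0) - Y) / (Y.max(0) - Y.min(0))

# Grey relational coefficients against the ideal sequence (all ones)
zeta = 0.5                    # distinguishing coefficient, a common default
delta = 1.0 - norm            # deviation from the ideal
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

grade = grc.mean(axis=1)      # grey relational grade per experiment
print(int(grade.argmax()))    # → 2  (third run is closest to the ideal)
```

Ranking experiments by this grade (or by the fuzzified GFRG, as in the paper) identifies the parameter setting that best balances all three surface integrity responses.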

  14. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the results of a scale economy analysis of integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new generation design solutions and building on the extensive experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect cycle reactor with some distinctive and characteristic features that greatly simplify the reactor and also contribute to a high level of safety: an integrated primary cooling system, self-pressurization, primary cooling by natural circulation and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors performed with the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive with the cheapest Argentinean electricity option at powers larger than 200 MWe. Due to the reactor pressure vessel construction limit, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation. With forced circulation, 300 MWe can be achieved. (author)

  15. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    Science.gov (United States)

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
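The NEMA integral uniformity figure compared throughout this study is conventionally computed as 100 × (max − min) / (max + min) over the pixels of the chosen field of view. A minimal sketch, omitting the smoothing and pixel-selection steps a full NEMA analysis requires, and using made-up flood-field counts:

```python
import numpy as np

def integral_uniformity(counts):
    """NEMA-style integral uniformity (%): 100 * (max - min) / (max + min),
    computed over the pixels of the chosen field of view."""
    c = np.asarray(counts, dtype=float)
    return 100.0 * (c.max() - c.min()) / (c.max() + c.min())

# Hypothetical smoothed flood-field counts for a UFOV region
ufov = np.array([[ 980, 1000, 1010],
                 [ 990, 1005,  995],
                 [1000, 1020,  985]])
print(round(integral_uniformity(ufov), 2))  # → 2.0
```

Running the same calculation over the central FOV subset of pixels yields the CFOV value, which is how the manufacturer and Web-based figures in the study are paired for comparison.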

  16. Integrative pathway knowledge bases as a tool for systems molecular medicine.

    Science.gov (United States)

    Liang, Mingyu

    2007-08-20

    There exists a sense of urgency to begin to generate a cohesive assembly of biomedical knowledge as the pace of knowledge accumulation accelerates. The urgency is in part driven by the emergence of systems molecular medicine that emphasizes the combination of systems analysis and molecular dissection in the future of medical practice and research. A potentially powerful approach is to build integrative pathway knowledge bases that link organ systems function with molecules.

  17. Integrating Mainframe Data Bases on a Microcomputer

    OpenAIRE

    Marciniak, Thomas A.

    1985-01-01

    Microcomputers support user-friendly software for interrogating their resident data bases. Many medical data bases currently consist of files on less accessible mainframe computers with more limited inquiry capabilities. We discuss the transferring and integrating of mainframe data into microcomputer data base systems in one medical environment.

  18. Respiromics – An integrative analysis linking mitochondrial bioenergetics to molecular signatures

    Directory of Open Access Journals (Sweden)

    Ellen Walheim

    2018-03-01

    Objective: Energy metabolism is challenged upon nutrient stress, eventually leading to a variety of metabolic diseases that represent a major global health burden. Methods: Here, we combine quantitative mitochondrial respirometry (Seahorse technology) and proteomics (LC-MS/MS-based total protein approach) to understand how molecular changes translate to changes in mitochondrial energy transduction during diet-induced obesity (DIO) in the liver. Results: The integrative analysis reveals that significantly increased palmitoyl-carnitine respiration is supported by an array of proteins enriching lipid metabolism pathways. Upstream of the respiratory chain, the increased capacity for ATP synthesis during DIO associates most strongly with mitochondrial uptake of pyruvate, which is routed towards carboxylation. At the respiratory chain, robust increases of complex I are uncovered by cumulative analysis of single subunit concentrations. Specifically, nuclear-encoded accessory subunits, but not mitochondrial-encoded or core units, appear to be permissive for enhanced lipid oxidation. Conclusion: Our integrative analysis, which we dubbed "respiromics", represents an effective tool to link molecular changes to functional mechanisms in liver energy metabolism and, more generally, can be applied to mitochondrial analysis in a variety of metabolic and mitochondrial disease models. Keywords: Mitochondria, Respirometry, Proteomics, Mitochondrial pyruvate carrier, Liver disease, Bioenergetics, Obesity, Diabetes

  19. Integrative analysis of many weighted co-expression networks using tensor computation.

    Directory of Open Access Journals (Sweden)

    Wenyuan Li

    2011-06-01

    The rapid accumulation of biological networks poses new challenges and calls for powerful integrative analysis tools. Most existing methods capable of simultaneously analyzing a large number of networks were primarily designed for unweighted networks, and cannot easily be extended to weighted networks. However, it is known that transforming weighted into unweighted networks by dichotomizing the edges of weighted networks with a threshold generally leads to information loss. We have developed a novel, tensor-based computational framework for mining recurrent heavy subgraphs in a large set of massive weighted networks. Specifically, we formulate the recurrent heavy subgraph identification problem as a heavy 3D subtensor discovery problem with sparse constraints. We describe an effective approach to solving this problem by designing a multi-stage, convex relaxation protocol, and a non-uniform edge sampling technique. We applied our method to 130 co-expression networks, and identified 11,394 recurrent heavy subgraphs, grouped into 2,810 families. We demonstrated that the identified subgraphs represent meaningful biological modules by validating against a large set of compiled biological knowledge bases. We also showed that the likelihood for a heavy subgraph to be meaningful increases significantly with its recurrence in multiple networks, highlighting the importance of the integrative approach to biological network analysis. Moreover, our approach based on weighted graphs detects many patterns that would be overlooked using unweighted graphs. In addition, we identified a large number of modules that occur predominantly under specific phenotypes. This analysis resulted in a genome-wide mapping of gene network modules onto the phenome. Finally, by comparing module activities across many datasets, we discovered high-order dynamic cooperativeness in protein complex networks and transcriptional regulatory networks.
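The basic data structure, a 3D tensor stacking the weighted adjacency matrices of many networks, and a crude recurrence score for a candidate gene set can be sketched as follows. The mean-edge-weight score is a simplified stand-in for the paper's sparse heavy-subtensor objective, and all values here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stack of co-expression networks: tensor[k] is the weighted adjacency
# matrix of network k over the same 10 genes (synthetic weights here).
n_nets, n_genes = 5, 10
tensor = rng.random((n_nets, n_genes, n_genes))

# Plant a "recurrent heavy subgraph": genes 0-2 heavily connected in all nets
tensor[:, :3, :3] = 0.95

def recurrence_score(gene_idx):
    # Mean edge weight of the gene set across ALL networks: a crude proxy
    # for how "heavy" and "recurrent" the corresponding subtensor is
    sub = tensor[np.ix_(range(n_nets), gene_idx, gene_idx)]
    return sub.mean()

print(recurrence_score([0, 1, 2]) > recurrence_score([3, 4, 5]))  # → True
```

The actual method searches for such gene sets via convex relaxation and non-uniform edge sampling rather than scoring fixed candidates, but the tensor representation is the same.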

  20. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  1. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  2. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed for meeting these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultra-violet, Fourier transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization (FID) configuration to provide validated screening analysis for total extractable hydrocarbons within ca. 5--10 min, as well as a full qualitative/quantitative analysis in 25--30 min. Data analysis using optional expert system and neural networks software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments

  3. Technology integrated teaching in Malaysian schools: GIS, a SWOT analysis

    Directory of Open Access Journals (Sweden)

    Habibah Lateh, vasugiammai muniandy

    2011-08-01

Full Text Available Geographical Information Systems (GIS) have been introduced and widely used in schools in various countries. From 1990 onwards, the implementation of GIS in schools has increased, owing to drastic changes and reforms in education systems. Although the name GIS suits the Geography subject well, it is widely integrated into various subjects such as History, Chemistry, Physics and Science. In Malaysia, GIS is common in fields such as risk management, architecture, town planning and municipal departments; however, it is still unknown in the school education system, and even upper secondary students are not familiar with it. The Ministry of Education in Malaysia has been continuously reforming education with the aim of creating a society based on economic fundamentals and knowledge. The Master Plan for Educational Development, with the aim of developing individual potential through well-integrated and balanced education, is already in the field. Recently, Malaysia invested 18% of the annual national budget in upgrading its education system. The computers-in-education program started in 1999: three hundred and twenty-two schools were chosen to ‘break away’ from conventional teaching methods towards technology-integrated teaching. Projects such as the New Primary School Curriculum (KBSR), the Integrated Secondary School Curriculum (KBSM), the Smart School Project and the School Access Centre were introduced steadily. Teachers, as the cogwheel of innovation in schools, were given courses aimed at developing their ICT knowledge and skills. To date, technology integration is uneven and dispersed across subjects. Geography is one of the ‘dry’, technology-poor subjects in schools and is not preferred by students. Geographical Information Systems (GIS) are foremost the best Geographical Information Technology (GIT) to be applied in the geography subject. In the Malaysian education system, GIS is still exposed just in papers

  4. Integrate life-cycle assessment and risk analysis results, not methods.

    Science.gov (United States)

    Linkov, Igor; Trump, Benjamin D; Wender, Ben A; Seager, Thomas P; Kennedy, Alan J; Keisler, Jeffrey M

    2017-08-04

    Two analytic perspectives on environmental assessment dominate environmental policy and decision-making: risk analysis (RA) and life-cycle assessment (LCA). RA focuses on management of a toxicological hazard in a specific exposure scenario, while LCA seeks a holistic estimation of impacts of thousands of substances across multiple media, including non-toxicological and non-chemically deleterious effects. While recommendations to integrate the two approaches have remained a consistent feature of environmental scholarship for at least 15 years, the current perception is that progress is slow largely because of practical obstacles, such as a lack of data, rather than insurmountable theoretical difficulties. Nonetheless, the emergence of nanotechnology presents a serious challenge to both perspectives. Because the pace of nanomaterial innovation far outstrips acquisition of environmentally relevant data, it is now clear that a further integration of RA and LCA based on dataset completion will remain futile. In fact, the two approaches are suited for different purposes and answer different questions. A more pragmatic approach to providing better guidance to decision-makers is to apply the two methods in parallel, integrating only after obtaining separate results.

  5. SAW-Based Phononic Crystal Microfluidic Sensor-Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications.

    Science.gov (United States)

    Oseev, Aleksandr; Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V; Hirsch, Soeren

    2017-09-23

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept.

  6. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

Process integration techniques, particularly the Pinch Analysis method, were applied to the sugarcane industry. Research was performed upon harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest, but economically achievable, targets for entropy increase. Efficiency in the use of energy was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while the raw material supply of the base case is kept; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce reasonable outcome gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without dropping cogenerated power. (author)
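Pinch Analysis utility targets of the kind discussed above are usually obtained with the problem-table (heat-cascade) algorithm. The sketch below uses invented stream data, not figures from the plant studied: stream temperatures are shifted by ±ΔTmin/2, interval heat balances are cascaded, and the minimum hot and cold utility targets are read off the cascade.

```python
# streams: (T_supply, T_target, CP); CP in kW/K; hot if T_supply > T_target
streams = [(250, 40, 0.15), (200, 80, 0.25),    # hot streams
           (20, 180, 0.20), (140, 230, 0.30)]   # cold streams
dTmin = 10.0

def utility_targets(streams, dTmin):
    """Problem-table algorithm: minimum hot/cold utility for given dTmin."""
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:   # hot stream: shift temperatures down by dTmin/2
            shifted.append((ts - dTmin / 2, tt - dTmin / 2, cp, 'hot'))
        else:         # cold stream: shift temperatures up by dTmin/2
            shifted.append((ts + dTmin / 2, tt + dTmin / 2, cp, 'cold'))
    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0   # heat surplus (+) or deficit (-) in this interval
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            overlap = max(0.0, min(hi, top) - max(lo, bot))
            net += cp * overlap if kind == 'hot' else -cp * overlap
        cascade.append(cascade[-1] + net)
    qh_min = max(0.0, -min(cascade))        # minimum hot utility, kW
    qc_min = cascade[-1] + qh_min           # minimum cold utility, kW
    return qh_min, qc_min

qh, qc = utility_targets(streams, dTmin)
print(round(qh, 2), round(qc, 2))
```

The pinch temperature falls at the interval boundary where the feasible cascade touches zero; the targets bound any design before equipment is considered.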

  7. Research on the Reliability Analysis of the Integrated Modular Avionics System Based on the AADL Error Model

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2018-01-01

Full Text Available In recent years, the integrated modular avionics (IMA) concept has been introduced to replace traditional federated avionics. Different avionics functions are hosted on a shared IMA platform, and IMA adopts partition technologies to provide logical isolation among different functions. The IMA architecture can provide more sophisticated and powerful avionics functionality; meanwhile, the failure propagation patterns in IMA are more complex. The feature of resource sharing introduces some unintended interconnections among different functions, which makes the failure propagation modes more complex. Therefore, this paper proposes an Architecture Analysis and Design Language (AADL)-based method to establish the reliability model of an IMA platform. Single software and hardware error behaviors in the IMA system are modeled. The corresponding AADL error models of failure propagation among components, and between software and hardware, are given. Finally, the display function of an IMA platform is taken as an example to illustrate the effectiveness of the proposed method.

  8. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulics simulation models can outperform a signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed in increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
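The contrast between the two detection philosophies can be shown in a few lines. In this hedged sketch (all concentrations and thresholds are fabricated for illustration, not from the study), a small coordinated shift at every monitoring station stays below each station's individual alarm threshold, yet the joint residual against the simulation model's expected values trips a multi-site alarm:

```python
import numpy as np

def single_site_alarms(obs, sim, thresh):
    """Signal-processing style: each station flags independently."""
    return np.abs(obs - sim) > thresh

def multi_site_alarm(obs, sim, thresh):
    """Model-based joint test: RMS residual over all stations at once."""
    residual = np.sqrt(np.mean((obs - sim) ** 2))
    return residual > thresh

sim = np.array([0.50, 0.48, 0.52, 0.51])   # model-predicted chlorine, mg/L
obs = np.array([0.56, 0.55, 0.57, 0.58])   # small shift at every station

print(single_site_alarms(obs, sim, thresh=0.08))  # no station alone fires
print(multi_site_alarm(obs, sim, thresh=0.05))    # joint test fires: True
```

The joint statistic pools weak evidence across stations, which is the mechanism behind the higher accuracy and shorter detection time claimed for the MSEDS.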

  9. Measurement-Based Investigation of Inter- and Intra-Area Effects of Wind Power Plant Integration

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Alicia J.; Singh, Mohit; Muljadi, Eduard; Santoso, Surya

    2016-12-01

This paper has a two-pronged objective: the first is to analyze the general effects of wind power plant (WPP) integration, and the resulting displacement of conventional power plant (CPP) inertia, on power system stability; the second is to demonstrate the efficacy of PMU data in power system stability analyses, specifically when knowledge of the network is incomplete. Traditionally, modal analysis applies small-signal stability analysis based on eigenvalues and assumes complete knowledge of the network and all of its components. The analysis presented here differs because it is a measurement-based investigation and employs simulated measurement data. Even if knowledge of the network were incomplete, this methodology would allow for monitoring and analysis of modes, which allows non-utility entities to study power system stability. To generate inter- and intra-area modes, Kundur's well-known two-area four-generator system is modeled in PSCAD/EMTDC. A doubly-fed induction generator based WPP model, based on the Western Electricity Coordinating Council (WECC) standard model, is included to analyze the effects of wind power on system modes. The two-area system and WPP are connected in various configurations with respect to WPP placement, CPP inertia and WPP penetration level. Analysis is performed on the data generated by the simulations. For each simulation run, a different configuration is chosen and a large disturbance is applied. The sampling frequency is set to resemble the sampling frequency at which data is available from phasor measurement units (PMUs). The power spectral density of these signals is estimated using the Yule-Walker algorithm. The resulting analysis shows that the presence of a WPP does not, of itself, lead to the introduction of new modes. The analysis also shows, however, that displacement of inertia may lead to the introduction of new modes. The effects of location of inertia displacement (i.e. the effects on
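A minimal version of the Yule-Walker spectral estimate used in such studies can be sketched as follows. The signal below is a synthetic 0.6 Hz "inter-area" oscillation sampled at a PMU-like 30 Hz, not data from the paper: AR coefficients are fitted from the autocorrelation sequence, and the AR power spectral density locates the dominant mode.

```python
import numpy as np

def yule_walker(x, order):
    """Fit AR(order) coefficients via the Yule-Walker equations."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])        # x[t] ≈ sum_k a[k] * x[t-k-1]
    sigma2 = r[0] - np.dot(a, r[1:])     # innovation (noise) variance
    return a, sigma2

def ar_psd(a, sigma2, freqs, fs):
    """AR power spectral density evaluated at freqs (Hz), sampling rate fs."""
    k = np.arange(1, len(a) + 1)
    den = 1 - np.exp(-2j * np.pi * np.outer(freqs / fs, k)) @ a
    return sigma2 / (fs * np.abs(den) ** 2)

# PMU-like test signal: 0.6 Hz inter-area oscillation sampled at 30 Hz
fs = 30.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 0.6 * t) + 0.1 * rng.standard_normal(t.size)

a, s2 = yule_walker(x, order=8)
f = np.linspace(0.05, 2.0, 400)
peak = f[np.argmax(ar_psd(a, s2, f, fs))]
print(f"dominant mode ≈ {peak:.2f} Hz")
```

Parametric AR estimation is preferred over a raw periodogram here because PMU records are short relative to the slow inter-area periods of interest.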

  10. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting, for their timely detection and assessment, and to develop methods for leveling, minimizing or possibly preventing them. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosing information about the risks of the business model and integrated reporting, and for leveling or minimizing them, the article carries out a terminological analysis of the essence of entrepreneurial and accounting risks. Entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. Accounting risk is suggested to be understood as the probability of unfavorable consequences resulting from organizational and methodological errors in the integrated accounting system, which presents a threat to the quality, accuracy and reliability of the reporting information on economic, social and environmental activities in integrated reporting, as well as a threat of inappropriate decision-making by stakeholders based on the integrated report.
For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, in the study the place of entrepreneurial and accounting risks in

  11. Integrative biological analysis for neuropsychopharmacology.

    Science.gov (United States)

    Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L

    2014-01-01

    Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches--proteomics, transcriptomics, metabolomics, and glycomics--have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.

  12. Photometric method for determination of acidity constants through integral spectra analysis.

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-15

An express method for the determination of acidity constants of organic acids, based on analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as a photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple, low-cost instrumentation. The optical part of the experimental setup has been optimized by excluding the monochromator; thus it takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. Application limitations and reliability of the method have been tested for a series of organic acids of various nature. Copyright © 2015 Elsevier B.V. All rights reserved.
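The pKa extraction described above amounts to locating the inflection of a sigmoidal signal-vs-pH curve. A hedged sketch, using a synthetic noise-free titration rather than the device's actual photocurrent data, fits the Henderson-Hasselbalch form by grid search over pKa, with the two limiting signal levels solved by linear least squares at each trial value:

```python
import numpy as np

def fit_pka(ph, signal):
    """Least-squares fit of a Henderson-Hasselbalch sigmoid.

    S(pH) = (S_HA + S_A * 10**(pH - pKa)) / (1 + 10**(pH - pKa))
    For each trial pKa the amplitudes S_HA, S_A enter linearly and are
    solved by linear least squares; the best pKa minimises the residual.
    """
    best = (None, np.inf)
    for pka in np.arange(2.0, 8.0, 0.002):
        frac = 10.0 ** (ph - pka) / (1 + 10.0 ** (ph - pka))  # deprotonated
        A = np.column_stack([1 - frac, frac])
        coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
        rss = np.sum((A @ coef - signal) ** 2)
        if rss < best[1]:
            best = (pka, rss)
    return best[0]

# synthetic titration of an acid with pKa 4.76 (acetic-acid-like)
ph = np.linspace(2.5, 7.5, 26)
signal = (0.20 + 0.90 * 10 ** (ph - 4.76)) / (1 + 10 ** (ph - 4.76))
pka = fit_pka(ph, signal)
print(f"pKa ≈ {pka:.2f}")   # recovers 4.76 on noise-free data
```

On real photocurrent data the same fit would simply absorb the instrument's limiting signal levels into the two linear amplitudes.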

  13. Metabolome Integrated Analysis of High-Temperature Response in Pinus radiata

    Directory of Open Access Journals (Sweden)

    Mónica Escandón

    2018-04-01

Full Text Available The integrative omics approach is crucial to identify the molecular mechanisms underlying high-temperature response in non-model species. Based on future scenarios of heat increase, Pinus radiata plants were exposed to a temperature of 40°C for a period of 5 days, with recovered plants (30 days after the last exposure to 40°C) included in the analysis. The analysis of the metabolome using complementary mass spectrometry techniques (GC-MS and LC-Orbitrap-MS) allowed the reliable quantification of 2,287 metabolites. The analysis of the identified metabolites and highlighted metabolic pathways across heat exposure time reveals the dynamism of the metabolome in relation to high-temperature response in P. radiata, identifying a turning point (on day 3) at which P. radiata plants changed from an initial stress-response program (shorter-term response) to an acclimation one (longer-term response). Furthermore, the integration of metabolome and physiological measurements, which cover the photosynthetic state through to the hormonal profile, suggests a complex metabolic pathway interaction network related to heat-stress response. Cytokinins (CKs), fatty acid metabolism and flavonoid and terpenoid biosynthesis were revealed as the most important pathways involved in heat-stress response in P. radiata, with zeatin riboside (ZR) and isopentenyl adenosine (iPA) as the key hormones coordinating these multiple and complex interactions. On the other hand, the integrative approach allowed elucidation of crucial metabolic mechanisms involved in heat response in P. radiata, as well as the identification of thermotolerance metabolic biomarkers (L-phenylalanine, hexadecanoic acid, and dihydromyricetin), crucial metabolites which can reschedule the metabolic strategy to adapt to high temperature.

  14. IMPLEMENTATION OF GIS-BASED MULTICRITERIA DECISION ANALYSIS WITH VB IN ArcGIS

    OpenAIRE

    DERYA OZTURK; FATMAGUL BATUK

    2011-01-01

This article focuses on the integration of multicriteria decision analysis (MCDA) and geographical information systems (GIS) and introduces a tool, GIS-MCDA, written in Visual Basic in ArcGIS for GIS-based MCDA. GIS-MCDA deals with raster-based data sets and includes standardization, weighting and decision analysis methods, and sensitivity analysis. Simple additive weighting, weighted product method, technique for order preference by similarity to ideal solution, compromise programming, a...
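Of the methods listed, simple additive weighting (SAW) is the easiest to sketch for raster data. The toy example below (criteria names, weights and cell values are invented for illustration) min-max standardizes each criterion layer, inverts cost criteria, and combines the layers with normalized weights:

```python
import numpy as np

def standardize(raster, benefit=True):
    """Min-max standardization of one criterion raster to [0, 1]."""
    lo, hi = raster.min(), raster.max()
    scaled = (raster - lo) / (hi - lo)
    return scaled if benefit else 1.0 - scaled   # invert cost criteria

def saw(rasters, weights, benefit):
    """Simple additive weighting: weighted sum of standardized criteria."""
    w = np.asarray(weights, float)
    w = w / w.sum()                              # weights sum to 1
    layers = [standardize(r, b) for r, b in zip(rasters, benefit)]
    return sum(wi * layer for wi, layer in zip(w, layers))

# 2x2 toy rasters: suitability scoring for a hypothetical site search
slope = np.array([[2.0, 8.0], [15.0, 30.0]])       # cost: steeper is worse
dist  = np.array([[500.0, 100.0], [50.0, 900.0]])  # cost: farther is worse
soil  = np.array([[0.9, 0.4], [0.7, 0.1]])         # benefit: higher is better

score = saw([slope, dist, soil], weights=[0.5, 0.3, 0.2],
            benefit=[False, False, True])
print(score.round(3))
```

Each cell's score is directly comparable across the study area, which is what makes the subsequent sensitivity analysis (perturbing the weights) meaningful.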

  15. Integrative Analysis of Gene Expression Data Including an Assessment of Pathway Enrichment for Predicting Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Pingzhao Hu

    2006-01-01

Full Text Available Background: Microarray technology has previously been used to identify genes that are differentially expressed between tumour and normal samples in a single study, as well as in syntheses involving multiple studies. When integrating results from several Affymetrix microarray datasets, previous studies summarized probeset-level data, which may potentially lead to a loss of information available at the probe level. In this paper, we present an approach for integrating results across studies while taking probe-level data into account. Additionally, we follow a new direction in the analysis of microarray expression data, namely focusing on the variation of expression phenotypes in predefined gene sets, such as pathways. This targeted approach can be helpful for revealing information that is not easily visible from the changes in the individual genes. Results: We used a recently developed method to integrate Affymetrix expression data across studies. The idea is based on a probe-level test statistic developed for testing for differentially expressed genes in individual studies. We incorporated this test statistic into a classic random-effects model for integrating data across studies. Subsequently, we used a gene set enrichment test to evaluate the significance of enriched biological pathways in the differentially expressed genes identified from the integrative analysis. We compared the statistical and biological significance of the prognostic gene expression signatures and pathways identified in the probe-level model (PLM) with those in the probeset-level model (PSLM). Our integrative analysis of Affymetrix microarray data from 110 prostate cancer samples obtained from three studies reveals thousands of genes significantly correlated with tumour cell differentiation. The bioinformatics analysis, mapping these genes to the publicly available KEGG database, reveals evidence that tumour cell differentiation is significantly associated with many
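The "classic random-effects model" mentioned above is commonly fitted with the DerSimonian-Laird estimator. The sketch below applies it to invented per-study effect sizes for a single gene; the paper's probe-level test statistic is not reproduced here, only the cross-study pooling step it feeds into:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study effect sizes (DL estimator)."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# hypothetical per-study log fold-changes for one gene, three datasets
effects = [0.80, 0.55, 1.10]
variances = [0.04, 0.06, 0.05]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled effect {pooled:.3f} ± {se:.3f}, tau^2 = {tau2:.3f}")
```

When tau^2 is estimated as zero the model collapses to a fixed-effect (inverse-variance) synthesis, so the same routine covers both cases.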

  16. iTRAQ-Based Proteomics Analysis and Network Integration for Kernel Tissue Development in Maize

    Science.gov (United States)

    Dong, Yongbin; Wang, Qilei; Du, Chunguang; Xiong, Wenwei; Li, Xinyu; Zhu, Sailan; Li, Yuling

    2017-01-01

Grain weight is one of the most important yield components and a developmentally complex structure comprised of two major compartments (endosperm and pericarp) in maize (Zea mays L.); however, very little is known concerning the coordinated accumulation of the numerous proteins involved. Herein, we used an isobaric tags for relative and absolute quantitation (iTRAQ)-based comparative proteomic method to analyze the dynamic proteomics of endosperm and pericarp during grain development. In total, 9,539 proteins were identified for both compartments at four developmental stages, among which 1,401 proteins were non-redundant, 232 proteins were specific to the pericarp and 153 proteins were specific to the endosperm. A functional annotation of the identified proteins revealed the importance of metabolic and cellular processes, and of binding and catalytic activities, for tissue development. Three and 76 proteins involved in 49 Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways were integrated for the specific endosperm and pericarp proteins, respectively, reflecting their complex metabolic interactions. In addition, four proteins with important functions and different expression levels were chosen for gene cloning and expression analysis. Different concordance between mRNA level and protein abundance was observed across different proteins, stages, and tissues, as in previous research. These results provide useful information for understanding the developmental mechanisms of grain development in maize. PMID:28837076

  17. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

Full Text Available Geographic Information Systems brought greater possibilities to the representation and interpretation of the landscape, as well as to integrated analysis. However, this approach does not dispense with technical and methodological substantiation when brought into the computational universe. This work is grounded in ecodynamics and in the empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of multi-criteria analysis and a fuzzy-logic model of environmental fragility, taking as a case study the basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed a reduction in the subjectivism influencing decision criteria, whose factors can be given cartographic expression, respecting the complexity of the integrated landscape.
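The fuzzy-logic side of such a model can be sketched with fuzzy membership functions and the fuzzy gamma operator, a standard compromise between fuzzy AND and fuzzy OR. All rasters, membership breakpoints and the gamma value below are invented for illustration, not taken from the Monjolinho study:

```python
import numpy as np

def fuzzy_linear(raster, lo, hi):
    """Linear fuzzy membership: 0 at or below lo, 1 at or above hi."""
    return np.clip((raster - lo) / (hi - lo), 0.0, 1.0)

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma operator: compromise between fuzzy AND and fuzzy OR."""
    m = np.stack(memberships)
    f_and = np.prod(m, axis=0)                 # fuzzy algebraic product
    f_or = 1.0 - np.prod(1.0 - m, axis=0)      # fuzzy algebraic sum
    return f_or ** gamma * f_and ** (1.0 - gamma)

# 2x2 toy criterion rasters for an environmental-fragility overlay
slope = np.array([[5.0, 35.0], [20.0, 45.0]])       # percent slope
rain  = np.array([[900.0, 1600.0], [1200.0, 2000.0]])  # mm/yr (erosivity proxy)
bare  = np.array([[0.1, 0.8], [0.4, 0.95]])         # fraction of bare soil

fragility = fuzzy_gamma([
    fuzzy_linear(slope, 0.0, 40.0),
    fuzzy_linear(rain, 800.0, 1800.0),
    fuzzy_linear(bare, 0.0, 1.0),
], gamma=0.9)
print(fragility.round(2))
```

Unlike a crisp overlay, no hard class boundaries are imposed, which is precisely how the fuzzy formulation reduces the subjectivism of threshold choices.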

  18. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Full Text Available Abstract Background Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. 
Other features include dynamic visualization of
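The enrichment analysis such an annotation tool performs is typically a hypergeometric upper-tail test: given a submitted gene list, how surprising is the number of its members annotated with a particular functional term? A minimal sketch (all gene counts below are invented for illustration):

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """Hypergeometric upper-tail P(X >= k).

    N: genes in the genome, K: genes annotated with the term,
    n: size of the submitted gene list, k: list genes carrying the term.
    Under random sampling without replacement, small p-values indicate
    the term is over-represented (enriched) in the list.
    """
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# toy numbers: genome of 10000, term covers 40 genes, list of 200 hits 8
p = enrichment_pvalue(N=10000, K=40, n=200, k=8)
print(f"P(X >= 8) = {p:.2e}")
```

Real tools additionally correct these p-values for the many terms tested (e.g. Benjamini-Hochberg), which this sketch omits.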

  19. A middleware-based platform for the integration of bioinformatic services

    Directory of Open Access Journals (Sweden)

    Guzmán Llambías

    2015-08-01

Full Text Available Performing bioinformatics experiments involves intensive access to distributed services and information resources through the Internet. Although existing tools facilitate the implementation of workflow-oriented applications, they lack the capabilities to integrate services beyond low-scale applications, particularly services with heterogeneous interaction patterns and at larger scale. This is particularly required to enable large-scale distributed processing of the biological data generated by massive sequencing technologies. On the other hand, such integration mechanisms are provided by middleware products like Enterprise Service Buses (ESBs), which enable distributed systems to be integrated following a Service-Oriented Architecture. This paper proposes an integration platform, based on enterprise middleware, to integrate bioinformatics services. It presents a multi-level reference architecture and focuses on ESB-based mechanisms to provide asynchronous communications, event-based interactions and data transformation capabilities. The paper presents a formal specification of the platform using the Event-B model.

  20. Development of internet-based cooperative system for integrity evaluation of reactor pressure vessel

    International Nuclear Information System (INIS)

    Kim, Jong Choon; Choi, Jae Boong; Kim, Young Jin; Choi, Young Hwan

    2004-01-01

Since the early 1950s fracture mechanics has had a significant impact on structural integrity assessment in a wide range of industries such as power, transportation, civil and petrochemical industries, especially the nuclear power plant industry. For the last two decades, significant efforts have been devoted to developing defect assessment procedures, from which various fitness-for-purpose or fitness-for-service codes have been developed. From another aspect, recent advances in IT (Information Technology) are bringing rapid changes to various engineering fields. IT enables people to share information through networks and thus provides a concurrent working environment without the limitations of working place. For this reason, network systems based on the internet or intranets have appeared in various fields of business. Evaluating the integrity of structures is one of the most critical issues in the nuclear industry. In order to evaluate the integrity of structures, a complicated and collaborative procedure is required, including regular in-service inspection, fracture mechanics analysis, etc., and thus experts in different fields have to cooperate to resolve integrity problems. In this paper, an internet-based cooperative system for integrity evaluation, which adapts IT to a structural integrity evaluation procedure for the reactor pressure vessel, is introduced. The proposed system uses Virtual Reality (VR) techniques, Virtual Network Computing (VNC) and agent programs. The system is able to support a 3-dimensional virtual reality environment and to allow experts to cooperate by accessing related data through the internet

  1. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and the performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis required the application of suites of separate computer codes, each of which treated only a narrow subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite

  2. Use of knowledge based systems for rational reliability analysis based inspection and maintenance planning for offshore structures

    International Nuclear Information System (INIS)

    Tang, M.X.; Dharmavasan, S.; Peers, S.M.C.

    1994-01-01

    The structural integrity of fixed offshore platforms is ensured by periodic inspections. In the past, decisions as to when, where and how to inspect have been made by engineers using rules of thumb and general planning heuristics. It is now hoped that more rational inspection and maintenance scheduling can be carried out by applying recently developed techniques based on structural reliability methods. However, one of the problems with a purely theoretical approach is that it is not always possible to incorporate all the constraints present in a practical situation. These constraints modify the decisions made for analysis data input and the interpretation of the analysis results. Knowledge based systems provide a means of encapsulating several different forms of information and knowledge within a computer system and hence can overcome this problem. In this paper, a prototype system being developed for integrating reliability based analysis with other constraints for inspection scheduling is described. In addition, the scheduling model and the algorithms used to carry out the scheduling are explained, and implementation details are given.

  3. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    Science.gov (United States)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional general purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from several example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
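The loose fluid-thermal coupling described above can be illustrated with a minimal sketch (not the actual GFSSP/MSC.SINDA interface; all parameters are invented): a fluid node and a wall node exchange a convective heat flux each time step, mimicking the cryogenic chilldown example.

```python
# Minimal sketch of loosely coupled thermo-fluid (conjugate) analysis:
# a "fluid" node and a "solid" wall node exchange convective heat each
# time step, mimicking how a fluid solver and a thermal solver can be
# integrated by passing boundary conditions back and forth.
def conjugate_chilldown(t_wall=300.0, t_fluid=90.0, h_a=50.0,
                        m_cp_wall=500.0, dt=0.1, steps=2000):
    history = [t_wall]
    for _ in range(steps):
        q = h_a * (t_wall - t_fluid)   # fluid side: convective heat pickup (W)
        t_wall -= q * dt / m_cp_wall   # solid side: wall cools down (K)
        history.append(t_wall)
    return history

hist = conjugate_chilldown()
print(round(hist[-1], 1))   # wall temperature approaches the fluid temperature
```

In a real conjugate analysis each side would be a full network solve rather than a single node, but the data exchange per time step follows the same pattern.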

  4. A fundamental numerical analysis for noninvasive thermometry integrated in a heating applicator based on the reentrant cavity

    International Nuclear Information System (INIS)

    Ohwada, Hiroshi; Ishihara, Yasutoshi

    2010-01-01

    To improve the efficacy of hyperthermia treatment, a novel method for the noninvasive measurement of body temperature change is proposed. The proposed thermometry is based on the changes in the electromagnetic field distribution inside the heating applicator that accompany temperature changes, via the temperature dependence of the dielectric constant. An image of the temperature change distribution inside the body is reconstructed by applying a computed tomography (CT) algorithm. The proposed thermometry method can thus serve as a noninvasive means of monitoring the temperature change distribution inside the body without the use of large-scale apparatus such as magnetic resonance imaging (MRI). Furthermore, this temperature monitoring method can easily be combined with a heating applicator based on a cavity resonator, and the resulting integrated treatment system could be used to treat cancer effectively while noninvasively monitoring the heating effect. In this paper, the phase change distributions of the electromagnetic field with temperature changes are simulated by numerical analysis using the finite difference time domain (FDTD) method. To estimate the phase change distributions inside a target body, the phase change distributions with temperature changes are reconstructed by filtered back-projection, and the reconstruction accuracy of the temperature change distribution converted from the phase change is evaluated. (author)

  5. Bayesian Integrated Data Analysis of Fast-Ion Measurements by Velocity-Space Tomography

    DEFF Research Database (Denmark)

    Salewski, M.; Nocente, M.; Jacobsen, A.S.

    2018-01-01

    Bayesian integrated data analysis combines measurements from different diagnostics to jointly measure plasma parameters of interest such as temperatures, densities, and drift velocities. Integrated data analysis of fast-ion measurements has long been hampered by the complexity of the strongly non...... framework. The implementation for different types of diagnostics as well as the uncertainties are discussed, and we highlight the importance of integrated data analysis of all available detectors....
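The joint-inference idea can be sketched with a toy linear-Gaussian model (purely illustrative; real fast-ion forward models and priors are far more complex): two hypothetical diagnostics observe the same unknown 2-vector through linear forward models, and their data are combined into one posterior mean.

```python
# Hedged sketch of Bayesian integrated data analysis: several "diagnostics"
# observe the same unknown parameters x (here a 2-vector) through linear
# forward models with Gaussian noise; the joint posterior mean combines
# all datasets with a weak Gaussian prior. All numbers are illustrative.
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0]*a[1][1] - a[0][1]*a[1][0]
    return [(b[0]*a[1][1] - a[0][1]*b[1]) / det,
            (a[0][0]*b[1] - b[0]*a[1][0]) / det]

def joint_posterior_mean(forward_models, data, noise_vars, prior_var=1e6):
    # Accumulate A^T A / sigma^2 and A^T y / sigma^2 over all diagnostics.
    ata = [[1.0/prior_var, 0.0], [0.0, 1.0/prior_var]]
    aty = [0.0, 0.0]
    for A, y, s2 in zip(forward_models, data, noise_vars):
        for row, yi in zip(A, y):
            for i in range(2):
                aty[i] += row[i]*yi/s2
                for j in range(2):
                    ata[i][j] += row[i]*row[j]/s2
    return solve2(ata, aty)

# Diagnostic 1 sees x[0]+x[1]; diagnostic 2 sees x[0]-x[1] (true x = [3, 1]).
mean = joint_posterior_mean(
    forward_models=[[[1.0, 1.0]], [[1.0, -1.0]]],
    data=[[4.0], [2.0]],
    noise_vars=[0.01, 0.01])
print([round(m, 2) for m in mean])   # -> [3.0, 1.0]
```

Neither diagnostic alone determines both components; only the joint posterior does, which is the essential point of integrated data analysis.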

  6. Integrating Problem-Based Learning and Simulation: Effects on Student Motivation and Life Skills.

    Science.gov (United States)

    Roh, Young Sook; Kim, Sang Suk

    2015-07-01

    Previous research has suggested that a teaching strategy integrating problem-based learning and simulation may be superior to traditional lecture. The purpose of this study was to assess learner motivation and life skills before and after taking a course involving problem-based learning and simulation. The design used repeated measures with a convenience sample of 83 second-year nursing students who completed the integrated course. Data from a self-administered questionnaire measuring learner motivation and life skills were collected at pretest, post-problem-based learning, and post-simulation time points. Repeated-measures analysis of variance determined that the mean scores for total learner motivation (F=6.62, P=.003), communication (F=8.27), and learning (F=4.45, P=.016) differed significantly between time points. Post hoc tests using the Bonferroni correction revealed that total learner motivation and total life skills increased significantly both from pretest to post-simulation and from the post-problem-based learning test to the post-simulation test. Subscales of learner motivation and life skills (intrinsic goal orientation, self-efficacy for learning and performance, problem-solving skills, and self-directed learning skills) increased significantly both from pretest to post-simulation and from the post-problem-based learning test to the post-simulation test. The results demonstrate that an integrated problem-based learning and simulation course elicits significant improvement in learner motivation and life skills. Simulation plus problem-based learning is more effective than problem-based learning alone at increasing intrinsic goal orientation, task value, self-efficacy for learning and performance, problem solving, and self-directed learning.
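The multiple-comparison logic above can be sketched as follows (the scores are invented; a real analysis would also compute t-distribution p-values, which is omitted here to keep the sketch dependency-free):

```python
# Illustrative sketch: a paired t statistic for pretest vs post-simulation
# scores, with a Bonferroni-adjusted alpha for the three pairwise
# comparisons among pretest, post-PBL and post-simulation time points.
import math
import statistics

def paired_t(before, after):
    diffs = [a - b for a, b in zip(after, before)]
    mean = statistics.mean(diffs)
    sd = statistics.stdev(diffs)           # sample standard deviation
    return mean / (sd / math.sqrt(len(diffs)))

pre  = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]      # made-up questionnaire means
post = [3.6, 3.3, 3.9, 3.4, 3.5, 3.7]
t = paired_t(pre, post)
alpha_adj = 0.05 / 3   # Bonferroni: divide alpha by the number of comparisons
print(round(t, 2), round(alpha_adj, 4))
```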

  7. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  8. Beyond vertical integration--Community based medical education.

    Science.gov (United States)

    Kennedy, Emma Margaret

    2006-11-01

    The term 'vertical integration' is used broadly in medical education, sometimes when discussing community based medical education (CBME). This article examines the relevance of the term 'vertical integration' and provides an alternative perspective on the complexities of facilitating the CBME process. The principles of learner centredness, patient centredness and flexibility are fundamental to learning in the diverse contexts of 'community'. Vertical integration as a structural concept is helpful for academic organisations but has less application to education in the community setting; a different approach illuminates the strengths and challenges of CBME that need consideration by these organisations.

  9. Integrated phononic crystal resonators based on adiabatically-terminated phononic crystal waveguides

    Directory of Open Access Journals (Sweden)

    Razi Dehghannasiri

    2016-12-01

    In this letter, we demonstrate a new design for integrated phononic crystal (PnC) resonators based on confining acoustic waves in a heterogeneous waveguide-based PnC structure. In this architecture, a PnC waveguide that supports a single mode at the desired resonance frequencies is terminated by two waveguide sections with no propagating mode at those frequencies (i.e., sections that have a mode gap). The proposed PnC resonators are designed by combining spatial-domain and spatial-frequency-domain (i.e., k-domain) analysis to achieve a smooth mode envelope. This design approach can benefit both membrane-based and surface-acoustic-wave-based architectures by confining the mode spreading in the k-domain, which leads to improved electromechanical excitation/detection coupling and reduced loss through propagating bulk modes.

  10. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during...... software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses...... a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  11. NEW CORPORATE REPORTING TRENDS. ANALYSIS ON THE EVOLUTION OF INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Dragu Ioana

    2013-07-01

    The objective of this paper is to present the new corporate reporting trends of the 21st century. Integrated reporting has been launched through a common initiative of the International Integrated Reporting Committee (IIRC) and global accounting organizations. However, the history of integrated reports starts before the initiative of the IIRC, going back to when large corporations began to disclose sustainability and corporate social responsibility (CSR) information. We claim that the initial sustainability and CSR reports, issued separately alongside the annual financial report, represent the predecessors of current integrated reports. The paper consists of a literature review analysis of the evolution of integrated reporting, from the first stage of international non-financial initiatives up to the current state of a single integrated annual report. In order to understand the background of integrated reporting, we analyze the most relevant research papers on corporate reporting, focusing on the international organizations' perspective on non-financial reporting in general, and integrated reporting in particular. Based on the literature overview, we extracted the essential information for setting out the framework of the integrated reporting evolution. The findings suggest that we can delineate three main stages in the evolution of integrated reports, namely: the non-financial reporting initiatives, the sustainability era, and the revolution of integrated reporting. We illustrate these results by presenting each relevant point in the history of integrated reporting on a time scale axis, developed with the purpose of defining the road to integrated reporting at the theoretical, empirical, and practical levels. We consider the current investigation relevant for future studies concerning integrated reports, as this is a new area of research still in its infancy. The originality of the research derives from the novelty of

  12. Power Loss Analysis for Wind Power Grid Integration Based on Weibull Distribution

    Directory of Open Access Journals (Sweden)

    Ahmed Al Ameri

    2017-04-01

    The growth of electrical demand increases the need for renewable energy sources, such as wind energy, to meet that need. Electrical power losses are an important factor when wind farm location and size are selected. The capitalized cost of constant power losses over the life of a wind farm can rise to high levels. During the operation period, a method to determine whether the losses meet the design requirements is therefore much needed. This article presents a Simulink simulation of wind farm integration into the grid; the aim is to achieve a better understanding of the impact of wind variation on grid losses. The real power losses are expressed as a function of the annual wind variation, considering a Weibull distribution. An analytical method has been used to select the size and placement of a wind farm, taking active power loss reduction into account. It proposes a fast linear model estimation to find the optimal capacity of a wind farm based on DC power flow and graph theory. The results show that the analytical approach is capable of predicting the optimal size and location of wind turbines. Furthermore, it reveals that the annual variation of wind speed can have a strong effect on real power loss calculations. In addition to helping to improve utility efficiency, the proposed method can inform specific designs to speed up the integration of wind farms into grids.
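The role of the Weibull assumption can be illustrated with a small numerical-integration sketch (the shape and scale parameters and the saturating loss model are assumptions for illustration, not values from the article):

```python
# Sketch: annual average of a wind-speed-dependent quantity under a
# Weibull distribution, by numerical integration. Shape k, scale c and
# the toy loss model are illustrative assumptions.
import math

def weibull_pdf(v, k=2.0, c=8.0):
    return (k/c) * (v/c)**(k-1) * math.exp(-(v/c)**k)

def expected(fn, k=2.0, c=8.0, v_max=40.0, n=4000):
    """Riemann-sum approximation of E[fn(V)] for V ~ Weibull(k, c)."""
    dv = v_max / n
    return sum(fn(v) * weibull_pdf(v, k, c) * dv
               for v in (i*dv for i in range(1, n + 1)))

# Toy loss model: losses grow with injected power (~v^3), saturating at
# an assumed rated wind speed of 12 m/s.
loss = lambda v: min(v, 12.0)**3 / 12.0**3

print(round(expected(loss), 3))            # annual-average loss fraction
print(round(expected(lambda v: 1.0), 3))   # sanity check: pdf integrates to ~1
```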

  13. NuSEE: an integrated environment of software specification and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Jun Beom; Cha, Sung Deok; Youn, Cheong; Han, Hyun Chul

    2006-01-01

    As the use of digital systems becomes more prevalent, adequate techniques for software specification and analysis have become increasingly important in Nuclear Power Plant (NPP) safety-critical systems. Additionally, the importance of software Verification and Validation (V and V) based on adequate specification has received greater emphasis in view of improving software quality. For thorough V and V of safety-critical systems, V and V should be performed throughout the software lifecycle. However, systematic V and V is difficult as it involves many manual tasks, so tool support is needed to perform software V and V more conveniently. In response, we developed four kinds of Computer Aided Software Engineering (CASE) tools to support system specification for formal analysis according to the software lifecycle, and achieved an optimized integration of each tool. The toolset, NuSEE, is an integrated environment for software specification and V and V for PLC based safety-critical systems. In accordance with the software lifecycle, NuSEE consists of NuSISRT for the concept phase, NuSRS for the requirements phase, NuSDS for the design phase and NuSCM for configuration management. It is believed that, after further development, our integrated environment will be a unique and promising software specification and analysis toolset supporting the entire software lifecycle for the development of PLC based NPP safety-critical systems.

  14. Variation in the Interpretation of Scientific Integrity in Community-based Participatory Health Research

    Science.gov (United States)

    Kraemer Diaz, Anne E.; Spears Johnson, Chaya R.; Arcury, Thomas A.

    2013-01-01

    Community-based participatory research (CBPR) has become essential in health disparities and environmental justice research; however, the scientific integrity of CBPR projects has become a concern. Some issues, such as inadequate research training and lack of access to resources and finances, have been discussed as possibly limiting the scientific integrity of a project. Before understanding what threatens scientific integrity in CBPR, it is vital to understand what scientific integrity means for the professional and community investigators who are involved in CBPR. This analysis explores the interpretation of scientific integrity in CBPR among 74 professional and community research team members from 25 CBPR projects in nine states in the southeastern United States in 2012. It describes the basic definition of scientific integrity and then explores variations in its interpretation in CBPR. Variations in the interpretations were associated with team member identity as professional or community investigators. Professional investigators understood scientific integrity in CBPR as either conceptually or logistically flexible, as challenging to balance with community needs, or as no different than traditional scientific integrity. Community investigators viewed other factors as important to scientific integrity, such as trust, accountability, and overall benefit to the community. This research demonstrates that the variations in the interpretation of scientific integrity in CBPR call for a new definition of scientific integrity in CBPR that takes into account the understanding and needs of all investigators. PMID:24161098

  15. Integration of Evidence Base into a Probabilistic Risk Assessment

    Science.gov (United States)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, the IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models and Bayesian analyses are used, with subject matter experts' input on occasion. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1-5, with level one being the highest. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands the capabilities of the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for the IMM. The IMM Database's structure and architecture has proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list.
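The 1-5 evidence grading could be represented as simply as the following sketch (the category names are hypothetical placeholders, not the IMM's actual taxonomy):

```python
# Hypothetical sketch of the 1-5 evidence grading described above.
# Source-category names and their exact ordering are illustrative
# assumptions, not the IMM's documented scheme.
EVIDENCE_LEVELS = {
    "inflight data": 1,          # most desirable input
    "cohort study": 2,
    "other model": 3,
    "bayesian analysis": 4,
    "subject matter expert": 5,  # used on occasion
}

def best_evidence(sources):
    """Return the highest-quality (lowest-numbered) available source."""
    return min(sources, key=EVIDENCE_LEVELS.__getitem__)

print(best_evidence(["subject matter expert", "cohort study"]))  # -> cohort study
```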

  16. Integrating enzyme fermentation in lignocellulosic ethanol production: life-cycle assessment and techno-economic analysis.

    Science.gov (United States)

    Olofsson, Johanna; Barta, Zsolt; Börjesson, Pål; Wallberg, Ola

    2017-01-01

    Cellulase enzymes have been reported to contribute a significant share of the total costs and greenhouse gas emissions of lignocellulosic ethanol production today. A potential future alternative to purchasing enzymes from an off-site manufacturer is to integrate enzyme and ethanol production, using microorganisms and part of the lignocellulosic material as feedstock for the enzymes. This study modelled two such integrated process designs for ethanol from spruce logging residues, and compared them to an off-site case based on existing data regarding purchased enzymes. Greenhouse gas emissions and primary energy balances were studied in a life-cycle assessment, and cost performance in a techno-economic analysis. The base case scenario suggests that greenhouse gas emissions per MJ of ethanol could be significantly lower in the integrated cases than in the off-site case. However, the difference between the integrated and off-site cases is reduced under alternative assumptions regarding enzyme dosage and the environmental impact of the purchased enzymes. The comparison of primary energy balances did not show any significant difference between the cases. The minimum ethanol selling price, required to reach break-even costs, was 0.568 to 0.622 EUR L⁻¹ for the integrated cases, compared to 0.581 EUR L⁻¹ for the off-site case. An integrated process design could thus reduce greenhouse gas emissions from lignocellulose-based ethanol production, and the cost of an integrated process could be comparable to purchasing enzymes produced off-site. This study focused on the environmental and economic assessment of an integrated process; to strengthen the comparison with the off-site case, more detailed and updated data regarding industrial off-site enzyme production are especially important.
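The break-even logic behind a minimum selling price can be sketched with a simple annualized-cost calculation (the interest rate, lifetime and all cash flows below are invented placeholders, not the paper's techno-economic data):

```python
# Hedged sketch of a break-even minimum ethanol selling price (MESP):
# annualized capital cost plus operating cost, divided by annual output.
# All numbers are illustrative, not the study's actual figures.
def capital_recovery_factor(rate, years):
    """Annuity factor that spreads an up-front cost over the plant lifetime."""
    return rate * (1 + rate)**years / ((1 + rate)**years - 1)

def mesp(capex_eur, opex_eur_per_year, litres_per_year,
         rate=0.08, lifetime=20):
    annual_capex = capex_eur * capital_recovery_factor(rate, lifetime)
    return (annual_capex + opex_eur_per_year) / litres_per_year

print(round(mesp(200e6, 45e6, 110e6), 3))   # EUR per litre
```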

  17. Transient analysis of electromagnetic wave interactions on high-contrast scatterers using volume electric field integral equation

    KAUST Repository

    Sayed, Sadeed Bin; Ulku, Huseyin Arda; Bagci, Hakan

    2014-01-01

    A marching on-in-time (MOT)-based time domain volume electric field integral equation (TD-VEFIE) solver is proposed for accurate and stable analysis of electromagnetic wave interactions on high-contrast scatterers. The stability is achieved using

  18. Building-integrated renewable energy policy analysis in China

    Institute of Scientific and Technical Information of China (English)

    姚春妮; 郝斌

    2009-01-01

    With the dramatic development of renewable energy all over the world, and with the aim of adjusting the energy structure, the Ministry of Construction of China plans to promote the large-scale application of renewable energy in buildings. In order to ensure the validity of policy-making, this work first applies a cost-benefit analysis to three kinds of technologies: building-integrated solar hot water (BISHW) systems, building-integrated photovoltaic (BIPV) technology and ground water heat pumps (GWHP). By selecting a representative city for each climate region, the analysis arrives at different results for the different climate regions of China and, correspondingly, different suggestions for policy-making. On the basis of this analysis, the Ministry of Construction (MOC) and the Ministry of Finance of China (MOF) jointly launched the Building-integrated Renewable Energy Demonstration Projects (BIREDP) in 2006. In the demonstration projects, renewable energy takes the place of traditional energy to supply domestic hot water, electricity, air-conditioning and heating. Through carrying out the demonstration projects, the renewable energy market has been expanded, and more and more companies and local governments are taking the opportunity to promote the large-scale application of renewable energy in buildings.
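The kind of cost-benefit comparison described can be sketched as a simple payback calculation (all costs and savings below are invented placeholders; the study's actual data and method are not shown here):

```python
# Simple payback sketch for comparing the three building-integrated
# technologies mentioned above. Every cost and saving is an invented
# placeholder, and a real cost-benefit analysis would also discount
# future cash flows.
def simple_payback(extra_capital_cost, annual_energy_savings):
    """Years until cumulative savings cover the added investment."""
    return extra_capital_cost / annual_energy_savings

technologies = {
    "BISHW": simple_payback(3000.0, 600.0),
    "BIPV":  simple_payback(20000.0, 1600.0),
    "GWHP":  simple_payback(8000.0, 1000.0),
}
for name, years in sorted(technologies.items(), key=lambda kv: kv[1]):
    print(name, round(years, 1))   # shortest payback first
```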

  19. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    Science.gov (United States)

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition, we integrated COPD-specific expression and co-morbidity networks connecting over 6,000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of the systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide HTML- and web-browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models, while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD that combines genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews of clinical trends and connections, while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.
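At its core, the disease-specific sub-network generation could work like a bounded breadth-first expansion from seed genes. The following toy sketch assumes a plain adjacency-list graph with placeholder gene names (not actual COPDKB content or its query API):

```python
# Sketch of the "network search" idea: extract a sub-network around seed
# genes from a larger interaction graph by expanding a bounded number of
# hops. Graph and gene names are placeholders.
from collections import deque

GRAPH = {
    "GENE_A": ["GENE_B", "GENE_C"],
    "GENE_B": ["GENE_A", "GENE_D"],
    "GENE_C": ["GENE_A"],
    "GENE_D": ["GENE_B", "GENE_E"],
    "GENE_E": ["GENE_D"],
}

def subnetwork(graph, seeds, max_depth=1):
    """Breadth-first expansion from the seed nodes, up to max_depth hops."""
    seen = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for nbr in graph.get(node, []):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    return sorted(seen)

print(subnetwork(GRAPH, ["GENE_A"], max_depth=1))  # -> ['GENE_A', 'GENE_B', 'GENE_C']
```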

  20. Glass-based integrated optical splitters: engineering oriented research

    Science.gov (United States)

    Hao, Yinlei; Zheng, Weiwei; Yang, Jianyi; Jiang, Xiaoqing; Wang, Minghua

    2010-10-01

    The optical splitter is one of the most heavily demanded devices in the implementation of Fiber To The Home (FTTH) systems. Due to their compatibility with optical fibers, low propagation loss, flexibility and, most distinctively, potential cost-effectiveness, glass-based integrated optical splitters made by ion-exchange technology promise to be very attractive for application in optical communication networks. Aiming at integrated optical splitters for optical communication networks, a glass ion-exchange waveguide process has been developed which includes two steps: thermal salt ion-exchange and field-assisted ion diffusion. With this process, high performance optical splitters were fabricated in a specially melted glass substrate. The main performance parameters of these splitters, including maximum insertion loss (IL), polarization dependent loss (PDL), and IL uniformity, are all in accordance with the corresponding specifications in the generic requirements for optical branching components (GR-1209-CORE). In this paper, the manufacturing of glass-based integrated optical splitters is demonstrated, after which engineering-oriented research results on these splitters are presented.

  1. The effectiveness of science domain-based science learning integrated with local potency

    Science.gov (United States)

    Kurniawati, Arifah Putri; Prasetyo, Zuhdan Kun; Wilujeng, Insih; Suryadarma, I. Gusti Putu

    2017-08-01

    This research aimed to determine the significant effect of science domain-based science learning integrated with local potency on science process skills. The research method used was a quasi-experimental design with a nonequivalent control group design. The population of this research was all students of class VII at SMP Negeri 1 Muntilan. The sample was selected through cluster random sampling, namely class VII B as the experimental class (24 students) and class VII C as the control class (24 students). This research used a test instrument adapted from Agus Dwianto's research. The aspects of science process skills in this research were observation, classification, interpretation and communication. The data analysis used one-factor ANOVA at the 0.05 significance level and the normalized gain score. The significance level for science process skills with one-factor ANOVA is 0.000, which is less than alpha (0.05). This means that there was a significant effect of science domain-based science learning integrated with local potency on science process skills. The analysis also shows that the normalized gain scores are 0.29 (low category) in the control class and 0.67 (medium category) in the experimental class.
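Normalized gain scores like those quoted above are typically computed with Hake's formula g = (post − pre) / (max − pre), with category cutoffs at 0.3 and 0.7. A quick sketch with illustrative pre/post mean scores:

```python
# Hake's normalized gain and its conventional low/medium/high categories.
# The pre/post scores below are illustrative, not the study's data.
def normalized_gain(pre, post, max_score=100.0):
    return (post - pre) / (max_score - pre)

def gain_category(g):
    if g < 0.3:
        return "low"
    if g < 0.7:
        return "medium"
    return "high"

g = normalized_gain(40.0, 80.0)   # hypothetical class mean scores
print(round(g, 2), gain_category(g))   # -> 0.67 medium
```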

  2. Nonlinear Coupling Characteristics Analysis of Integrated System of Electromagnetic Brake and Frictional Brake of Car

    Directory of Open Access Journals (Sweden)

    Ren He

    2015-01-01

    Since theoretical guidance is lacking for the design and control of the integrated system of electromagnetic brake and frictional brake, this paper aims to solve this problem by exploring the nonlinear coupling characteristics and dynamic characteristics of the integrated system of electromagnetic brake and frictional brake. This paper uses the power bond graph method to establish a nonlinear coupling mathematical model of the integrated system of electromagnetic brake and frictional brake and conducts a comparative analysis of the dynamic characteristics based on this mathematical model. Meanwhile, the accuracy of the nonlinear coupling mathematical model proposed above is verified on a hardware-in-the-loop simulation platform, and the nonlinear coupling characteristics of the integrated system are also analyzed through experiments.
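A deliberately simplified lumped model (not the paper's bond-graph model; all coefficients are invented) can convey why the coupling is nonlinear: friction torque is roughly speed-independent, while the eddy-current torque of an electromagnetic brake peaks at a critical speed and falls off at high speed.

```python
# Much-simplified lumped torque model of an integrated electromagnetic +
# frictional brake. The single-pole eddy-current approximation and every
# coefficient here are illustrative assumptions.
def friction_torque(clamp_force, mu=0.4, r_eff=0.15, n_surfaces=2):
    """Coulomb friction: roughly independent of wheel speed."""
    return n_surfaces * mu * clamp_force * r_eff

def em_brake_torque(omega, t_peak=120.0, omega_c=80.0):
    """Eddy-current brake: peaks at the critical speed omega_c."""
    return t_peak * 2 * (omega / omega_c) / (1 + (omega / omega_c)**2)

def total_torque(omega, clamp_force):
    return friction_torque(clamp_force) + em_brake_torque(omega)

print(round(total_torque(80.0, 5000.0), 1))   # torque at the critical speed
```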

  3. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    A third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used as a repository for design data communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Integrity mechanisms also exist to maintain database correctness for multidisciplinary design tasks performed by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  4. The Effectiveness of Problem Based Learning Integrated With Islamic Values Based on ICT on Higher Order Thinking Skill and Students’ Character

    Directory of Open Access Journals (Sweden)

    Chairul Anwar

    2017-02-01

    Full Text Available The focus of this research is to determine the influence of applying the Problem Based Learning (PBL) model, integrated with Islamic values based on ICT, on higher-order thinking skills and the strengthening of students' characters. This research is a quasi-experiment with a pretest-posttest group design, conducted in an SMA (senior high school). Random sampling was used to determine the control and experimental classes. The data analysis techniques used are the t-test, based on the significance value, as well as an effect-size test. The data show that the PBL model integrating Islamic values based on ICT has a positive influence on the improvement of higher-order thinking skills and the strengthening of students' characters compared with students taught by a conventional method. The effect size for the experimental class is in the medium category, meaning that learning with the PBL model, integrated with Islamic values based on ICT, can be said to be effective in increasing students' higher-order thinking skills.
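The statistics this abstract relies on (a two-sample t-test plus an effect-size measure) can be sketched as follows; the sample scores are invented for illustration, and Cohen's d is used as one common effect-size definition:

```python
import math

def welch_t_and_cohens_d(a, b):
    """Welch two-sample t statistic and Cohen's d effect size."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    t = (ma - mb) / math.sqrt(va / na + vb / nb)
    # Cohen's d uses the pooled standard deviation
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    d = (ma - mb) / pooled
    return t, d

experimental = [78, 85, 82, 90, 76, 88, 84, 91]  # illustrative scores
control      = [70, 72, 75, 68, 74, 71, 77, 69]
t, d = welch_t_and_cohens_d(experimental, control)
print(f"t = {t:.2f}, Cohen's d = {d:.2f}")
```

By the usual rule of thumb, d around 0.2 is small, 0.5 medium, and 0.8 large, which is how a "medium category" effect size like the one reported would be classified.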

  5. Design-based online teacher professional development to introduce integration of STEM in Pakistan

    Science.gov (United States)

    Anwar, Tasneem

    In today's global society where innovations spread rapidly, the escalating focus on science, technology, engineering and mathematics (STEM) has quickly intensified in the United States, East Asia and much of Western Europe. Our ever-changing, increasingly global society faces many multidisciplinary problems, and many of the solutions require the integration of multiple STEM concepts. Thus, there is a critical need to explore the integration of STEM subjects in international education contexts. This dissertation study examined the integration of STEM in the unique context of Pakistan. The study used a three-phase design-based methodological framework derived from McKenney and Reeves (2012) to explore the development of a STEM-focused online teacher professional development (oTPD-STEM) program and to identify the design features that facilitate teacher learning. The oTPD-STEM program was designed to facilitate eight Pakistani elementary school teachers' exploration of the new idea of STEM integration through both practical and theoretical considerations. This design-based study employed inductive analysis (Strauss and Corbin, 1998) to analyze multiple data sources: interviews, STEM perception responses, reflective learning team conversations, pre-post surveys, and artifacts produced in oTPD-STEM. The findings of this study are presented as: (1) design-based decisions for oTPD-STEM, and (2) evolution in understanding of STEM, by sharing the participant teachers' STEM model for the Pakistani context. This study advocates for the potential of school-wide oTPD for interdisciplinary collaboration through support for learner-centered practices.

  6. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given....

  7. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category of codes, such as those used for harmonic analysis and mechanistic fuel performance codes, does not require parallelisation of individual modules. The second category, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solvers, and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab
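The substructuring idea mentioned above can be sketched on a toy system: interior unknowns are eliminated (a step each subdomain can do independently, which is what makes the scheme parallelizable) and only a small interface problem remains. The matrix below is illustrative, not from FAIR or TABS:

```python
import numpy as np

# Toy stiffness system K u = f, partitioned into interior (ii) and
# interface (bb) unknowns.
K = np.array([[ 4., -1.,  0., -1.],
              [-1.,  4., -1.,  0.],
              [ 0., -1.,  4., -1.],
              [-1.,  0., -1.,  4.]])
f = np.array([1., 2., 3., 4.])

ii = [0, 1]  # interior degrees of freedom
bb = [2, 3]  # interface degrees of freedom
Kii, Kib = K[np.ix_(ii, ii)], K[np.ix_(ii, bb)]
Kbi, Kbb = K[np.ix_(bb, ii)], K[np.ix_(bb, bb)]

# Schur complement on the interface: S u_b = g
S = Kbb - Kbi @ np.linalg.solve(Kii, Kib)
g = f[bb] - Kbi @ np.linalg.solve(Kii, f[ii])
u_b = np.linalg.solve(S, g)
# Back-substitute for the interior unknowns
u_i = np.linalg.solve(Kii, f[ii] - Kib @ u_b)

u = np.empty(4)
u[ii], u[bb] = u_i, u_b
print(np.allclose(K @ u, f))  # True: matches the direct solve
```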

  8. Analysis of axially symmetric wire antennas by the use of exact kernel of electric field integral equation

    Directory of Open Access Journals (Sweden)

    Krneta Aleksandra J.

    2016-01-01

    Full Text Available The paper presents a new method for the analysis of wire antennas with axial symmetry. Truncated cones have been applied to precisely model antenna geometry, while the exact kernel of the electric field integral equation has been used for computation. Accuracy and efficiency of the method has been further increased by the use of higher order basis functions for current expansion, and by selecting integration methods based on singularity cancelation techniques for the calculation of potential and impedance integrals. The method has been applied to the analysis of a typical dipole antenna, thick dipole antenna and a coaxial line. The obtained results verify the high accuracy of the method. [Projekat Ministarstva nauke Republike Srbije, br. TR-32005

  9. The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation

    International Nuclear Information System (INIS)

    COLLOPY, M.T.

    1999-01-01

    In 1995, Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach describing the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, workers, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, workers, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work that needs to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, such inconsistencies may create confusion in the DOE process to grant authorization of the work.
In accordance with

  10. Accurate fluid force measurement based on control surface integration

    Science.gov (United States)

    Lentink, David

    2018-01-01

    Nonintrusive 3D fluid force measurements are still challenging to conduct accurately for freely moving animals, vehicles, and deforming objects. Two techniques address this: 3D particle image velocimetry (PIV) and a newer technique, the aerodynamic force platform (AFP). Both rely on the control volume integral for momentum; whereas PIV requires numerical integration of flow fields, the AFP performs the integration mechanically based on rigid walls that form the control surface. The accuracy of both PIV and AFP measurements based on the control surface integration is thought to hinge on determining the unsteady body force associated with the acceleration of the volume of displaced fluid. Here, I introduce a set of non-dimensional error ratios to show which fluid and body parameters make the error negligible. The unsteady body force is insignificant in all conditions where the average density of the body is much greater than the density of the fluid, e.g., in gas. Whenever a strongly deforming body experiences significant buoyancy and acceleration, the error is significant. Remarkably, this error can be entirely corrected for with an exact factor provided that the body has a sufficiently homogenous density or acceleration distribution, which is common in liquids. The correction factor for omitting the unsteady body force depends only on the fluid density, ρf, and the body density, ρb. Whereas these straightforward solutions work even at the liquid-gas interface in a significant number of cases, they do not work for generalized bodies undergoing buoyancy in combination with appreciable body density inhomogeneity, volume change (PIV), or volume rate-of-change (PIV and AFP). In these less common cases, the 3D body shape needs to be measured and resolved in time and space to estimate the unsteady body force. The analysis shows that accounting for the unsteady body force is straightforward to non

  11. Integrating School-Based and Therapeutic Conflict Management Models at School.

    Science.gov (United States)

    D'Oosterlinck, Franky; Broekaert, Eric

    2003-01-01

    Explores the possibility of integrating school-based and therapeutic conflict management models, comparing two management models: a school-based conflict management program, "Teaching Students To Be Peacemakers"; and a therapeutic conflict management program, "Life Space Crisis Intervention." The paper concludes that integration might be possible…

  12. An Integrated Approach of Model checking and Temporal Fault Tree for System Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2009-10-15

    Digitalization of instrumentation and control systems in nuclear power plants offers the potential to improve plant safety and reliability through features such as increased hardware reliability and stability, and improved failure detection capability. However, it makes the systems and their safety analysis more complex. Originally, safety analysis was applied to hardware system components and formal methods mainly to software. For software-controlled or digitalized systems, it is necessary to integrate both. Fault tree analysis (FTA), which has been one of the most widely used safety analysis techniques in the nuclear industry, suffers from several drawbacks as described in. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computational tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates to reduce errors during the analysis, and use model checking to automate the reasoning process of FTA.
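The combinational core of fault tree analysis can be sketched in a few lines; this toy tree (TOP = OR(AND(A, B), C)) is invented for illustration and deliberately omits the temporal gates, which are exactly what the TCTL machinery above is needed for:

```python
from itertools import product

# Minimal static fault tree: TOP = OR(AND(A, B), C).
# Temporal gates (e.g., a priority-AND whose inputs must fail in a
# given order within a time bound) cannot be expressed this way.
def top_event(a, b, c):
    return (a and b) or c

# Exhaustively enumerate the basic-event combinations that trigger
# TOP; the minimal cut sets of this tree are {A, B} and {C}.
cut_sets = [events for events in product([False, True], repeat=3)
            if top_event(*events)]
print(len(cut_sets))  # prints 5
```

Model checking replaces this brute-force enumeration with an automated search over the system's formal state space, which is what makes the integrated approach scale to timed behaviors.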

  13. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... nated ``Structure (n, m)'' where (n, m) are the force and displacement degrees of ..... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  14. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS Framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface or can be directly accessed through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring

  15. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as accessibility and usability of remote services.

  16. SODIM: Service Oriented Data Integration based on MapReduce

    Directory of Open Access Journals (Sweden)

    Ghada ElSheikh

    2013-09-01

    Data integration systems can benefit from innovative dynamic infrastructure solutions such as Clouds, with their greater agility, lower cost, device independence, location independence, and scalability. This study consolidates data integration, Service Orientation, and distributed processing to develop a new data integration system called Service Oriented Data Integration based on MapReduce (SODIM), which improves system performance, especially with a large number of data sources, and which can efficiently be hosted on modern dynamic infrastructures such as Clouds.
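The MapReduce pattern underlying such a system can be sketched as follows: the map phase keys each record from every source by an entity identifier, and the reduce phase merges the records per key. The sources, field names, and merge rule below are invented for illustration, not SODIM's actual schema:

```python
from collections import defaultdict

# Two illustrative "sources" emitting records about the same entities.
source_a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
source_b = [{"id": 1, "email": "alice@example.com"}]

def map_phase(records):
    # Map: emit (key, record) pairs keyed by entity id.
    for r in records:
        yield r["id"], r

def reduce_phase(pairs):
    # Reduce: merge all records sharing a key into one integrated record.
    grouped = defaultdict(dict)
    for key, record in pairs:
        grouped[key].update(record)
    return dict(grouped)

pairs = list(map_phase(source_a)) + list(map_phase(source_b))
integrated = reduce_phase(pairs)
print(integrated[1])  # {'id': 1, 'name': 'Alice', 'email': 'alice@example.com'}
```

In a real MapReduce deployment the pairs would be shuffled across worker nodes by key, which is where the scalability with many data sources comes from.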

  17. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    Science.gov (United States)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS), and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and re-constructed in SIMULINK MATLAB to evaluate the process response. Additionally, the process stability, capability, and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE), and Integral Square Error (ISE). Also, based on statistical analysis, DS emerges as the best tuning method as it exhibits the highest process stability and capability.
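The error criteria used in the comparison (ISE and ITAE) can be illustrated on a first-order plant under PI control. The plant time constant and gain sets below are made up for the sketch and do not correspond to the paper's CSTR model or its tuning rules:

```python
# Discrete simulation of a first-order plant tau*dy/dt = u - y under
# PI control, accumulating ISE and ITAE for a unit setpoint step.
def simulate(kp, ki, tau=5.0, dt=0.01, t_end=30.0, setpoint=1.0):
    y, integ, t = 0.0, 0.0, 0.0
    ise = itae = 0.0
    while t < t_end:
        e = setpoint - y
        integ += e * dt
        u = kp * e + ki * integ       # PI control law
        y += dt * (u - y) / tau       # Euler step of the plant
        ise += e * e * dt             # Integral Square Error
        itae += t * abs(e) * dt       # Integral Time Absolute Error
        t += dt
    return ise, itae

# Two illustrative gain sets, standing in for different tuning rules.
for kp, ki in [(2.0, 0.5), (5.0, 1.0)]:
    ise, itae = simulate(kp, ki)
    print(f"kp={kp}, ki={ki}: ISE={ise:.3f}, ITAE={itae:.3f}")
```

ISE penalizes large early errors; ITAE's time weighting penalizes errors that persist, which is why the two criteria can rank tuning rules differently.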

  18. The Impact of Information System-Enabled Supply Chain Process Integration on Business Performance: A Resource-Based Analysis

    OpenAIRE

    Morteza Ghobakhloo; Sai Hong Tang; Mohammad Sadegh Sabouri; Norzima Zulkifli

    2014-01-01

    This paper seeks to develop and test a model to examine the relationships between technical aspects of IS resources (IS alignment, IS resources technical quality, IS advancement), supply chain process integration, and firm performance. A questionnaire-based survey was conducted to collect data from 227 supply chain, logistics, or procurement/purchasing managers of leading manufacturing and retail organizations. Drawing on the resource-based view of the firm, and through extending the concept of...

  19. Man-system interface based on automatic speech recognition: integration to a virtual control desk

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Pereira, Claudio M.N.A.; Aghina, Mauricio Alves C., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b, E-mail: cmnap@ien.gov.b, E-mail: mag@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Nomiya, Diogo V., E-mail: diogonomiya@gmail.co [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)

    2009-07-01

    This work reports the implementation of a man-system interface based on automatic speech recognition, and its integration into a virtual nuclear power plant control desk. The latter aims to reproduce a real control desk using virtual reality technology, for operator training and ergonomic evaluation purposes. An automatic speech recognition system was developed to serve as a new interface with users, substituting the computer keyboard and mouse. Users can operate this virtual control desk in front of a computer monitor or a projection screen through spoken commands. The automatic speech recognition interface developed is based on a well-known signal processing technique named cepstral analysis, and on artificial neural networks. The speech recognition interface is described, along with its integration with the virtual control desk, and results are presented. (author)
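The cepstral analysis named above can be sketched in a few lines: the real cepstrum is the inverse FFT of the log magnitude spectrum, and its low-quefrency coefficients summarize the spectral envelope. The synthetic signal, frame length, and number of coefficients below are illustrative, not the paper's actual front end:

```python
import numpy as np

# Synthetic two-tone signal standing in for a windowed speech frame.
fs = 8000
t = np.arange(512) / fs
signal = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)

# Real cepstrum: IFFT of the log magnitude spectrum.
spectrum = np.fft.rfft(signal * np.hanning(len(signal)))
log_mag = np.log(np.abs(spectrum) + 1e-12)  # epsilon avoids log(0)
cepstrum = np.fft.irfft(log_mag)

# Keep the first few coefficients as a compact feature vector, the
# kind of input a neural-network classifier would be trained on.
features = cepstrum[:13]
print(features.shape)  # (13,)
```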

  20. Man-system interface based on automatic speech recognition: integration to a virtual control desk

    International Nuclear Information System (INIS)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Pereira, Claudio M.N.A.; Aghina, Mauricio Alves C.; Nomiya, Diogo V.

    2009-01-01

    This work reports the implementation of a man-system interface based on automatic speech recognition, and its integration into a virtual nuclear power plant control desk. The latter aims to reproduce a real control desk using virtual reality technology, for operator training and ergonomic evaluation purposes. An automatic speech recognition system was developed to serve as a new interface with users, substituting the computer keyboard and mouse. Users can operate this virtual control desk in front of a computer monitor or a projection screen through spoken commands. The automatic speech recognition interface developed is based on a well-known signal processing technique named cepstral analysis, and on artificial neural networks. The speech recognition interface is described, along with its integration with the virtual control desk, and results are presented. (author)

  1. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

    Decisions to invest in oil- and gasfield acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters for ultimate identification of the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and schedules on timing in the framework of various deal structures
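The Monte Carlo approach described above can be sketched as follows; every distribution, parameter value, and the lumped discount factor below are invented for illustration and are far simpler than a full-field simulation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo trials

# Illustrative uncertainty distributions (not field data): reserves,
# oil price, and capital cost vary per trial; discounting is collapsed
# into a single present-value factor for brevity.
reserves_mmbbl = rng.lognormal(mean=3.0, sigma=0.4, size=n)  # MMbbl
price_per_bbl  = rng.normal(loc=60.0, scale=10.0, size=n)    # $/bbl
capex_mm       = rng.triangular(400, 500, 700, size=n)       # $MM
pv_factor      = 0.55                                        # lumped discounting

npv_mm = reserves_mmbbl * price_per_bbl * pv_factor - capex_mm
print(f"P(NPV > 0) = {np.mean(npv_mm > 0):.2f}")
print(f"P10/P50/P90 NPV ($MM) = {np.percentile(npv_mm, [10, 50, 90]).round(0)}")
```

The spread between the P10 and P90 outcomes is the "economic volatility" the abstract refers to, and comparing these distributions across candidate deal structures is what deterministic analysis cannot do.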

  2. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food, and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is, thus, a valuable analytical and policy-design supporting tool to address the widely discussed links between bioenergy, food, and water. In fact, the Nexus provides a more integrative and broader approach than the single isolated system approach that characterizes many bioenergy analyses and policies of recent decades. In particular, for the South of Mozambique, charcoal production, food insecurity, and water scarcity have been related in separate studies, and thus it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, neither is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for Bioenergy (the 2MBio). The results reveal that Nexus analysis is "blind" to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  3. Optimization of the integration time of pulse shape analysis for dual-layer GSO detector with different amount of Ce

    International Nuclear Information System (INIS)

    Yamamoto, Seiichi

    2008-01-01

    For a multi-layer depth-of-interaction (DOI) detector using scintillators with different decay times, pulse shape analysis based on two different integration times is often used to distinguish scintillators in the DOI direction. This method measures a partial integration and a full integration, and calculates the ratio of the two to obtain the pulse shape distribution. The full integration time is usually set to cover the full width of the scintillation pulse. However, the partial integration time that gives the best separation of the pulse shape distribution is not obvious. To clarify this, a theoretical analysis and experiments were conducted by changing the partial integration time, using a scintillation detector of GSOs with different amounts of Ce. A detector consisting of a 1-in. round photomultiplier tube (PMT) optically coupled to GSO of 1.5 mol% Ce (decay time: 35 ns) and GSO of 0.5 mol% Ce (decay time: 60 ns) was used for the experiments. The signal from the PMT was digitally integrated with partial (50-150 ns) and full (160 ns) integration times, and the ratio of the two was calculated to obtain the pulse shape distribution. In the theoretical analysis, a partial integration time of 50 ns showed the largest distance between the two peaks of the pulse shape distribution. In the experiments, the distance was maximal at 70-80 ns of partial integration time. The peak-to-valley ratio showed its maximum at 120-130 ns. Because the separation of the two peaks is determined by the peak-to-valley ratio, we conclude that the optimum partial integration time for these combinations of GSOs is around 120-130 ns, relatively longer than the expected value
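The partial-to-full ratio at the heart of this method can be sketched for ideal exponential decays with the two GSO time constants quoted above. This deterministic sketch ignores photon statistics and electronic noise, which is why it reproduces only the "peak distance" part of the analysis (largest at the shortest window), not the peak-to-valley behavior that drives the final 120-130 ns conclusion:

```python
import numpy as np

def ratio(tau, t_partial, t_full=160.0):
    """Ratio of partial to full integral of exp(-t/tau).

    The integral of exp(-t/tau) from 0 to T is tau*(1 - exp(-T/tau)).
    """
    partial = tau * (1.0 - np.exp(-t_partial / tau))
    full = tau * (1.0 - np.exp(-t_full / tau))
    return partial / full

# Sweep the partial window for the 35 ns and 60 ns GSO components and
# report the separation between the two ideal ratio values.
for t_p in (50.0, 80.0, 120.0):
    sep = abs(ratio(35.0, t_p) - ratio(60.0, t_p))
    print(f"partial window {t_p:.0f} ns: ratio separation {sep:.3f}")
```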

  4. An e-consent-based shared EHR system architecture for integrated healthcare networks.

    Science.gov (United States)

    Bergmann, Joachim; Bott, Oliver J; Pretschner, Dietrich P; Haux, Reinhold

    2007-01-01

    Virtual integration of distributed patient data promises advantages over a consolidated health record, but raises questions mainly about practicability and authorization concepts. Our work aims at the specification and development of a virtual shared health record architecture using a patient-centred integration and authorization model. A literature survey summarizes considerations of current architectural approaches. Complemented by a methodical analysis in two regional settings, a formal architecture model was specified and implemented. The results presented in this paper are a survey of architectural approaches for shared health records and an architecture model for a virtual shared EHR, which combines a patient-centred integration policy with provider-oriented document management. An electronic consent system ensures that access to the shared record remains under the control of the patient. A corresponding system prototype has been developed and is currently being introduced and evaluated in a regional setting. The proposed architecture is capable of partly replacing message-based communications. Operating highly available provider repositories for the virtual shared EHR requires advanced technology and probably means additional costs for care providers. Acceptance of the proposed architecture depends on transparently embedding document validation and digital signature into the work processes. The paradigm shift from paper-based messaging to a "pull model" needs further evaluation.

  5. Spatial Data Integration Using Ontology-Based Approach

    Science.gov (United States)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the necessity of spatial data for various organizations has become so crucial that many of these organizations have begun to produce spatial data themselves. In some circumstances, the need to obtain real-time integrated data requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between different organizations' data. To solve this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps: in the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and subsequently the ontology of each database is created. In the second step, the relative ontology is inserted into the database, and the relationship of each class of the ontology is inserted into a newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data. This is done by using the concept of ontology mapping. The proposed approach, in addition to being fast and low cost, makes the process of data integration easy, and the data remain unchanged, thus taking advantage of the legacy applications provided.
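The core move in ontology-based integration, lifting relational rows into ontology statements so that heterogeneous tables become comparable, can be sketched as subject-predicate-object triples. The table, column names, and `ex:` namespace below are invented for illustration:

```python
# One illustrative row from a hypothetical "Parcel" table in some
# organization's relational database.
rows = [
    {"parcel_id": 7, "owner": "City", "area_m2": 1200},
]

def row_to_triples(row, table="Parcel", base="ex:"):
    """Map a relational row to (subject, predicate, object) triples."""
    subject = f"{base}{table}/{row['parcel_id']}"
    triples = [(subject, "rdf:type", f"{base}{table}")]
    for col, value in row.items():
        if col != "parcel_id":  # the key column becomes the subject IRI
            triples.append((subject, f"{base}{col}", value))
    return triples

triples = row_to_triples(rows[0])
for t in triples:
    print(t)
```

Once every source is expressed this way, ontology mapping reduces to aligning predicates (e.g., one agency's `ex:owner` with another's `ex:proprietor`) instead of reconciling raw table schemas.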

  6. SPATIAL DATA INTEGRATION USING ONTOLOGY-BASED APPROACH

    Directory of Open Access Journals (Sweden)

    S. Hasani

    2015-12-01

    Full Text Available In today's world, the necessity of spatial data for various organizations has become so crucial that many of these organizations have begun to produce such data themselves. In some circumstances, the need to obtain integrated data in real time requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To solve this issue, we introduce an ontology-based method that provides sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps: the first step is the identification of the objects in a relational database; the semantic relationships between them are then modelled and, subsequently, the ontology of each database is created. In the second step, the relative ontology is inserted into the database, and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on a service-oriented architecture, which allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low-cost, makes the process of data integration easy and leaves the data unchanged, thus taking advantage of the legacy applications provided.

  7. Influencing Factors and Development Trend Analysis of China Electric Grid Investment Demand Based on a Panel Co-Integration Model

    Directory of Open Access Journals (Sweden)

    Jinchao Li

    2018-01-01

    Full Text Available Electric grid investment demand analysis is significant for reasonably arranging construction funds for the electric grid and reducing costs. This paper used panel data on electric grid investment from 23 provinces of China between 2004 and 2016 as samples to analyze the influence of GDP, population scale, social electricity consumption, installed electrical capacity, and peak load on electric grid investment demand, based on co-integration tests. We find that GDP and peak load have positive influences on electric grid investment demand, but the impact of population scale, social electricity consumption, and installed electrical capacity on electric grid investment is not remarkable. We divide China into eastern, central, and western regions to analyze the influence factors of electric grid investment, finally obtaining the key factors for each region. In the end, based on the analysis of key factors, we predict China’s electric grid investment for 2020 under different scenarios. The results offer a certain understanding of the development trend of China’s electric grid investment and contribute to its future development.
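    The co-integration test underlying this kind of analysis can be illustrated with an Engle-Granger-style two-step sketch on synthetic data. The series and the coefficient 2.0 below are hypothetical stand-ins, not the paper's panel data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# A shared stochastic trend drives both series, so they are co-integrated;
# the series names and the coefficient 2.0 are purely illustrative.
trend = np.cumsum(rng.normal(size=n))
gdp = trend + rng.normal(scale=0.5, size=n)
investment = 2.0 * trend + rng.normal(scale=0.5, size=n)

# Engle-Granger step 1: regress one series on the other (OLS).
X = np.column_stack([np.ones(n), gdp])
beta, *_ = np.linalg.lstsq(X, investment, rcond=None)
residuals = investment - X @ beta

# Step 2 (informal stationarity check): the co-integrating residual has
# bounded variance, unlike the random-walk trend both series follow.
is_cointegrated = residuals.var() < 0.1 * trend.var()
```

A full study would use proper panel co-integration tests (e.g. with critical values for the residual unit-root test) rather than this informal variance check, but the two-step structure is the same.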

  8. Skill-based immigration, economic integration, and economic performance

    OpenAIRE

    Aydemir, Abdurrahman

    2014-01-01

    Studies for major immigrant-receiving countries provide evidence on the comparative economic performance of immigrant classes (skill-, kinship-, and humanitarian-based). Developed countries are increasingly competing for high-skilled immigrants, who perform better in the labor market. However, there are serious challenges to their economic integration, which highlights a need for complementary immigration and integration policies.

  9. Thermally-isolated silicon-based integrated circuits and related methods

    Science.gov (United States)

    Wojciechowski, Kenneth; Olsson, Roy H.; Clews, Peggy J.; Bauer, Todd

    2017-05-09

    Thermally isolated devices may be formed by performing a series of etches on a silicon-based substrate. As a result of the series of etches, silicon material may be removed from underneath a region of an integrated circuit (IC). The removal of the silicon material from underneath the IC forms a gap between remaining substrate and the integrated circuit, though the integrated circuit remains connected to the substrate via a support bar arrangement that suspends the integrated circuit over the substrate. The creation of this gap functions to release the device from the substrate and create a thermally-isolated integrated circuit.

  10. Method of making thermally-isolated silicon-based integrated circuits

    Science.gov (United States)

    Wojciechowski, Kenneth; Olsson, Roy; Clews, Peggy J.; Bauer, Todd

    2017-11-21

    Thermally isolated devices may be formed by performing a series of etches on a silicon-based substrate. As a result of the series of etches, silicon material may be removed from underneath a region of an integrated circuit (IC). The removal of the silicon material from underneath the IC forms a gap between remaining substrate and the integrated circuit, though the integrated circuit remains connected to the substrate via a support bar arrangement that suspends the integrated circuit over the substrate. The creation of this gap functions to release the device from the substrate and create a thermally-isolated integrated circuit.

  11. Integrated omics analysis of specialized metabolism in medicinal plants.

    Science.gov (United States)

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

    Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important for identifying or developing reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  12. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
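    Two of the evaluations named in the record, the Bode frequency response and closed-loop eigenvalues, can be sketched outside MATRIXx with SciPy on a hypothetical second-order plant (the plant and its coefficients are assumptions for illustration):

```python
import numpy as np
from scipy import signal

# Hypothetical lightly damped second-order plant G(s) = 1/(s^2 + 0.5s + 1).
system = signal.TransferFunction([1.0], [1.0, 0.5, 1.0])

# Bode frequency response: magnitude in dB and phase in degrees vs frequency.
w, mag_db, phase_deg = signal.bode(system, w=np.logspace(-1, 1, 200))

# Closed-loop eigenvalues (the poles of the characteristic polynomial):
# all strictly in the left half-plane means the system is stable.
poles = np.roots([1.0, 0.5, 1.0])
stable = all(p.real < 0 for p in poles)
```

The lightly damped plant shows a resonant peak in `mag_db` near its natural frequency, the kind of feature the plotting section of such a tool would display.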

  13. Locating new uranium occurrence by integrated weighted analysis in Kaladgi basin, Karnataka

    International Nuclear Information System (INIS)

    Sridhar, M.; Chaturvedi, A.K.; Rai, A.K.

    2014-01-01

    This study aims at identifying uranium potential zones by integrated analysis of thematic layers interpreted and derived from airborne radiometric and magnetic data and satellite data, along with available ground geochemical data, in the western part of the Kaladgi basin. Integrated weighted analysis of spatial datasets was attempted, which included airborne radiometric data (eU, eTh and % K conc.), a litho-structural map, hydrogeochemical U conc., and geomorphological data pertaining to the study area. The weightage analysis was done in a GIS environment, where the different spatial datasets were brought onto a single platform and analyzed by integration.
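    A weighted overlay of this kind reduces to a per-cell weighted sum of normalized raster layers. The toy layers, weights and threshold below are hypothetical stand-ins for the airborne radiometric, hydrogeochemical and litho-structural layers named in the record:

```python
import numpy as np

# Toy 4x4 rasters for three evidence layers, each normalized to [0, 1);
# the values, weights and threshold are hypothetical.
rng = np.random.default_rng(1)
eU_conc = rng.random((4, 4))     # airborne radiometric eU
hydro_u = rng.random((4, 4))     # hydrogeochemical U concentration
structure = rng.random((4, 4))   # litho-structural favourability

# Expert-assigned weights; they sum to 1 so the score stays in [0, 1).
weights = {"eU": 0.5, "hydro": 0.3, "structure": 0.2}

favourability = (weights["eU"] * eU_conc
                 + weights["hydro"] * hydro_u
                 + weights["structure"] * structure)

# Cells scoring above a cut-off are flagged as potential uranium zones.
potential_zones = favourability > 0.7
```

In a GIS this is the same operation applied to co-registered rasters over the study area, with weights assigned by expert judgment.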

  14. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    Abstract The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  15. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  16. Researchers and teachers learning together and from each other using video-based multimodal analysis

    DEFF Research Database (Denmark)

    Davidsen, Jacob; Vanderlinde, Ruben

    2014-01-01

    This paper discusses a year-long technology integration project, during which teachers and researchers joined forces to explore children’s collaborative activities through the use of touch-screens. In the research project, discussed in this paper, 16 touch-screens were integrated into teaching and learning activities in two separate classrooms; the learning and collaborative processes were captured by using video, collecting over 150 hours of footage. By using digital research technologies and a longitudinal design, the authors of the research project studied how teachers and children gradually integrated touch-screens into their teaching and learning. This paper examines the methodological usefulness of video-based multimodal analysis. Through reflection on the research project, we discuss how, by using video-based multimodal analysis, researchers and teachers can study children’s touch…

  17. Computation of Groebner bases for two-loop propagator type integrals

    International Nuclear Information System (INIS)

    Tarasov, O.V.

    2004-01-01

    The Groebner basis technique for calculating Feynman diagrams proposed in (Acta Phys. Pol. B 29 (1998) 2655) is applied to the two-loop propagator type integrals with arbitrary masses and momentum. We describe the derivation of Groebner bases for all integrals with 1PI topologies and present the explicit content of the Groebner bases.

  18. Computation of Groebner bases for two-loop propagator type integrals

    Energy Technology Data Exchange (ETDEWEB)

    Tarasov, O.V. [DESY Zeuthen, Theory Group, Deutsches Elektronen Synchrotron, DESY, Platanenallee 6, D-15738 Zeuthen (Germany)]. E-mail: tarasov@ifh.de

    2004-11-21

    The Groebner basis technique for calculating Feynman diagrams proposed in (Acta Phys. Pol. B 29 (1998) 2655) is applied to the two-loop propagator type integrals with arbitrary masses and momentum. We describe the derivation of Groebner bases for all integrals with 1PI topologies and present the explicit content of the Groebner bases.
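    As an illustration of the Groebner-basis idea in a simple polynomial setting (not the Feynman-integral computation itself), SymPy can compute a lexicographic-order basis that triangularizes a small system, analogous to how the technique reduces a family of integrals to a small set of master relations:

```python
from sympy import groebner, symbols

x, y = symbols('x y')

# A lexicographic Groebner basis triangularizes the ideal: one basis
# element involves only y, the other reintroduces x. This mirrors the
# way integration-by-parts-style relations are reduced to a canonical set.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
```

Every polynomial in the original ideal reduces to zero modulo the basis, which is the property the reduction algorithms exploit.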

  19. Hybrid Pixel-Based Method for Cardiac Ultrasound Fusion Based on Integration of PCA and DWT

    Directory of Open Access Journals (Sweden)

    Samaneh Mazaheri

    2015-01-01

    Full Text Available Medical image fusion is the procedure of combining several images from one or multiple imaging modalities. In spite of numerous attempts at automating ventricle segmentation and tracking in echocardiography, this problem remains a challenging task, due to low-quality images with missing anatomical details or speckle noise and a restricted field of view. This paper presents a fusion method which particularly intends to increase the segment-ability of echocardiography features such as the endocardium and to improve image contrast. In addition, it tries to expand the field of view, decrease the impact of noise and artifacts, and enhance the signal-to-noise ratio of the echo images. The proposed algorithm weights the image information according to an integration feature between all the overlapping images, using a combination of principal component analysis and discrete wavelet transform. For evaluation, a comparison has been made between the results of some well-known techniques and the proposed method, and different metrics are implemented to evaluate the performance of the proposed algorithm. It is concluded that the presented pixel-based method, based on the integration of PCA and DWT, gives the best results for the segment-ability of cardiac ultrasound images and better performance across all metrics.
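    One plausible pixel-level reading of the PCA/DWT combination (a sketch under assumptions, not the paper's exact algorithm) fuses wavelet approximation coefficients with PCA-derived weights and detail coefficients with a max-absolute rule. A one-level Haar transform keeps the sketch self-contained:

```python
import numpy as np

def dwt2_haar(img):
    """One-level 2-D Haar transform -> (LL, (LH, HL, HH)); even-sized input."""
    s2 = np.sqrt(2.0)
    lo = (img[:, 0::2] + img[:, 1::2]) / s2
    hi = (img[:, 0::2] - img[:, 1::2]) / s2
    LL = (lo[0::2] + lo[1::2]) / s2
    LH = (lo[0::2] - lo[1::2]) / s2
    HL = (hi[0::2] + hi[1::2]) / s2
    HH = (hi[0::2] - hi[1::2]) / s2
    return LL, (LH, HL, HH)

def idwt2_haar(LL, bands):
    """Inverse of dwt2_haar (perfect reconstruction)."""
    LH, HL, HH = bands
    s2 = np.sqrt(2.0)
    lo = np.empty((LL.shape[0] * 2, LL.shape[1]))
    lo[0::2], lo[1::2] = (LL + LH) / s2, (LL - LH) / s2
    hi = np.empty_like(lo)
    hi[0::2], hi[1::2] = (HL + HH) / s2, (HL - HH) / s2
    out = np.empty((lo.shape[0], lo.shape[1] * 2))
    out[:, 0::2], out[:, 1::2] = (lo + hi) / s2, (lo - hi) / s2
    return out

def pca_weights(a, b):
    """Fusion weights from the leading eigenvector of the 2x2 covariance."""
    cov = np.cov(np.vstack([a.ravel(), b.ravel()]))
    vals, vecs = np.linalg.eigh(cov)
    v = np.abs(vecs[:, np.argmax(vals)])
    return v / v.sum()

def fuse(img_a, img_b):
    LLa, details_a = dwt2_haar(img_a)
    LLb, details_b = dwt2_haar(img_b)
    wa, wb = pca_weights(LLa, LLb)
    LL = wa * LLa + wb * LLb                    # PCA-weighted approximation
    details = tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                    for x, y in zip(details_a, details_b))   # max-abs rule
    return idwt2_haar(LL, details)
```

Fusing an image with itself reproduces the image exactly, a useful sanity check on the transform pair.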

  20. Modular Architecture for Integrated Model-Based Decision Support.

    Science.gov (United States)

    Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen

    2018-01-01

    Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.

  1. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. We have developed an open-source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  2. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

    Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. Results We have developed an open-source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  3. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care.

    Science.gov (United States)

    Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A

    2013-01-01

    Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.

  4. Integrated program for the use of Probabilistic Safety Analysis in Spain

    International Nuclear Information System (INIS)

    1998-01-01

    Since 25 June 1986, when the CSN (Nuclear Safety Council) approved the Integrated Program of Probabilistic Safety Analysis, this program has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Program.

  5. Containment integrity analysis with SAMPSON/DCRA module

    International Nuclear Information System (INIS)

    Hosoda, Seigo; Shirakawa, Noriyuki; Naitoh, Masanori

    2006-01-01

    The integrity of PWR containment under a severe accident is analyzed using the debris-concrete reaction analysis code. If core fuel melts through the pressure vessel and the debris accumulates in the reactor cavity in the lower part of the containment, its temperature continues to rise due to decay heat and the debris ablates the concrete floor. In the case that cooling water is supplied to the containment cavity and the amount of debris is limited to 30% of the core fuel, our analyses showed that the debris could be cooled and frozen, so that the integrity of the containment would hold. (author)

  6. Development of an integrated data acquisition and handling system based on digital time series analysis for the measurement of plasma fluctuations

    International Nuclear Information System (INIS)

    Ghayspoor, R.; Roth, J.R.

    1986-01-01

    The nonlinear characteristics of data obtained by many plasma diagnostic systems require the power of modern computers for on-line data processing and reduction. The objective of this work is to develop an integrated data acquisition and handling system based on digital time series analysis techniques. These techniques make it possible to investigate the nature of plasma fluctuations and the physical processes which give rise to them. The approach is to digitize the data and to generate various spectra by means of Fast Fourier Transforms (FFT). Of particular interest are the computer-generated auto-power spectrum, cross-power spectrum, phase spectrum, and squared coherency spectrum. Software programs based on those developed by Jae Y. Hong at the University of Texas are utilized for these spectra. The LeCroy 3500-SA signal analyzer and a VAX 11/780 are used as the data handling and reduction system in this work. In this report, the software required to link these two systems is described.
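    The four spectra named above can be computed from digitized signals with standard FFT-based estimators. This sketch uses SciPy on synthetic two-channel fluctuation data (an assumption for illustration, not the LeCroy/VAX toolchain the record describes):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 1000.0                       # hypothetical sampling rate, Hz
t = np.arange(0, 10, 1 / fs)

# Two synthetic probe channels sharing a coherent 50 Hz fluctuation.
common = np.sin(2 * np.pi * 50 * t)
x = common + 0.5 * rng.normal(size=t.size)
y = common + 0.5 * rng.normal(size=t.size)

f, Pxx = signal.welch(x, fs=fs, nperseg=1024)         # auto-power spectrum
f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)        # cross-power spectrum
phase = np.angle(Pxy)                                 # phase spectrum
f, Cxy = signal.coherence(x, y, fs=fs, nperseg=1024)  # squared coherency

peak_freq = f[np.argmax(Pxx)]     # the coherent mode dominates near 50 Hz
```

The squared coherency approaches 1 at frequencies where the two channels see the same underlying fluctuation, which is exactly how coherent plasma modes are picked out of broadband noise.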

  7. S-bases as a tool to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, A.V.; Smirnov, V.A.

    2006-01-01

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined

  8. S-bases as a tool to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A.V. [Scientific Research Computing Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-10-15

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined.

  9. PLACE-BASED GREEN BUILDING: INTEGRATING LOCAL ENVIRONMENTAL AND PLANNING ANALYSIS INTO GREEN BUILDING GUIDELINES

    Science.gov (United States)

    This project will develop a model for place-based green building guidelines based on an analysis of local environmental, social, and land use conditions. The ultimate goal of this project is to develop a methodology and model for placing green buildings within their local cont...

  10. Integration of evidence-based and experience-based design: contributions from a study in a health care service

    Directory of Open Access Journals (Sweden)

    Mirela S. da Rosa

    2015-06-01

    Full Text Available The purpose of this paper is to present an integrated study of Service Design and the Mechanism of the Production Function (MPF) for redesigning health care services, to improve the value perceived by the patient and increase the productivity of hospital operations by eliminating waste. The method used was action research, applied in an ICU of a private hospital in southern Brazil. The techniques of participant observation, interviews, archival research and co-creation meetings with a hospital team were used to collect data. Data were analyzed through content analysis of the interviews and with Service Design and Production Engineering tools. Evidence-based approaches tend to contribute to the replication of project outcomes in future cases. The MPF can support project development in the field of Design, and the integrated approach developed in the health care sector helped to devote more time to the phases of diagnosis and implementation. The findings demonstrate that the Service Design and MPF approaches can be used simultaneously to develop more robust solutions in the health care environment. Further research could be done in other private or public hospitals, as well as in other hospital units besides ICUs. Limitations include the work being done in a single hospital and service unit, with data collected from a small group of people in the hospital. Integrating Evidence-Based Design, Experience-Based Design and the MPF can produce a more robust way to justify and define the focus of improvements in health care services.

  11. Integrating tracer-based metabolomics data and metabolic fluxes in a linear fashion via Elementary Carbon Modes.

    Science.gov (United States)

    Pey, Jon; Rubio, Angel; Theodoropoulos, Constantinos; Cascante, Marta; Planes, Francisco J

    2012-07-01

    Constraints-based modeling is an emergent area in Systems Biology that includes an increasing set of methods for the analysis of metabolic networks. In order to refine its predictions, the development of novel methods integrating high-throughput experimental data is currently a key challenge in the field. In this paper, we present a novel set of constraints that integrate tracer-based metabolomics data from Isotope Labeling Experiments and metabolic fluxes in a linear fashion. These constraints are based on Elementary Carbon Modes (ECMs), a recently developed concept that generalizes Elementary Flux Modes at the carbon level. To illustrate the effect of our ECMs-based constraints, a Flux Variability Analysis approach was applied to a previously published metabolic network involving the main pathways in the metabolism of glucose. The addition of our ECMs-based constraints substantially reduced the under-determination resulting from a standard application of Flux Variability Analysis, which shows a clear progress over the state of the art. In addition, our approach is adjusted to deal with combinatorial explosion of ECMs in genome-scale metabolic networks. This extension was applied to infer the maximum biosynthetic capacity of non-essential amino acids in human metabolism. Finally, as linearity is the hallmark of our approach, its importance is discussed at a methodological, computational and theoretical level and illustrated with a practical application in the field of Isotope Labeling Experiments. Copyright © 2012 Elsevier Inc. All rights reserved.
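    The Flux Variability Analysis step mentioned above amounts to a pair of linear programs per flux (minimize and maximize it under the steady-state and bound constraints). The toy network below is a hypothetical illustration, not the paper's glucose network; the ECM-based constraints would enter as additional linear rows of the same form:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network at steady state S v = 0:
#   v0: -> A    v1: A -> B    v2: A -> C    v3: B ->    v4: C ->
S = np.array([
    [1, -1, -1,  0,  0],   # metabolite A
    [0,  1,  0, -1,  0],   # metabolite B
    [0,  0,  1,  0, -1],   # metabolite C
])
bounds = [(0.0, 10.0)] * 5
bounds[0] = (10.0, 10.0)   # a measurement-style linear constraint: fix uptake

def fva(j):
    """Flux Variability Analysis: min and max of flux j under S v = 0."""
    c = np.zeros(5)
    c[j] = 1.0
    lo = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds).fun
    hi = -linprog(-c, A_eq=S, b_eq=np.zeros(3), bounds=bounds).fun
    return lo, hi

lo, hi = fva(1)   # v1 is under-determined here: it can range over [0, 10]
```

Because the added constraints are linear, tightening the model (as the ECM-based constraints do) simply appends rows or bounds to the same linear programs, shrinking intervals like `[lo, hi]` without changing the solver.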

  12. A Hash Based Remote User Authentication and Authenticated Key Agreement Scheme for the Integrated EPR Information System.

    Science.gov (United States)

    Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng

    2015-11-01

    To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we found that Das's authentication scheme is still vulnerable to modification and user duplication attacks. Thereafter we propose a secure and efficient authentication scheme for the integrated EPR information system based on lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show our new scheme is well-suited to adoption in remote medical healthcare services.
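    The hash-and-XOR building blocks named in the record can be sketched as follows. This toy fragment is illustrative only: it is not the scheme proposed in the paper, SHA-256 stands in for the unspecified lightweight hash, and all names are hypothetical:

```python
import hashlib
import os

def h(*parts: bytes) -> bytes:
    """One-way hash over concatenated parts (SHA-256 as a stand-in)."""
    d = hashlib.sha256()
    for p in parts:
        d.update(p)
    return d.digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(u ^ v for u, v in zip(a, b))

# Registration: the smart card stores the password digest XOR-masked with
# a server-derived value, so neither secret appears on the card in clear.
server_secret = os.urandom(32)
password, salt = b"patient-pw", os.urandom(16)
card_value = xor(h(password, salt), h(server_secret, salt))

# Login: the server unmasks the card value and compares it against the
# digest re-derived from the entered password.
ok = xor(card_value, h(server_secret, salt)) == h(password, salt)
```

Real schemes of this family add nonces, timestamps and key agreement on top of these primitives to resist replay and modification attacks; the point here is only that the whole protocol is built from cheap hash and XOR operations.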

  13. Analysis of the efficiency-integration nexus of Japanese stock market

    Science.gov (United States)

    Rizvi, Syed Aun R.; Arshad, Shaista

    2017-03-01

    This paper attempts a novel approach to analysing the Japanese economy through a dual-dimension analysis of its stock market, examining efficiency and market integration. Taking a period of 24 years, this study employs MFDFA and MGARCH to understand how the efficiency and integration of the stock market fared during different business-cycle phases of the Japanese economy. The results showed improving efficiency over the time period. For market integration, our findings conform to recent literature on business cycles and stock market integration: every succeeding recession creates a break in integration levels, resulting in a decrease.

  14. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  15. Printed organic thin-film transistor-based integrated circuits

    International Nuclear Information System (INIS)

    Mandal, Saumen; Noh, Yong-Young

    2015-01-01

    Organic electronics is moving ahead on its journey towards reality. However, this technology will only be possible when it is able to meet specific criteria including flexibility, transparency, disposability and low cost. Printing is one of the conventional techniques to deposit thin films from solution-based ink. It is used worldwide for visual modes of information, and it is now poised to enter the manufacturing processes of various consumer electronics. The continuous progress made in the field of functional organic semiconductors has achieved high solubility in common solvents as well as high charge carrier mobility, which offers ample opportunity for organic-based printed integrated circuits. In this paper, we present a comprehensive review of all-printed organic thin-film transistor-based integrated circuits, mainly ring oscillators. First, the necessity of all-printed organic integrated circuits is discussed; we consider how the gap between printed electronics and real applications can be bridged. Next, various materials for printed organic integrated circuits are discussed. The features of these circuits and their suitability for electronics using different printing and coating techniques follow. Interconnection technology is equally important for making this product industrially viable, and it receives considerable attention in this review. For high-frequency operation, channel length should be sufficiently small; this could be achieved with surface-treatment-assisted printing or laser writing. Registration is also an important issue related to printing; the printed gate should be perfectly aligned with the source and drain to minimize parasitic capacitances. All-printed organic inverters and ring oscillators are discussed here, along with their importance. Finally, future applications of all-printed organic integrated circuits are highlighted. (paper)

  16. Perceptions that influence the maintenance of scientific integrity in community-based participatory research.

    Science.gov (United States)

    Kraemer Diaz, Anne E; Spears Johnson, Chaya R; Arcury, Thomas A

    2015-06-01

    Scientific integrity is necessary for strong science; yet many variables can influence scientific integrity. In traditional research, some common threats are the pressure to publish, competition for funds, and career advancement. Community-based participatory research (CBPR) provides a different context for scientific integrity with additional and unique concerns. Understanding the perceptions that promote or discourage scientific integrity in CBPR as identified by professional and community investigators is essential to promoting the value of CBPR. This analysis explores the perceptions that facilitate scientific integrity in CBPR as well as the barriers among a sample of 74 professional and community CBPR investigators from 25 CBPR projects in nine states in the southeastern United States in 2012. There were variations in perceptions associated with team member identity as professional or community investigators. Perceptions identified by professional and community investigators as promoting or discouraging scientific integrity in CBPR included external pressures, community participation, funding, quality control and supervision, communication, training, and character and trust. Some perceptions, such as communication and training, promoted scientific integrity, whereas others, such as a lack of funds or a lack of trust, could discourage it. These results demonstrate that one of the most important perceptions in maintaining scientific integrity in CBPR is active community participation, which enables a co-responsibility by scientists and community members to provide oversight for scientific integrity. Credible CBPR science is crucial for empowering vulnerable communities to be heard by those in positions of power and policy making. © 2015 Society for Public Health Education.

  17. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    Science.gov (United States)

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a singular framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription factor driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application software (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied in analysis of actual expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Analysis of thermal-plastic response of shells of revolution by numerical integration

    International Nuclear Information System (INIS)

    Leonard, J.W.

    1975-01-01

    An economic technique for the numerical analysis of the elasto-plastic behaviour of shells of revolution would be of considerable value in the nuclear reactor industry. A numerical method based on the numerical integration of the governing shell equations has been shown, for elastic cases, to be more efficient than the finite element method when applied to shells of revolution. In the numerical integration method, the governing differential equations of motion are converted into a set of initial-value problems. Each initial-value problem is integrated numerically between meridional boundary points and recombined so as to satisfy boundary conditions. For large-deflection elasto-plastic behaviour, the equations are nonlinear and, hence, are recombined in an iterative manner using the Newton-Raphson procedure. Suppression techniques are incorporated in order to eliminate extraneous solutions within the numerical integration procedure. The Reissner-Meissner shell theory for shells of revolution is adopted to account for large deflection and higher-order rotation effects. The computer modelling of the equations is quite general in that specific shell segment geometries, e.g. cylindrical, spherical, toroidal, conical segments, and any combinations thereof can be handled easily. (Auth.)
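
    The numerical integration approach described above, converting the boundary-value problem into initial-value problems, integrating between boundary points, and correcting with Newton-Raphson, can be illustrated with a shooting method on a toy second-order ODE (a stand-in for illustration, not the Reissner-Meissner shell equations):

```python
import numpy as np

# Toy BVP standing in for the shell equations: y'' = -y + 1, y(0) = 0, y(1) = 0.
# Shooting: guess the unknown initial slope s, integrate the IVP with RK4,
# then correct s with Newton-Raphson on the boundary residual y(1; s).

def rk4(f, y0, t0, t1, n=200):
    """Integrate y' = f(t, y) from t0 to t1 with n fixed RK4 steps."""
    h = (t1 - t0) / n
    y, t = np.array(y0, float), t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

def f(t, y):                       # state y = [y, y']
    return np.array([y[1], -y[0] + 1.0])

def residual(s):                   # boundary mismatch at t = 1 for slope guess s
    return rk4(f, [0.0, s], 0.0, 1.0)[0]

s = 0.0
for _ in range(20):                # Newton-Raphson with finite-difference derivative
    r = residual(s)
    dr = (residual(s + 1e-6) - r) / 1e-6
    s -= r / dr
    if abs(r) < 1e-10:
        break

# Analytic check: y = 1 - cos t + B sin t with B = (cos 1 - 1)/sin 1, so y'(0) = B.
```

    For the elasto-plastic shell problem the same structure applies, except the state vector, the right-hand side and the boundary residual come from the governing shell equations, and suppression techniques are added to control extraneous solutions.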

  19. [A web-based integrated clinical database for laryngeal cancer].

    Science.gov (United States)

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer that provides an information platform for clinical and fundamental research and meets the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, laryngeal cancer specialist characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system follows clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database schema integrates laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. By following clinical data standards and handling clinical data in a structured way, the database can better meet the needs of scientific research and facilitate information exchange, and the patient and tumor information it collects is highly informative. In addition, users can access and manipulate the database conveniently over the Internet.
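
    The kind of relational layout such a database implies, a clinical record linked to tumor genetic findings, can be sketched as follows. Table and column names are assumptions for illustration only; the actual system uses Apache+PHP+MySQL and clinical data standards, whereas this sketch uses SQLite for self-containment.

```python
import sqlite3

# Illustrative schema: one clinical record per patient, joined to any number
# of tumor genetic findings. Names and sample values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    diagnosis  TEXT
);
CREATE TABLE genetic_finding (
    finding_id INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patient(patient_id),
    gene       TEXT NOT NULL,
    variant    TEXT
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'anon', 'laryngeal carcinoma')")
conn.execute("INSERT INTO genetic_finding VALUES (1, 1, 'TP53', 'R175H')")

# Query clinical and genetic information together, as the integrated forms do.
rows = conn.execute("""
    SELECT p.diagnosis, g.gene
    FROM patient p
    JOIN genetic_finding g ON g.patient_id = p.patient_id
""").fetchall()
```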

  20. TRAC-CFD code integration and its application to containment analysis

    International Nuclear Information System (INIS)

    Tahara, M.; Arai, K.; Oikawa, H.

    2004-01-01

    Several safety systems utilizing natural driving force have been recently adopted for operating reactors, or applied to next-generation reactor design. Examples of these safety systems are the Passive Containment Cooling System (PCCS) and the Drywell Cooler (DWC) for removing decay heat, and the Passive Auto-catalytic Recombiner (PAR) for removing flammable gas in reactor containment during an accident. DWC is used in almost all Boiling Water Reactors (BWR) in service. PAR has been introduced for some reactors in Europe and will be introduced for Japanese reactors. PCCS is a safety device of next-generation BWR. The functional mechanism of these safety systems is closely related to the transient of the thermal-hydraulic condition of the containment atmosphere. The performance depends on the containment atmospheric condition, which is eventually affected by the mass and energy changes caused by the safety system. Therefore, the thermal fluid dynamics in the containment vessel should be appropriately considered in detail to properly estimate the performance of these systems. A computational fluid dynamics (CFD) code is useful for evaluating detailed thermal hydraulic behavior related to this equipment. However, it also requires a considerable amount of computational resources when it is applied to whole containment system transient analysis. The paper describes the method and structure of the integrated analysis tool, and discusses the results of its application to the start-up behavior analysis of a containment cooling system, a drywell local cooler. The integrated analysis code was developed and applied to estimate the DWC performance during a severe accident. The integrated analysis tool is composed of three codes, TRAC-PCV, CFD-DW and TRAC-CC, and analyzes the interaction of the natural convection and steam condensation of the DWC as well as analyzing the thermal hydraulic transient behavior of the containment vessel during a severe accident in detail. The

  1. Paper-Plastic Hybrid Microfluidic Device for Smartphone-Based Colorimetric Analysis of Urine.

    Science.gov (United States)

    Jalal, Uddin M; Jin, Gyeong Jun; Shim, Joon S

    2017-12-19

    In this work, a disposable paper-plastic hybrid microfluidic lab-on-a-chip (LOC) has been developed and successfully applied for the colorimetric measurement of urine by the smartphone-based optical platform using a "UrineAnalysis" Android app. The developed device was cost-effectively implemented as a stand-alone hybrid LOC by incorporating the paper-based conventional reagent test strip inside the plastic-based LOC microchannel. The LOC device quantitatively investigated the small volume (40 μL) of urine analytes for the colorimetric reaction of glucose, protein, pH, and red blood cell (RBC) in integration with the finger-actuating micropump. On the basis of our experiments, the conventional urine strip showed large deviation as the reaction time goes by, because dipping the strip sensor in a bottle of urine could not control the reaction volume. By integrating the strip sensor in the LOC device for urine analysis, our device significantly improves the time-dependent inconstancy of the conventional dipstick-based urine strip, and the smartphone app used for image analysis enhances the visual assessment of the test strip, which is a major user concern for the colorimetric analysis in point-of-care (POC) applications. As a result, the user-friendly LOC, which is successfully implemented in a disposable format with the smartphone-based optical platform, may be applicable as an effective tool for rapid and qualitative POC urinalysis.
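
    The colorimetric step, mapping a photographed test-pad color to a concentration, can be sketched as a nearest-reference lookup in RGB space. The calibration colors and concentration levels below are hypothetical values for illustration, not data from the UrineAnalysis app.

```python
import math

# Hypothetical calibration: mean RGB of a glucose test pad photographed at
# known concentrations (mg/dL). Values are illustrative, not from the paper.
reference = {
    0:   (210, 200, 120),
    100: (170, 180, 110),
    250: (120, 160, 100),
    500: ( 80, 130,  90),
}

def classify(rgb):
    """Map a measured pad color to the nearest calibrated concentration."""
    # Euclidean distance in RGB space; a real app might work in a
    # perceptual color space instead.
    return min(reference, key=lambda c: math.dist(reference[c], rgb))

# A pixel average close to the 250 mg/dL reference color:
level = classify((125, 158, 102))
```

    In practice the measured RGB would be the average over the pad region segmented from the smartphone image, taken at a fixed reaction time, which is exactly the timing control the integrated LOC provides.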

  2. Thermodynamic Analysis of a Woodchips Gasification Integrated with Solid Oxide Fuel Cell and Stirling Engine

    DEFF Research Database (Denmark)

    Rokni, Masoud

    2013-01-01

    An integrated gasification Solid Oxide Fuel Cell (SOFC) and Stirling engine system for combined heat and power application is analysed. The target for electricity production is 120 kW. Woodchips are used as gasification feedstock to produce syngas which is utilized for feeding the SOFC stacks for electricity...... and suggested. Thermodynamic analysis shows that a thermal efficiency of 42.4% based on LHV (lower heating value) can be achieved. Different parameter studies are performed to analyse system behaviour under different conditions. The analyses show that increasing fuel mass flow from the design point results...

  3. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    A new approach has been developed for the semiconductor industry; however, as with most new technologies, its applicability extends to many other areas as well, including environmental, pharmaceutical, clinical and industrial chemical processing. This instrumental system represents a fundamentally new approach. Sample preparation has been integrated as a key system element to enable automation of the instrument system. It had long been believed that a fully automated, integrated system was not feasible if a powerful MS system were included. This application demonstrates one of the first fully automated and integrated sample preparation and mass spectrometric analysis systems applied to practical use. The system is also a broad and ambitious mass-based analyzer, capable of not only elemental but also direct speciated analysis. The complete analytical suite covering inorganic, organic, organo-metallic and speciated analytes is being applied for critical contamination control of semiconductor processes. As with new paradigm technologies, it will now extend from its current use into other applications needing real-time, fully automated multi-component analysis. Refs. 4 (author)

  4. Plant-wide integrated equipment monitoring and analysis system

    International Nuclear Information System (INIS)

    Morimoto, C.N.; Hunter, T.A.; Chiang, S.C.

    2004-01-01

    A nuclear power plant equipment monitoring system monitors plant equipment and reports deteriorating equipment conditions. The more advanced equipment monitoring systems can also provide information for understanding the symptoms and diagnosing the root cause of a problem. Maximizing the equipment availability and minimizing or eliminating consequential damages are the ultimate goals of equipment monitoring systems. GE Integrated Equipment Monitoring System (GEIEMS) is designed as an integrated intelligent monitoring and analysis system for plant-wide application for BWR plants. This approach reduces system maintenance efforts and equipment monitoring costs and provides information for integrated planning. This paper describes GEIEMS and how the current system is being upgraded to meet General Electric's vision for plant-wide decision support. (author)

  5. The integration of expert-defined importance factors to enrich Bayesian Fault Tree Analysis

    International Nuclear Information System (INIS)

    Darwish, Molham; Almouahed, Shaban; Lamotte, Florent de

    2017-01-01

    This paper proposes an analysis of a hybrid Bayesian-Importance model for system designers to improve the quality of services related to Active Assisted Living Systems. The proposed model is based on two factors: failure probability measure of different service components and, an expert defined degree of importance that each component holds for the success of the corresponding service. The proposed approach advocates the integration of expert-defined importance factors to enrich the Bayesian Fault Tree Analysis (FTA) approach. The evaluation of the proposed approach is conducted using the Fault Tree Analysis formalism where the undesired state of a system is analyzed using Boolean logic mechanisms to combine a series of lower-level events.
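
    The fault-tree combination of lower-level events through Boolean logic, enriched with expert importance weights, can be sketched as follows. The event names, probabilities and the simple probability-times-importance weighting are illustrative assumptions, not the paper's actual Bayesian-importance formulation.

```python
# Bottom-up evaluation of a small fault tree, assuming independent basic events.

def or_gate(probs):
    """P(at least one event occurs) for independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur) for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Basic events: (failure probability, expert-defined importance in [0, 1]).
events = {
    "sensor_fault": (0.01, 0.9),
    "network_loss": (0.05, 0.6),
    "power_outage": (0.02, 1.0),
}

# Undesired top event: service fails if (sensor fails AND network drops)
# OR power is lost.
p_top = or_gate([
    and_gate([events["sensor_fault"][0], events["network_loss"][0]]),
    events["power_outage"][0],
])

# Illustrative hybrid score: each basic event's failure probability scaled
# by the expert importance it holds for the service.
weighted = {name: p * w for name, (p, w) in events.items()}
```

    Ranking basic events by such a weighted score lets the designer prioritize components that are both failure-prone and critical to the service, which is the intuition behind enriching FTA with expert-defined importance factors.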

  6. Conceptual design of a thermo-electrical energy storage system based on heat integration of thermodynamic cycles – Part A: Methodology and base case

    International Nuclear Information System (INIS)

    Morandin, Matteo; Maréchal, François; Mercangöz, Mehmet; Buchter, Florian

    2012-01-01

    The interest in large scale electricity storage (ES) with discharging time longer than 1 h and nominal power greater than 1 MW is increasing worldwide, as the growing share of renewable energy, typically solar and wind energy, imposes severe load management issues. Thermo-electrical energy storage (TEES) based on thermodynamic cycles is currently under investigation at ABB corporate research as an alternative solution to pumped hydro and compressed air energy storage. TEES is based on the conversion of electricity into thermal energy during charge by means of a heat pump, and on the conversion of thermal energy into electricity during discharge by means of a thermal engine. The synthesis and thermodynamic optimization of a TEES system based on hot water, ice storage and transcritical CO2 cycles is discussed in two papers. In this first paper, a methodology for the conceptual design of a TEES system based on the analysis of the thermal integration between charging and discharging cycles through Pinch Analysis tools is introduced. According to this methodology, the heat exchanger network and the temperatures and volumes of the storage tanks are not defined a priori but are determined after the cycle parameters are optimized. For this purpose, a heuristic procedure based on the interpretation of the composite curves obtained by optimizing the thermal integration between the cycles was developed. These heuristic rules were implemented in a code that automatically finds the complete system design for given values of the intensive parameters of the charging and discharging cycles only. A base case system configuration is introduced and the results of its thermodynamic optimization are discussed here. A maximum roundtrip efficiency of 60% was obtained for the base case configuration, assuming turbomachinery and heat exchanger performances in line with indications from manufacturers.
    -- Highlights: ► Energy storage based on water, ice, and transcritical CO2 cycles is
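
    The Pinch Analysis step behind the composite curves can be illustrated with the classic problem table algorithm, which shifts stream temperatures by half the minimum approach and cascades heat through the temperature intervals. The four streams below are a standard textbook example, not the TEES cycle data from the paper.

```python
# Problem table algorithm sketch. Streams: (supply T, target T, CP [kW/K]).
# Hypothetical textbook data, not the charging/discharging cycle streams.
streams = [
    (170.0,  60.0, 3.0),   # hot stream
    (150.0,  30.0, 1.5),   # hot stream
    ( 20.0, 135.0, 2.0),   # cold stream
    ( 80.0, 140.0, 4.0),   # cold stream
]
dT_min = 10.0

# Shift hot temperatures down and cold temperatures up by dT_min / 2.
shifted = []
for ts, tt, cp in streams:
    shift = -dT_min / 2 if ts > tt else dT_min / 2
    shifted.append((ts + shift, tt + shift, cp, ts > tt))

bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

# Heat cascade: cumulative surplus/deficit over each temperature interval.
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for ts, tt, cp, is_hot in shifted:
        top, bot = max(ts, tt), min(ts, tt)
        if bot <= lo and hi <= top:            # stream spans this interval
            net += cp * (hi - lo) if is_hot else -cp * (hi - lo)
    heat += net
    cascade.append(heat)

q_hot_min = -min(min(cascade), 0.0)            # minimum hot utility
q_cold_min = cascade[-1] + q_hot_min           # minimum cold utility
```

    For this stream set the cascade gives a minimum hot utility of 20 kW and a minimum cold utility of 60 kW; in the TEES methodology the analogous cascade over the charging and discharging cycles determines storage temperatures and volumes after the cycle parameters are optimized.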

  7. SIGMA: A System for Integrative Genomic Microarray Analysis of Cancer Genomes

    Directory of Open Access Journals (Sweden)

    Davies Jonathan J

    2006-12-01

    Full Text Available Abstract Background The prevalence of high resolution profiling of genomes has created a need for the integrative analysis of information generated from multiple methodologies and platforms. Although the majority of data in the public domain are gene expression profiles, and expression analysis software is available, the increase in array CGH studies has enabled integration of high throughput genomic and gene expression datasets. However, tools for direct mining and analysis of array CGH data are limited. Hence, there is a great need for analytical and display software tailored to cross platform integrative analysis of cancer genomes. Results We have created a user-friendly java application to facilitate sophisticated visualization and analysis such as cross-tumor and cross-platform comparisons. To demonstrate the utility of this software, we assembled array CGH data representing Affymetrix SNP chip, Stanford cDNA arrays and whole genome tiling path array platforms for cross comparison. This cancer genome database contains 267 profiles from commonly used cancer cell lines representing 14 different tissue types. Conclusion In this study we have developed an application for the visualization and analysis of data from high resolution array CGH platforms that can be adapted for analysis of multiple types of high throughput genomic datasets. Furthermore, we invite researchers using array CGH technology to deposit both their raw and processed data, as this will be a continually expanding database of cancer genomes. This publicly available resource, the System for Integrative Genomic Microarray Analysis (SIGMA) of cancer genomes, can be accessed at http://sigma.bccrc.ca.

  8. Integration of hydrothermal carbonization and a CHP plant: Part 2 –operational and economic analysis

    International Nuclear Information System (INIS)

    Saari, Jussi; Sermyagina, Ekaterina; Kaikko, Juha; Vakkilainen, Esa; Sergeev, Vitaly

    2016-01-01

    Wood-fired combined heat and power (CHP) plants are a proven technology for producing domestic, carbon-neutral heat and power in Nordic countries. One drawback of CHP plants is the low capacity factors due to varying heat loads. In the current economic environment, uncertainty over energy prices also creates uncertainty over investment profitability. Hydrothermal carbonization (HTC) is a promising thermochemical conversion technology for producing an improved, more versatile wood-based fuel. Integrating HTC with a CHP plant allows simplifying the HTC process and extending the CHP plant operating time. An integrated polygeneration plant producing three energy products is also less sensitive to price changes in any one product. This study compares three integration cases chosen from the previous paper, and the case of separate stand-alone plants. The best economic performance is obtained using pressurized hot water from the CHP plant boiler drum as HTC process water. This has the poorest efficiency, but allows the greatest cost reduction in the HTC process and the longest CHP plant operating time. The result demonstrates the suitability of CHP plants for integration with a HTC process, and the importance of economic and operational analysis considering annual load variations in sufficient detail. - Highlights: • Integration of wood hydrothermal carbonization with a small CHP plant studied. • Operation and economics of three concepts and stand-alone plants are compared. • Sensitivity analysis is performed. • Results show technical and thermodynamic analysis inadequate and misleading alone. • Minimizing HTC investment, extending CHP operating time important for profitability.

  9. Integrating health and environmental impact analysis

    DEFF Research Database (Denmark)

    Reis, S; Morris, G.; Fleming, L. E.

    2015-01-01

    which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose...... while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding...... the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession...

  10. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  11. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  12. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of BWR, shutdown LOCA of BWR, steam generator tube rupture of a PWR and BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed under a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion of the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  13. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of BWR, shutdown LOCA of BWR, steam generator tube rupture of a PWR and BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed under a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion of the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  14. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of the neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given on how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors could be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
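
    The derivation of a safety factor at a chosen confidence level from a distribution of prediction uncertainties can be sketched as follows. The C/E (calculated-over-experimental) ratios, the pooled-spread estimate and the normality assumption are illustrative stand-ins, not the paper's data or its exact NDF construction.

```python
import math
from statistics import NormalDist, fmean

# Hypothetical C/E ratios for one response R, one per integral experiment,
# with per-experiment measurement standard deviations.
ce_ratios = [0.95, 1.02, 0.91, 0.97, 1.04, 0.93]
sigmas    = [0.03, 0.04, 0.02, 0.03, 0.05, 0.04]

mu = fmean(ce_ratios)                 # global mean prediction uncertainty

# Spread combines experiment-to-experiment scatter with measurement error.
spread = math.sqrt(
    fmean([(x - mu) ** 2 for x in ce_ratios])
    + fmean([s ** 2 for s in sigmas])
)

def safety_factor(confidence=0.90):
    """Factor by which to scale a calculated response so that, at the given
    confidence, it does not fall short of the true value (normal NDF assumed)."""
    z = NormalDist().inv_cdf(confidence)
    lower = mu - z * spread           # one-sided lower bound on C/E
    return max(1.0, 1.0 / lower)      # if C/E may be below 1, scale designs up
```

    With these illustrative numbers, mu is 0.97 (the code underpredicts on average) and the 90% safety factor comes out a little above 1.1; a higher requested confidence widens the bound and raises the factor, which is the trade-off designers choose.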

  15. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate......, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect......) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts....

  16. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed the essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior remains mostly unanalyzed. Recent technological advances in mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on phosphoproteome dynamics accelerate the generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which were first applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened the door to systems analysis of signaling networks involved in immune response and cancer.

  17. Integrating sentiment analysis and term associations with geo-temporal visualizations on customer feedback streams

    Science.gov (United States)

    Hao, Ming; Rohrdantz, Christian; Janetzko, Halldór; Keim, Daniel; Dayal, Umeshwar; Haug, Lars-Erik; Hsu, Mei-Chun

    2012-01-01

    Twitter currently receives over 190 million tweets (small text-based Web posts) and manufacturing companies receive over 10 thousand web product surveys a day, in which people share their thoughts regarding a wide range of products and their features. A large number of tweets and customer surveys include opinions about products and services. However, with Twitter being a relatively new phenomenon, these tweets are underutilized as a source for determining customer sentiments. To explore high-volume customer feedback streams, we integrate three time series-based visual analysis techniques: (1) feature-based sentiment analysis that extracts, measures, and maps customer feedback; (2) a novel idea of term associations that identify attributes, verbs, and adjectives frequently occurring together; and (3) new pixel cell-based sentiment calendars, geo-temporal map visualizations and self-organizing maps to identify co-occurring and influential opinions. We have combined these techniques into a well-fitted solution for an effective analysis of large customer feedback streams such as for movie reviews (e.g., Kung-Fu Panda) or web surveys (buyers).
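    The term-association idea from technique (2) can be sketched as simple co-occurrence counting: terms that appear together in many feedback items are reported as associated. The reviews and threshold below are invented for illustration; the paper's actual implementation is more sophisticated (it distinguishes attributes, verbs, and adjectives).

```python
from collections import Counter
from itertools import combinations

def term_associations(feedback, min_count=2):
    """Count term pairs that frequently occur together in feedback items
    (a simplified stand-in for the paper's term-association technique)."""
    pair_counts = Counter()
    for text in feedback:
        # Deduplicate and sort so each unordered pair is counted once per item.
        terms = sorted(set(text.lower().split()))
        pair_counts.update(combinations(terms, 2))
    return [(pair, n) for pair, n in pair_counts.most_common() if n >= min_count]

reviews = [
    "battery life is great",
    "great battery but poor screen",
    "screen quality is poor",
]
assoc = dict(term_associations(reviews))
```

With real data one would filter stop words and weight by part of speech, but the co-occurrence core stays the same.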

  18. INS integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  19. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    International Nuclear Information System (INIS)

    Lythcke-Jørgensen, Christoffer; Haglind, Fredrik; Clausen, Lasse R.

    2014-01-01

    Highlights: • We model a system where lignocellulosic ethanol production is integrated with a combined heat and power (CHP) plant. • We conduct an exergy analysis for the ethanol production in six different system operation points. • Integrated operation, district heating (DH) production and low CHP loads all increase the exergy efficiency. • Separate operation has the largest negative impact on the exergy efficiency. • Operation is found to have a significant impact on the exergy efficiency of the ethanol production. - Abstract: Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy-intensive nature. The objective of this study is to investigate potential irreversibilities from such integration and their impact on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible, and a gas boiler is used as back-up when integration is not possible. The system was evaluated at six operation points that varied three operation parameters: load in the CHP unit, integrated versus separate operation, and inclusion of district heating production in the ethanol facility. The calculated standard exergy efficiency of the ethanol facility varied from 0.564 to 0.855, of which the highest was obtained for integrated operation at reduced CHP load and full district heating production in the ethanol facility, and the lowest for separate operation with zero district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the

  20. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  1. A critical review of survey-based research in supply chain integration

    NARCIS (Netherlands)

    van der Vaart, Taco; van Donk, Dirk Pieter

    Supply chain (SC) integration is considered one of the major factors in improving performance. Based upon some concerns regarding the constructs, measurements and items used, this paper analyses survey-based research with respect to the relationship between SC integration and performance. The review

  2. Double path-integral migration velocity analysis: a real data example

    International Nuclear Information System (INIS)

    Costa, Jessé C; Schleicher, Jörg

    2011-01-01

    Path-integral imaging forms an image with no knowledge of the velocity model by summing over the migrated images obtained for a set of migration velocity models. Double path-integral imaging migration extracts the stationary velocities, i.e. those velocities at which common-image gathers align horizontally, as a byproduct. An application of the technique to a real data set demonstrates that quantitative information about the time migration velocity model can be determined by double path-integral migration velocity analysis. Migrated images using interpolations with different regularizations of the extracted velocities prove the high quality of the resulting time-migration velocity information. The so-obtained velocity model can then be used as a starting model for subsequent velocity analysis tools like migration tomography or other tomographic methods
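    The stationary-velocity extraction can be illustrated with a toy weighted stack: summing migrated amplitudes over trial velocities, and dividing a velocity-weighted sum by the plain sum, recovers the velocity at which the image focuses. This is a heavily simplified sketch with synthetic data; actual double path-integral migration applies oscillatory weights to the image stack, which this toy omits.

```python
import math

def stationary_velocity(velocities, images):
    """Toy double path-integral: for each image point x, the velocity-weighted
    stack divided by the plain path-integral stack estimates the stationary
    (best-focusing) migration velocity. images[i][x] is the migrated
    amplitude at point x for velocities[i]."""
    npts = len(images[0])
    est = []
    for x in range(npts):
        s0 = sum(img[x] for img in images)                       # plain stack
        s1 = sum(v * img[x] for v, img in zip(velocities, images))
        est.append(s1 / s0 if abs(s0) > 1e-12 else None)
    return est

# Synthetic example: one image point whose migrated amplitude peaks sharply
# around the (known) true velocity of 2.0 km/s.
vels = [1.6, 1.8, 2.0, 2.2, 2.4]
imgs = [[math.exp(-((v - 2.0) / 0.1) ** 2)] for v in vels]
est = stationary_velocity(vels, imgs)
```

Because the amplitude acts as a weight concentrated near the focusing velocity, the ratio lands on that velocity without ever picking velocities manually.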

  3. Life-cycle analysis of product integrated polymer solar cells

    DEFF Research Database (Denmark)

    Espinosa Martinez, Nieves; García-Valverde, Rafael; Krebs, Frederik C

    2011-01-01

    A life cycle analysis (LCA) on a product integrated polymer solar module is carried out in this study. These assessments are well-known to be useful in developmental stages of a product in order to identify the bottlenecks for the up-scaling in its production phase for several aspects spanning from...... economics through design to functionality. An LCA study was performed to quantify the energy use and greenhouse gas (GHG) emissions from electricity use in the manufacture of a light-weight lamp based on a plastic foil, a lithium-polymer battery, a polymer solar cell, printed circuitry, blocking diode......, switch and a white light emitting semiconductor diode. The polymer solar cell employed in this prototype presents a power conversion efficiency in the range of 2 to 3% yielding energy payback times (EPBT) in the range of 1.3–2 years. Based on this it is worthwhile to undertake a life-cycle study...
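    The energy payback time quoted above is, in its simplest form, the embedded manufacturing energy divided by the energy the device delivers per year. A minimal sketch with invented numbers (not values from the study; a full LCA would also convert delivered electricity to primary-energy equivalents):

```python
def energy_payback_time(embedded_energy_mj, annual_output_kwh):
    """Energy payback time in years: embedded (manufacturing) energy divided
    by the energy the device delivers annually. Simplified: no primary-energy
    conversion factor is applied."""
    MJ_PER_KWH = 3.6
    annual_output_mj = annual_output_kwh * MJ_PER_KWH
    return embedded_energy_mj / annual_output_mj

# Hypothetical example: 36 MJ embedded energy, 5 kWh delivered per year.
epbt_years = energy_payback_time(36.0, 5.0)
```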

  4. Complex Behavior in an Integrate-and-Fire Neuron Model Based on Small World Networks

    International Nuclear Information System (INIS)

    Lin Min; Chen Tianlun

    2005-01-01

    Based on our previously developed pulse-coupled integrate-and-fire neuron model in small-world networks, we investigate the complex behavior of electroencephalographic (EEG)-like activities produced by such a model. We find that the EEG-like activities have obvious chaotic characteristics. We also analyze the complex behavior of the EEG-like signals using spectral analysis, phase-space reconstruction, the correlation dimension, and so on.
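    A minimal pulse-coupled leaky integrate-and-fire simulation on a Watts-Strogatz small-world topology can illustrate the model class. All parameters, function names, and the aggregate "EEG-like" signal below are invented for this sketch, not taken from the paper.

```python
import random

def small_world(n, k, p, rng):
    """Watts-Strogatz small-world topology: a ring lattice with k neighbours
    per side, each edge rewired to a random node with probability p."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < p:                      # rewire this edge
                b = rng.randrange(n)
                while b == a or (min(a, b), max(a, b)) in edges:
                    b = rng.randrange(n)
            edges.add((min(a, b), max(a, b)))
    nbrs = {i: set() for i in range(n)}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    return nbrs

def simulate(n=50, k=2, p=0.1, steps=200, seed=1):
    """Pulse-coupled leaky integrate-and-fire dynamics. Returns the number of
    spikes per time step, an EEG-like aggregate activity signal."""
    rng = random.Random(seed)
    nbrs = small_world(n, k, p, rng)
    v = [rng.random() for _ in range(n)]              # membrane potentials
    leak, drive, pulse, vth = 0.95, 0.06, 0.02, 1.0   # illustrative parameters
    activity = []
    for _ in range(steps):
        fired = [i for i in range(n) if v[i] >= vth]
        for i in fired:
            v[i] = 0.0                                # reset after spiking
            for j in nbrs[i]:
                v[j] += pulse                         # pulse coupling
        v = [leak * x + drive for x in v]             # leaky integration + drive
        activity.append(len(fired))
    return activity

activity = simulate()
```

The `activity` series is the kind of aggregate signal one would then feed into spectral analysis or phase-space reconstruction.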

  5. An integrated approach for integrated intelligent instrumentation and control system (I3CS)

    International Nuclear Information System (INIS)

    Jung, C.H.; Kim, J.T.; Kwon, K.C.

    1997-01-01

    To guarantee public safety, nuclear power plants should be designed to reduce operator intervention (and the resulting operating human errors), identify process states in transients, and aid operators' decision making and guide their actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, incorporating advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I3CS) for Korea's next-generation nuclear power plants. I3CS bases its integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs

  6. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    Science.gov (United States)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
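    Waveform matching, mentioned above as a sensitivity-enhancing technique, is commonly implemented as normalized cross-correlation of a template event against a longer recording. The sketch below shows that core operation only; SIG-VISA itself performs Bayesian inference on a full signal model rather than plain template matching, and all data here are synthetic.

```python
import math

def normalized_xcorr(signal, template):
    """Slide a template over a longer signal and return the lag with the
    highest normalized cross-correlation, plus that correlation value."""
    m = len(template)
    t_mean = sum(template) / m
    t0 = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t0))
    best_lag, best_cc = 0, -1.0
    for lag in range(len(signal) - m + 1):
        win = signal[lag:lag + m]
        w_mean = sum(win) / m
        w0 = [x - w_mean for x in win]
        w_norm = math.sqrt(sum(x * x for x in w0))
        if w_norm == 0:
            continue                      # flat window: correlation undefined
        cc = sum(a * b for a, b in zip(w0, t0)) / (w_norm * t_norm)
        if cc > best_cc:
            best_lag, best_cc = lag, cc
    return best_lag, best_cc

# Synthetic example: the template is embedded in the signal at lag 5.
template = [0.0, 1.0, 0.0, -1.0]
signal = [0.0] * 5 + template + [0.0] * 5
lag, cc = normalized_xcorr(signal, template)
```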

  7. Astrophysics Laboratory-Based Lecture Material Development of Solarscope with Integration and Interconnection

    Directory of Open Access Journals (Sweden)

    Asih Melati

    2015-12-01

    Full Text Available The development of laboratory-based lecture materials with integration and interconnection values is required for study and practical materials, in line with the vision and mission of UIN Sunan Kalijaga. As a result, optimization of the laboratory's equipment is urgently needed. Although the UIN Sunan Kalijaga laboratory has had a Solarscope telescope, whose guidebook is in German, for six years, it has not been used optimally, even though it can be used to observe astronomical objects economically and accurately and is easy to operate. Based on the above, this research proposes a lab-work module for the Solarscope with integration and interconnection values. The research used the 4D methodology (Define, Design, Develop and Disseminate) and passed the assessment and validation phase by experts in materials, media, and integrated-interconnected values. Data analysis of the module, mapped on a five-point scale following Sukarja, resulted in a good grade from the material experts (80% of the ideal mark), with most complaints concerning unclear formula typesetting in the derivations. The media experts scored the module very good (88.89% of the ideal mark) regarding its content and figures. Lastly, the integrated-interconnected value experts gave a good grade (73.50% of the ideal mark) and suggested adding supporting verses of the Al-Qur'an and relevant explanations of its passages. With these assessment results, the module can be used as astrophysics lab-work material and to support students' research with integration-interconnection values, and it enhances the university's book collection in support of the vision and mission of UIN Sunan Kalijaga

  8. Pathway Relevance Ranking for Tumor Samples through Network-Based Data Integration.

    Directory of Open Access Journals (Sweden)

    Lieven P C Verbeke

    Full Text Available The study of cancer, a highly heterogeneous disease with different causes and clinical outcomes, requires a multi-angle approach and the collection of large multi-omics datasets that, ideally, should be analyzed simultaneously. We present a new pathway relevance ranking method that is able to prioritize pathways according to the information contained in any combination of tumor related omics datasets. Key to the method is the conversion of all available data into a single comprehensive network representation containing not only genes but also individual patient samples. Additionally, all data are linked through a network of previously identified molecular interactions. We demonstrate the performance of the new method by applying it to breast and ovarian cancer datasets from The Cancer Genome Atlas. By integrating gene expression, copy number, mutation and methylation data, the method's potential to identify key pathways involved in breast cancer development shared by different molecular subtypes is illustrated. Interestingly, certain pathways were ranked equally important for different subtypes, even when the underlying (epi)genetic disturbances were diverse. Next to prioritizing universally high-scoring pathways, the pathway ranking method was able to identify subtype-specific pathways. Often the score of a pathway could not be motivated by a single mutation, copy number or methylation alteration, but rather by a combination of genetic and epigenetic disturbances, stressing the need for a network-based data integration approach. The analysis of ovarian tumors, as a function of survival-based subtypes, demonstrated the method's ability to correctly identify key pathways, irrespective of tumor subtype. A differential analysis of survival-based subtypes revealed several pathways with higher importance for the bad-outcome patient group than for the good-outcome patient group. Many of the pathways exhibiting higher importance for the bad

  9. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    Directory of Open Access Journals (Sweden)

    Florian Schumacher

    2016-01-01

    Full Text Available Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from a user's and programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  10. Coupled hydrodynamic-structural analysis of an integral flowing sodium test loop in the TREAT reactor

    International Nuclear Information System (INIS)

    Zeuch, W.R.; A-Moneim, M.T.

    1979-01-01

    A hydrodynamic-structural response analysis of the Mark-IICB loop was performed for the TREAT (Transient Reactor Test Facility) test AX-1. Test AX-1 is intended to provide information concerning the potential for a vapor explosion in an advanced-fueled LMFBR. The test will be conducted in TREAT with unirradiated uranium-carbide fuel pins in the Mark-IICB integral flowing sodium loop. Our analysis addressed the ability of the experimental hardware to maintain its containment integrity during the reference accident postulated for the test. Based on a thermal-hydraulics analysis and assumptions for fuel-coolant interaction in the test section, a pressure pulse of 144 MPa maximum pressure and pulse width of 1.32 ms has been calculated as the reference accident. The response of the test loop to the pressure transient was obtained with the ICEPEL and STRAW codes. Modelling of the test section was completed with STRAW and the remainder of the loop was modelled by ICEPEL

  11. Enhanced IMC based PID controller design for non-minimum phase (NMP) integrating processes with time delays.

    Science.gov (United States)

    Ghousiya Begum, K; Seshagiri Rao, A; Radhakrishnan, T K

    2017-05-01

    Internal model control (IMC) with an optimal H2 minimization framework is proposed in this paper for the design of proportional-integral-derivative (PID) controllers. The controller design is addressed for integrating and double integrating time delay processes with right half plane (RHP) zeros. The Blaschke product is used to derive the optimal controller. There is a single adjustable closed loop tuning parameter for controller design. Systematic guidelines are provided for selection of this tuning parameter based on maximum sensitivity. Simulation studies have been carried out on various integrating time delay processes to show the advantages of the proposed method. The proposed controller provides enhanced closed loop performances when compared to recently reported methods in the literature. Quantitative comparative analysis has been carried out using the performance indices, Integral Absolute Error (IAE) and Total Variation (TV). Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
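    The two performance indices used in the comparison are straightforward to compute from a simulated closed-loop trajectory. A minimal sketch (the trajectory values below are invented; real studies obtain them from closed-loop simulation):

```python
def iae(setpoint, outputs, dt):
    """Integral Absolute Error: the time integral of |setpoint - y(t)|,
    approximated here by a rectangle rule over sampled outputs."""
    return sum(abs(setpoint - y) for y in outputs) * dt

def total_variation(controls):
    """Total Variation: the cumulative movement of the controller output u,
    a measure of how aggressively the controller acts."""
    return sum(abs(u2 - u1) for u1, u2 in zip(controls, controls[1:]))

# Hypothetical sampled trajectories (dt = 0.1 s).
iae_val = iae(1.0, [0.0, 0.5, 1.0], 0.1)
tv_val = total_variation([0.0, 1.0, 0.5])
```

Lower IAE indicates tighter tracking; lower TV indicates smoother control action, so the two indices together capture the performance/effort trade-off.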

  12. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    Science.gov (United States)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  13. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  14. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused by using different ways in heterogeneous systems to express the same entity in reality. This prevents information integration from accomplishing semantic coherence. Since ontology helps to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.
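    The ontology-based mapping idea can be sketched as a lookup that rewrites each source's local schema terms into shared canonical concepts before integration. The schema terms, concept names, and functions below are invented for illustration; a real system would use a formal ontology rather than a flat dictionary.

```python
# Toy ontology: each local schema term maps to a shared canonical concept,
# resolving naming conflicts between heterogeneous sources.
ONTOLOGY = {
    "cust_name": "CustomerName",
    "client":    "CustomerName",
    "dob":       "BirthDate",
    "birth_dt":  "BirthDate",
}

def to_canonical(record):
    """Rewrite a record's keys into ontology concepts; unmapped keys pass through."""
    return {ONTOLOGY.get(k, k): v for k, v in record.items()}

def merge(rec_a, rec_b):
    """Integrate two source records once both are expressed canonically."""
    merged = to_canonical(rec_a)
    merged.update(to_canonical(rec_b))
    return merged

merged = merge({"cust_name": "Ada"}, {"birth_dt": "1815"})
```

Because both `cust_name` and `client` resolve to the same concept, records from the two systems no longer conflict on naming once canonicalized.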

  15. Argo: an integrative, interactive, text mining-based workbench supporting curation

    Science.gov (United States)

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. As a use case, we demonstrate the functionality of an in

  16. Developments of integrity evaluation technology for pressurized components in nuclear power plant and IT based integrity evaluation system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Choi, Jae Boong; Shim, Do Jun [Sungkyunkwan Univ., Seoul (Korea, Republic of)] (and others)

    2003-03-15

    The objective of this research is to develop an efficient evaluation technology and to investigate the applicability of newly developed technologies, such as an internet-based cyber platform, to operating power plants. Efficient evaluation systems for nuclear power plant components, based on structural integrity assessment techniques, are increasingly in demand for safe operation as the operating periods of nuclear power plants increase. The following five topics are covered in this project: development of an assessment method for wall-thinned nuclear piping based on pipe tests; development of a structural integrity program for steam generator tubes with cracks of various shapes; development of a fatigue life evaluation system for main components of NPPs; development of an internet-based cyber platform and integrity program for primary components of NPPs; and the effect of aging on the strength of dissimilar welds.

  17. Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.

    Science.gov (United States)

    He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk

    2014-10-01

    Radio frequency identification (RFID) technology has been widely adopted and deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. With increasingly stringent security and privacy requirements for RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet the requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis has been conducted to show the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed scheme is analyzed in terms of computational cost, communication cost, and storage requirement.

  18. Implementation of a variable-step integration technique for nonlinear structural dynamic analysis

    International Nuclear Information System (INIS)

    Underwood, P.; Park, K.C.

    1977-01-01

    The paper presents the implementation of a recently developed unconditionally stable implicit time integration method in a production computer code for the transient response analysis of nonlinear structural dynamic systems. The time integrator is packaged with two significant features: a step size that is automatically determined, and is varied without additional matrix refactorizations. The equations of motion solved by the time integrator must be cast in pseudo-force form, and this provides the mechanism for controlling the step size. Step size control is accomplished by extrapolating the pseudo-force to the next time (the predicted pseudo-force), performing the integration step, and then recomputing the pseudo-force based on the current solution (the corrected pseudo-force); from these data an error norm is constructed, whose value determines the step size for the next step. To avoid refactoring the required matrix with each step size change, a matrix scaling technique is employed, which allows step sizes to change by a factor of 100 without refactoring. If during a computer run the integrator determines it can run with a step size larger than 100 times the original minimum step size, the matrix is refactored to take advantage of the larger step size. The strategies for effecting these features are discussed in detail. (Auth.)
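    The predictor/corrector step-size control described above can be sketched on a scalar model problem: extrapolate the pseudo-force, take the step, recompute the force from the new solution, and let the mismatch drive the next step size. This illustrates the control logic only; the actual method operates on matrix structural equations and uses scaling to avoid refactorization, and all tolerances and names here are invented.

```python
def integrate_adaptive(f_pseudo, y0, t_end, h0=0.01, tol=1e-4):
    """Adaptive time stepping driven by the predicted/corrected pseudo-force
    mismatch, on a scalar first-order model problem y' = f_pseudo(y)."""
    t, y, h = 0.0, y0, h0
    f_prev = f_pseudo(y)
    while t < t_end:
        h = min(h, t_end - t)
        f_pred = f_prev                    # predicted pseudo-force (extrapolated)
        y_new = y + h * f_pred             # take the step with the prediction
        f_corr = f_pseudo(y_new)           # corrected pseudo-force at new state
        err = abs(f_corr - f_pred) * h     # error norm from the mismatch
        if err > tol:
            h *= 0.5                       # reject: halve the step and retry
            continue
        t, y, f_prev = t + h, y_new, f_corr
        if err < 0.1 * tol:
            h *= 2.0                       # grow the step when comfortably accurate
    return y

# Model problem: exponential decay y' = -y, whose exact value at t=1 is e^-1.
y_final = integrate_adaptive(lambda y: -y, 1.0, 1.0)
```

In the structural code the same accept/reject logic applies, but the "cheap" step-size changes come from scaling the factored matrix rather than from re-evaluating a scalar.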

  19. A solar reserve methodology for renewable energy integration studies based on sub-hourly variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Eduardo; Brinkman, Gregory; Hummon, Marissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lew, Debra

    2012-07-01

    The increasing penetration of wind and solar energy is raising concerns among electric system operators because of the variability and uncertainty associated with these power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of the reserves necessary for high penetrations of photovoltaic power and compares it to the wind-based methodology. The solar reserve methodology was applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included. (orig.)
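
    The abstract does not spell out the reserve formula. One simplified way to convey the sub-hourly-variability idea is to size reserves to cover a chosen percentile of observed short-term ramps; the synthetic PV profile and the 95% coverage level below are invented for illustration, and the actual NREL methodology is considerably more elaborate (e.g., it conditions reserves on operating level).

```python
import numpy as np

# Sketch: hold enough downward reserve to cover 95% of observed
# 10-minute PV drops.  Profile and numbers are synthetic.
rng = np.random.default_rng(0)
t = np.linspace(0, 24, 144, endpoint=False)               # 10-min resolution, hours
clear_sky = np.clip(np.sin((t - 6) / 12 * np.pi), 0, None)  # idealized diurnal shape
cloud = np.clip(1 - 0.4 * rng.random(t.size), 0, 1)         # random cloudiness factor
pv = 100 * clear_sky * cloud                                # MW, 100 MW plant

ramps = np.diff(pv)                                  # 10-minute changes, MW
down_reserve = np.percentile(-ramps[ramps < 0], 95)  # cover 95% of drops
print(f"10-min down-reserve requirement: {down_reserve:.1f} MW")
```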

  20. Likelihood ratio-based integrated personal risk assessment of type 2 diabetes.

    Science.gov (United States)

    Sato, Noriko; Htun, Nay Chi; Daimon, Makoto; Tamiya, Gen; Kato, Takeo; Kubota, Isao; Ueno, Yoshiyuki; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa; Muramatsu, Masaaki

    2014-01-01

    To facilitate personalized health care for multifactorial diseases, the risks of genetic and clinical/environmental factors should be assessed together for each individual in an integrated fashion. This approach is possible with the likelihood ratio (LR)-based risk assessment system, as this system can incorporate manifold tests. We examined the usefulness of this system for assessing type 2 diabetes (T2D). Our system employed 29 genetic susceptibility variants, body mass index (BMI), and hypertension as risk factors whose LRs can be estimated from openly available T2D association data for the Japanese population. The pretest probability was set at a sex- and age-appropriate population average of diabetes prevalence. The classification performance of our LR-based risk assessment was compared to that of a non-invasive screening test for diabetes called TOPICS (scored on age, sex, family history, smoking, BMI, and hypertension) using receiver operating characteristic analysis with a community cohort (n = 1263). The area under the receiver operating characteristic curve (AUC) for the LR-based assessment and TOPICS was 0.707 (95% CI 0.665-0.750) and 0.719 (0.675-0.762), respectively. These AUCs were much higher than that of a genetic risk score constructed using the same genetic susceptibility variants, 0.624 (0.574-0.674). The use of ethnically matched LRs is necessary for proper personal risk assessment. In conclusion, although LR-based integrated risk assessment for T2D still requires additional tests that evaluate other factors, such as risks involved in missing heritability, our results indicate the potential usability of the LR-based assessment system and stress the importance of stratified epidemiological investigations in personalized medicine.
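
    The LR updating rule itself is standard: convert the pretest probability to odds, multiply by the LR of each (assumed independent) test result, and convert back. The LR values and the example patient below are invented for illustration, not taken from the study.

```python
# Likelihood-ratio risk updating: pretest probability -> odds,
# multiply by each factor's LR, odds -> posttest probability.
# All LR values below are hypothetical, not the study's estimates.

def posttest_probability(pretest_prob, likelihood_ratios):
    odds = pretest_prob / (1 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Hypothetical 60-year-old man: 8% prevalence, risk genotype at two
# variants (LR 1.3, 1.2), BMI >= 25 (LR 1.8), hypertension (LR 1.5)
risk = posttest_probability(0.08, [1.3, 1.2, 1.8, 1.5])
print(f"posttest probability: {risk:.3f}")    # 0.268
```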

  1. Integrative analysis of gene expression and DNA methylation using unsupervised feature extraction for detecting candidate cancer biomarkers.

    Science.gov (United States)

    Moon, Myungjin; Nakai, Kenta

    2018-04-01

    Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment of cancer. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and insufficient information in limited datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extraction to identify candidate biomarkers of cancer using renal cell carcinoma RNA-seq datasets. Gene expression and DNA methylation datasets are normalized by Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets by unsupervised feature extraction methods, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance as compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be expanded for applications involving various types of multi-omics datasets.
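
    A minimal stand-in for the described pipeline (Box-Cox normalization per gene, z-scoring, then projection of the stacked datasets onto a first principal component to get one integrated score per gene) might look as follows. The data are synthetic, and a skewness-minimizing grid search stands in for the usual maximum-likelihood estimate of the Box-Cox parameter.

```python
import numpy as np

def boxcox(x, lam):
    return np.log(x) if lam == 0 else (x ** lam - 1) / lam

def best_lambda(x, grid=np.linspace(-2, 2, 81)):
    # pick the lambda whose transform is least skewed (a simple
    # stand-in for the maximum-likelihood estimate)
    def skew(v):
        v = (v - v.mean()) / v.std()
        return abs((v ** 3).mean())
    return min(grid, key=lambda lam: skew(boxcox(x, lam)))

rng = np.random.default_rng(1)
n_samples, n_genes = 20, 50
expr = rng.lognormal(3.0, 1.0, (n_samples, n_genes))     # RNA-seq-like
meth = rng.beta(2.0, 5.0, (n_samples, n_genes)) + 1e-6   # methylation-like

def normalize(x):
    # Box-Cox each gene (column), then z-score across samples
    z = np.column_stack([boxcox(col, best_lambda(col)) for col in x.T])
    return (z - z.mean(0)) / z.std(0)

# stack both normalized data types and take the first right singular
# vector: one integrated value per gene from both omics layers
stacked = np.vstack([normalize(expr), normalize(meth)])  # (2*samples, genes)
u, s, vt = np.linalg.svd(stacked, full_matrices=False)
integrated = vt[0]                                       # per-gene score
```

Candidate genes would then be ranked by `integrated`; the paper's actual unsupervised feature extraction differs in detail but follows the same stack-and-project idea.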

  2. Mathematical analysis of the boundary-integral based electrostatics estimation approximation for molecular solvation: exact results for spherical inclusions.

    Science.gov (United States)

    Bardhan, Jaydeep P; Knepley, Matthew G

    2011-09-28

    We analyze the mathematically rigorous BIBEE (boundary-integral based electrostatics estimation) approximation of the mixed-dielectric continuum model of molecular electrostatics, using the analytically solvable case of a spherical solute containing an arbitrary charge distribution. Our analysis, which builds on Kirkwood's solution using spherical harmonics, clarifies important aspects of the approximation and its relationship to generalized Born models. First, our results suggest a new perspective for analyzing fast electrostatic models: the separation of variables between material properties (the dielectric constants) and geometry (the solute dielectric boundary and charge distribution). Second, we find that the eigenfunctions of the reaction-potential operator are exactly preserved in the BIBEE model for the sphere, which supports the use of this approximation for analyzing charge-charge interactions in molecular binding. Third, a comparison of BIBEE to the recent GBε theory suggests a modified BIBEE model capable of predicting electrostatic solvation free energies to within 4% of a full numerical Poisson calculation. This modified model leads to a projection-framework understanding of BIBEE and suggests opportunities for future improvements. © 2011 American Institute of Physics
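
    For orientation, the simplest special case of the spherical-solute model analyzed here, a single charge at the sphere's center, reduces the Kirkwood series to the classical Born solvation energy. The sketch below uses that textbook formula with illustrative numbers; it is a reference point, not the BIBEE machinery itself.

```python
# Born solvation energy for a point charge at the center of a
# spherical solute: dG = -(k_e/2) * q^2/R * (1/eps_in - 1/eps_out),
# with k_e = 332.06 kcal*A/(mol*e^2).  Radius below is illustrative.

COULOMB = 332.06                # kcal*angstrom/(mol*e^2)

def born_energy(q, radius, eps_in=1.0, eps_out=80.0):
    """Solvation free energy in kcal/mol (negative = favorable)."""
    return -0.5 * COULOMB * q * q / radius * (1.0 / eps_in - 1.0 / eps_out)

dG = born_energy(1.0, 1.5)      # monovalent ion, 1.5 A cavity radius
print(f"{dG:.1f} kcal/mol")     # -109.3 kcal/mol
```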

  3. Traffic Multiresolution Modeling and Consistency Analysis of Urban Expressway Based on Asynchronous Integration Strategy

    Directory of Open Access Journals (Sweden)

    Liyan Zhang

    2017-01-01

    Full Text Available The paper studies a multiresolution traffic flow simulation model of an urban expressway. Firstly, after comparison with a two-level hybrid model, a three-level multiresolution hybrid model has been chosen. Then, the multiresolution simulation framework and integration strategies are introduced. Thirdly, the paper proposes an urban expressway multiresolution traffic simulation model with an asynchronous integration strategy based on Set Theory, which includes three submodels: macromodel, mesomodel, and micromodel. After that, the applicable conditions and derivation process of the three submodels are discussed in detail. In addition, in order to simulate and evaluate the multiresolution model, a “simple simulation scenario” of the North-South Elevated Expressway in Shanghai has been established. The simulation results showed the following. (1) Volume-density relationships of the three submodels are consistent with detector data. (2) When traffic density is high, the macromodel has high precision, smaller error, and smaller dispersion of results. Compared with the macromodel, the simulation accuracies of the micromodel and mesomodel are lower and their errors are bigger. (3) The multiresolution model can simulate characteristics of traffic flow, capture traffic waves, and keep the consistency of traffic state transitions. Finally, the results showed that the novel multiresolution model achieves higher simulation accuracy and is feasible and effective in a real traffic simulation scenario.

  4. An Integrated Gait and Balance Analysis System to Define Human Locomotor Control

    Science.gov (United States)

    2016-04-29

    An Integrated Gait and Balance Analysis System to Define Human Locomotor Control (award W911NF-14-R-0009). Walking is a complicated task that requires motor coordination; the system allows researchers to test hypotheses they developed about how people walk.

  5. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements into perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  6. Performance Criteria of Spatial Development Projects Based on Interregional Integration

    Directory of Open Access Journals (Sweden)

    Elena Viktorovna Kurushina

    2018-03-01

    Full Text Available The search for efficient ways to develop regional socio-economic space is a relevant problem. The authors consider the models of spatial organization according to the Spatial Development Strategy of the Russian Federation until 2030. We conduct a comparative analysis of scenarios for polarized and diversified spatial growth. Many investigations consider the concepts of polarized and endogenous growth. This study proposes a methodology to assess the development of macroregions and to increase the viability of interregional integration projects. To develop this methodology, we formulate scientific principles and indirect criteria of project performance conforming to the theory of regional integration. In addition to territorial community and the complementarity of development potentials, regional integration in the country should be based on the principles of security, networking, limited quantity and awareness of the potential project participants. Integration should ensure synergetic effects and take into account cultural and historical closeness, which manifests itself in a common mentality and existing economic relations among regions. The calculation results regarding the indirect criteria are obtained using the methods of classification and spatial correlation. This study confirms the hypothesis that the formation of the Western Siberian and Ural macroregions is appropriate. We have concluded this on the basis of the criteria of economic development, economic integration, the similarity of regional spaces as habitats, and the number of participants for the subjects of the Ural Federal District. The projection of the patterns of international economic integration onto the interregional level allows predicting the highest probability of successful cooperation among the Western Siberian regions with a high level of economic development. The authors’ method has revealed a high synchronization between the economies of

  7. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  8. Energy efficiency analysis of styrene production by adiabatic ethylbenzene dehydrogenation using exergy analysis and heat integration

    Directory of Open Access Journals (Sweden)

    Ali Emad

    2018-03-01

    Full Text Available Styrene is a valuable commodity for polymer industries. The main route for producing styrene, dehydrogenation of ethylbenzene, consumes a substantial amount of energy because of the use of high-temperature steam. In this work, the process energy requirements and recovery are studied using exergy analysis and Heat Integration (HI) based on the Pinch design method. The amount of steam plays a key role in the trade-off between styrene yield and energy savings; therefore, optimizing the operating conditions alone for energy reduction is infeasible. Heat integration indicated an insignificant reduction in the net energy demand and exergy losses, but 24% and 34% savings in external heating and cooling duties, respectively. When the required steam is generated by recovering the heat of the hot reactor effluent, a considerable saving in the net energy demand, as well as in the heating and cooling utilities, can be achieved. Moreover, around a 68% reduction in exergy destruction is observed.
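
    The minimum-utility targets that Pinch-based heat integration starts from come from the problem table (heat cascade) algorithm, sketched below. The four streams are a generic textbook-style example, not the styrene plant data from the paper.

```python
# Problem-table algorithm behind Pinch-based utility targeting.
# Stream data (CP = m*cp in MW/K) is a generic textbook-style
# example, NOT the styrene process of the paper.

def pinch_targets(hot, cold, dt_min=10.0):
    """Return (min hot utility, min cold utility) in MW.

    hot/cold: lists of (T_supply, T_target, CP) in degC and MW/K.
    """
    # shift hot streams down and cold streams up by dt_min/2
    shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot] \
            + [(ts + dt_min / 2, tt + dt_min / 2, cp) for ts, tt, cp in cold]
    n_hot = len(hot)
    temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

    cascade, heat = [0.0], 0.0
    for hi, lo in zip(temps, temps[1:]):
        net_cp = 0.0
        for i, (ts, tt, cp) in enumerate(shifted):
            top, bot = max(ts, tt), min(ts, tt)
            if top >= hi and bot <= lo:            # stream spans this interval
                net_cp += cp if i < n_hot else -cp
        heat += net_cp * (hi - lo)                 # surplus (+) or deficit (-)
        cascade.append(heat)

    q_hot = max(0.0, -min(cascade))                # lift cascade to >= 0
    q_cold = cascade[-1] + q_hot
    return q_hot, q_cold

hot = [(250, 40, 0.15), (200, 80, 0.25)]
cold = [(20, 180, 0.20), (140, 230, 0.30)]
q_h, q_c = pinch_targets(hot, cold)
print(f"QH,min = {q_h:.1f} MW, QC,min = {q_c:.1f} MW")   # 7.5 MW, 10.0 MW
```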

  9. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and verification efforts for the computer codes applied. The paper briefly reviews the performed experimental investigation of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. The assessment of the applicability of RELAP5/mod3 for accident analysis in integral reactors is presented.

  10. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    CERN Document Server

    Kompaniets, Mikhail; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-01-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, a cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage by local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is the idea that Ceph is used as a backend for the Cinder Block Storage service for OpenStack and, at the same time, as a storage backend for XRootD, with redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution was applied, which is ba...

  11. Device- and service profiles for integrated OR systems based on open standards

    Directory of Open Access Journals (Sweden)

    Mildner Alexander

    2015-09-01

    Full Text Available Integrated OR systems nowadays are closed and proprietary, so that the interconnection of components from third-party vendors is possible only with considerable expenditure of time and cost. An integrated operating theatre with open interfaces, giving clinical operators the opportunity to choose individual medical devices from different manufacturers, is currently being developed in the framework of the project OR.NET [1], funded by the BMBF (Federal Ministry of Education and Research). Current standards and concepts regarding technical feasibility and the accreditation process do not cope with the requirements for modular integration based on an open standard. Therefore, strategies as well as service and device profiles to enable a procedure for risk management and certifiability are the focus of the project work. Amongst others, a concept for User Interface Profiles (UI-Profiles) has been conceived in order to describe medical device functions and the entire user interface with regard to Human-Machine-Interaction (HMI) characteristics, with the aim of identifying human-induced risks of central user interfaces. The use of standardized device and service profiles shall allow manufacturers to integrate their medical devices into the OR.NET network without disclosing the medical devices’ risk analysis and related confidential knowledge or proprietary information.

  12. INTEGRAL EDUCATION, TIME AND SPACE: PROBLEMATIZING CONCEPTS

    Directory of Open Access Journals (Sweden)

    Ana Elisa Spaolonzi Queiroz Assis

    2018-03-01

    Full Text Available Integral Education, despite being on the public policy agenda for some decades, still carries disparities related to its concept. In this sense, this article aims to problematize not only the concepts of integral education but also the categories of time and space contained in the journal Em Aberto, numbers 80 (2009) and 88 (2012), organized and published by the National Institute of Educational Studies Anísio Teixeira (INEP) and respectively entitled "Educação Integral e tempo integral" and "Políticas de educação integral em jornada ampliada". The methodology is based on Bardin’s content analysis, respecting the steps of pre-analysis (research corpus formed by the texts in the journals); material exploration (reading the texts, encoding data, and choosing the registration units for categorization); and processing and interpretation of results, based on Saviani’s Historical-Critical Pedagogy. The work reveals convergent and divergent conceptual multiplicity, provoking a discussion about a critical conception of integral education. Keywords: Integral Education. Historical-Critical Pedagogy. Content Analysis.

  13. Clinical capabilities of graduates of an outcomes-based integrated medical program

    Directory of Open Access Journals (Sweden)

    Scicluna Helen A

    2012-06-01

    Full Text Available Abstract Background The University of New South Wales (UNSW) Faculty of Medicine replaced its old content-based curriculum with an innovative new 6-year undergraduate-entry outcomes-based integrated program in 2004. This paper is an initial evaluation of the perceived and assessed clinical capabilities of recent graduates of the new outcomes-based integrated medical program compared to benchmarks from traditional content-based or process-based programs. Method Self-perceived capability in a range of clinical tasks and assessment of medical education as preparation for hospital practice were evaluated in recent graduates after 3 months working as junior doctors. Responses of the 2009 graduates of UNSW’s new outcomes-based integrated medical education program were compared to those of the 2007 graduates of UNSW’s previous content-based program, to published data from other Australian medical schools, and to hospital-based supervisor evaluations of their clinical competence. Results Three months into internship, graduates from UNSW’s new outcomes-based integrated program rated themselves to have good clinical and procedural skills, with ratings that indicated significantly greater capability than graduates of the previous UNSW content-based program. New program graduates rated themselves significantly more prepared for hospital practice on the confidence (reflective practice), prevention (social aspects of health), interpersonal skills (communication), and collaboration (teamwork) subscales than old program graduates, and significantly better than or equivalent to published benchmarks of graduates from other Australian medical schools. Clinical supervisors rated new program graduates highly capable for teamwork, reflective practice and communication. Conclusions Medical students from an outcomes-based integrated program graduate with excellent self-rated and supervisor-evaluated capabilities in a range of clinically-relevant outcomes. The program

  14. Evidence-based integrative medicine in clinical veterinary oncology.

    Science.gov (United States)

    Raditic, Donna M; Bartges, Joseph W

    2014-09-01

    Integrative medicine is the combined use of complementary and alternative medicine with conventional or traditional Western medicine systems. The demand for integrative veterinary medicine is growing, but evidence-based research on its efficacy is limited. In veterinary clinical oncology, such research could be translated to human medicine, because veterinary patients with spontaneous tumors are valuable translational models for human cancers. An overview of specific herbs, botanics, dietary supplements, and acupuncture evaluated in dogs, in vitro canine cells, and other relevant species both in vivo and in vitro is presented for their potential use as integrative therapies in veterinary clinical oncology. Published by Elsevier Inc.

  15. High-frequency acoustic spectrum analyzer based on polymer integrated optics

    Science.gov (United States)

    Yacoubian, Araz

    This dissertation presents an acoustic spectrum analyzer based on nonlinear polymer-integrated optics. The device is used in a scanning heterodyne geometry by zero biasing a Michelson interferometer. It is capable of detecting vibrations from DC to the GHz range. Initial low frequency experiments show that the device is an effective tool for analyzing an acoustic spectrum even in noisy environments. Three generations of integrated sensors are presented, starting with a very lossy (86 dB total insertion loss) initial device that detects vibrations as low as λ/10, and second and third generation improvements with a final device of 44 dB total insertion loss. The sensor was further tested for detecting a pulsed laser-excited vibration and resonances due to the structure of the sample. The data are compared to the acoustic spectrum measured using a low loss passive fiber interferometer detection scheme which utilizes a high speed detector. The peaks present in the passive detection scheme are clearly visible with our sensor data, which have a lower noise floor. Hybrid integration of GHz electronics is also investigated in this dissertation. A voltage controlled oscillator (VCO) is integrated on a polymer device using a new approach. The VCO is shown to operate as specified by the manufacturer, and the RF signal is efficiently launched onto the micro-strip line used for EO modulation. In the future this technology can be used in conjunction with the presented sensor to produce a fully integrated device containing high frequency drive electronics controlled by low DC voltage. Issues related to device fabrication, loss analysis, RF power delivery to drive circuitry, efficient poling of large area samples, and optimizing poling conditions are also discussed throughout the text.

  16. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    Science.gov (United States)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  17. Nanocantilever based mass sensor integrated with cmos circuitry

    DEFF Research Database (Denmark)

    Davis, Zachary James; Abadal, G.; Campabadal, F.

    2003-01-01

    We have demonstrated the successful integration of a cantilever-based mass detector with standard CMOS circuitry. The purpose of the circuitry is to facilitate the readout of the cantilever's deflection in order to measure resonant frequency shifts of the cantilever. The principle and design of the mass detector are presented, showing that miniaturization of such cantilever-based resonant devices leads to highly sensitive mass sensors, which have the potential to detect single molecules. The design of the readout circuitry used for the first electrical characterization of a cantilever integrated with CMOS circuitry is demonstrated. The electrical characterization of the device shows that the resonant behavior of the cantilever depends on the applied voltages, in agreement with theory.
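
    The mass sensitivity mentioned above follows from the standard resonator relation f = (1/2π)√(k/m_eff): adsorbed mass lowers the resonant frequency. The spring constant and frequencies below are invented, order-of-magnitude values, not measurements from the device.

```python
import math

# Resonant mass detection: f = (1/2pi)*sqrt(k/m_eff), so the
# effective mass at a measured resonance is m = k / (2*pi*f)^2.
# Parameter values are illustrative, not the paper's device.

def added_mass(k, f0, f1):
    """Mass added between resonance measurements f0 -> f1 (f1 < f0), kg."""
    m = lambda f: k / (2 * math.pi * f) ** 2
    return m(f1) - m(f0)

k = 0.1                        # spring constant, N/m (nanocantilever scale)
f0 = 1.0e6                     # resonance before adsorption, Hz
f1 = 0.999e6                   # resonance after adsorption, Hz
dm = added_mass(k, f0, f1)
print(f"added mass ~ {dm * 1e18:.2f} fg")   # ~5.07 fg
```

Equivalently, for small shifts Δm ≈ -2 m_eff Δf / f, which is why pushing m_eff down through miniaturization improves sensitivity.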

  18. From organizational integration to clinical integration: analysis of the path between one level of integration to another using official documents

    Science.gov (United States)

    Mandza, Matey; Gagnon, Dominique; Carrier, Sébastien; Belzile, Louise; Demers, Louis

    2010-01-01

    Purpose Services’ integration comprises organizational, normative, economic, informational and clinical dimensions. Since 2004, the province of Quebec has devoted significant efforts to unifying the governance of the main health and social care organizations of its various territories. Notwithstanding the uniformity of the national plan’s prescription, the territorial integration modalities vary greatly across the province. Theory This research is based upon a conceptual model of integration that comprises six components: inter-organizational partnership, case management, standardized assessment, a single entry point, a standardized service planning tool and a shared clinical file. Methods We conducted an embedded case study in six sites contrasted in terms of their level of integration. All documents prescribing the implementation of integration were retrieved and analyzed. Results and conclusions The analyzed documents demonstrate a growing local appropriation of the current integrative reform. Interestingly, however, no link seems to exist between the quality of local prescriptions and the level of integration achieved in each site. This finding leads us to hypothesize that the variable quality of the operational support offered to implement these prescriptions is a variable at play.

  19. INTEGRATED ASSESSMENT AND GEOSPATIAL ANALYSIS OF ACCUMULATION OF PETROLEUM HYDROCARBONS IN THE SOIL COVER OF SAKHALIN ISLAND

    Directory of Open Access Journals (Sweden)

    V. V. Dmitriev

    2017-01-01

    Full Text Available The article considers an approach to the integral estimation of the accumulation of petroleum hydrocarbons (PHc) in the soil cover of Sakhalin Island. The soil map of Sakhalin, which includes 103 soil polygons, was used as the cartographic base for this work. Additional information on soils was taken from the Soil Atlas of the Russian Federation. As an integral criterion for the accumulation of PHc, it is proposed to use an integral indicator calculated on the basis of 5 evaluation criteria. The choice of criteria for the assessment was based on the works of Russian scientists. The evaluation criteria for each of the polygons include information on the soil texture, the total thickness of the organic and humus horizons, the content of organic carbon in these horizons, the content of organic carbon in the mineral horizons, and the presence of a gley barrier. The calculation of the integral indicator is based on the principles of the ASPID methodology. On this basis, the authors compiled a map of the potential capacity of Sakhalin soils to accumulate petroleum hydrocarbons, and a GIS-based analysis using the estimates of the integral indicator revealed the features of the spatial differentiation of PHc accumulation in the soil cover. The analysis and assessment of the accumulation of petroleum hydrocarbons showed that peaty and peat boggy soils have the greatest ability to hold PHc. The lowest ability to accumulate petroleum hydrocarbons is typical of illuvial-ferruginous podzols (illuvial low-humic podzols); the soils of this group occupy 1% of the island. In general, soils with low and very low hydrocarbon accumulation capacity occupy less than forty percent of the territory.
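
    The ASPID methodology aggregates normalized criteria under incompletely specified weight information. A minimal randomized-weights sketch of that idea follows; the polygon data and the uniform sampling of weights are invented for illustration and do not reproduce the authors' weighting information.

```python
import numpy as np

# ASPID-style integral indicator, minimal sketch: score each soil
# polygon on five criteria, normalize each criterion to [0, 1], and
# average the weighted sum over many admissible weight vectors
# (here: uniform samples from the simplex).  Data are invented.

rng = np.random.default_rng(42)
criteria = np.array([                     # rows: polygons, cols: 5 criteria
    [3, 40, 12.0, 1.5, 1],                # e.g. a peat boggy soil
    [1,  5,  0.8, 0.1, 0],                # e.g. an illuvial-ferruginous podzol
    [2, 20,  4.0, 0.6, 1],
], dtype=float)

lo, hi = criteria.min(0), criteria.max(0)
norm = (criteria - lo) / (hi - lo)        # each criterion scaled to [0, 1]

weights = rng.dirichlet(np.ones(5), size=10_000)   # uncertain weights
indicator = norm @ weights.T                        # (polygons, samples)
mean_score = indicator.mean(1)                      # expected integral indicator
print(mean_score.round(2))
```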

  20. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high-throughput experiments, and high-throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become very urgent and a hot topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two well-known first-principles computational databases, OQMD and Materials Project, are used as the integration targets, which shows the feasibility and effectiveness of our method.
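
    The paper queries the integrated ontology with SPARQL. To keep this sketch dependency-free, the basic-graph-pattern matching that a SPARQL SELECT performs is emulated below over an in-memory triple list; the entity and property names are invented, not taken from OQMD or Materials Project.

```python
# Emulation of SPARQL basic-graph-pattern matching over an in-memory
# triple list.  Names like "mat:bandGapEV" are invented placeholders.

triples = [
    ("mp:Fe2O3",  "rdf:type",       "mat:Oxide"),
    ("mp:Fe2O3",  "mat:bandGapEV",  2.0),
    ("oqmd:TiO2", "rdf:type",       "mat:Oxide"),
    ("oqmd:TiO2", "mat:bandGapEV",  3.2),
    ("oqmd:Cu",   "rdf:type",       "mat:Metal"),
]

def match(pattern, binding):
    """Yield extended bindings for one (s, p, o) pattern; '?x' = variable."""
    for triple in triples:
        b = dict(binding)
        for term, value in zip(pattern, triple):
            if isinstance(term, str) and term.startswith("?"):
                if b.setdefault(term, value) != value:
                    break                     # conflicts with earlier binding
            elif term != value:
                break                         # constant does not match
        else:
            yield b

def query(patterns):
    """Evaluate a conjunction of patterns (a SPARQL basic graph pattern)."""
    bindings = [{}]
    for pat in patterns:
        bindings = [b2 for b in bindings for b2 in match(pat, b)]
    return bindings

# SELECT ?m WHERE { ?m rdf:type mat:Oxide . ?m mat:bandGapEV ?g }, ?g > 2.5
oxides = query([("?m", "rdf:type", "mat:Oxide"), ("?m", "mat:bandGapEV", "?g")])
wide_gap = [b["?m"] for b in oxides if b["?g"] > 2.5]
print(wide_gap)                               # ['oqmd:TiO2']
```

A real deployment would issue the equivalent SPARQL against a triple store; the point here is only how variable bindings propagate across joined patterns from both source databases.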

  1. Scaling analysis for a Savannah River reactor scaled model integral system

    International Nuclear Information System (INIS)

    Boucher, T.J.; Larson, T.K.; McCreery, G.E.; Anderson, J.L.

    1990-11-01

    The Savannah River Laboratory has requested that the Idaho National Engineering Laboratory perform an analysis to help define, examine, and assess potential concepts for the design of a scaled integral hydraulics test facility representative of the current Savannah River Plant reactor design. In this report the thermal-hydraulic phenomena of importance to reactor safety during the design-basis loss-of-coolant accident (based on the knowledge and experience of the authors and the results of the joint INEL/TPG/SRL phenomena identification and ranking effort) were examined and identified. Established scaling methodologies were used to develop potential concepts for integral hydraulic testing facilities. Analysis was conducted to examine the scaling of various phenomena in each of the selected concepts. Results generally support that a one-fourth (1/4) linear-scale visual facility capable of operating at pressures up to 350 kPa (51 psia) and temperatures up to 330 K (134 degree F) will scale most hydraulic phenomena reasonably well. However, additional research will be necessary to determine the most appropriate method of simulating several of the reactor components, since the scaling methodology allows for several approaches which may only be assessed via appropriate research. 34 refs., 20 figs., 14 tabs

  2. Integral finite element analysis of turntable bearing with flexible rings

    Science.gov (United States)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper presents a method for calculating the internal load distribution and contact stress of a thrust angular-contact ball turntable bearing by finite element analysis (FEA). The influence of the stiffness of the bearing structure and of plastic deformation in the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. On this basis, the nonlinear contact between the rolling elements and the inner and outer ring raceways is treated as a nonlinear compression spring, and an integral finite element model of the bearing, including its support structure, is established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the load distribution, inner and outer ring stresses, contact stresses and other finite element results with traditional bearing theory, which provides guidance for improving the design of slewing bearings.
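
    The nonlinear-spring treatment can be sketched as follows, assuming the common Hertzian load-deflection law Q = K·δ^1.5 for a point contact and a pure axial load shared equally by Z identical balls; the constants below are illustrative assumptions, not values from the paper.

```python
# Each ball/raceway contact is modeled as a nonlinear spring Q = K*d**1.5.
# Under a pure axial load Fa shared by Z identical balls, equilibrium is
# Z*K*d**1.5 = Fa, solved here by Newton's method (the closed form is used
# as the starting guess). K and Z are illustrative, not from the paper.
def axial_deflection(Fa, Z=40, K=8.0e9, tol=1e-12):
    """Solve Z*K*d**1.5 = Fa for the axial deflection d (meters)."""
    d = (Fa / (Z * K)) ** (2.0 / 3.0)   # closed-form starting guess
    for _ in range(50):
        g = Z * K * d**1.5 - Fa         # residual of the force balance
        dg = 1.5 * Z * K * d**0.5       # derivative w.r.t. d
        step = g / dg
        d -= step
        if abs(step) < tol * max(d, 1.0):
            break
    return d

d = axial_deflection(1.0e5)   # 100 kN axial load
Q = 8.0e9 * d**1.5            # resulting per-ball load, Fa/Z = 2500 N
print(d, Q)
```

In the paper this spring law is extracted from a single-ball FE contact model (including plasticity) rather than the Hertzian formula used here.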

  3. Integrated Arts-Based Teaching (IAT) Model for Brain-Based Learning

    Science.gov (United States)

    Inocian, Reynaldo B.

    2015-01-01

    This study analyzes teaching strategies among the eight books in Principles and Methods of Teaching recommended for use in the College of Teacher Education in the Philippines. It seeks to answer the following objectives: (1) identify the most commonly used teaching strategies congruent with the integrated arts-based teaching (IAT) and (2) design…

  4. Incorporating Applied Behavior Analysis to Assess and Support Educators' Treatment Integrity

    Science.gov (United States)

    Collier-Meek, Melissa A.; Sanetti, Lisa M. H.; Fallon, Lindsay M.

    2017-01-01

    For evidence-based interventions to be effective for students, they must be consistently implemented; however, many teachers struggle with treatment integrity and require support. Although many implementation support strategies are research based, there is little empirical guidance about the types of treatment integrity, implementers, and contexts…

  5. Primary healthcare-based integrated care with opioid agonist treatment: First experience from Ukraine.

    Science.gov (United States)

    Morozova, Olga; Dvoriak, Sergey; Pykalo, Iryna; Altice, Frederick L

    2017-04-01

    Ukraine's HIV epidemic is concentrated among people who inject drugs (PWID), however, coverage with opioid agonist therapies (OATs) available mostly at specialty addiction clinics is extremely low. OAT integrated into primary healthcare clinics (PHCs) provides an opportunity for integrating comprehensive healthcare services and scaling up OAT. A pilot study of PHC-based integrated care for drug users conducted in two Ukrainian cities between 2014 and 2016 included three sub-studies: 1) cross-sectional treatment site preference assessment among current OAT patients (N=755); 2) observational cohort of 107 PWID who continued the standard of care versus transition of stabilized and newly enrolled PWID into PHC-based integrated care; and 3) pre/post analysis of attitudes toward PWID and HIV patients by PHC staff (N=26). Among 755 OAT patients, 53.5% preferred receiving OAT at PHCs, which was independently correlated with convenience, trust in physician, and treatment with methadone (vs. buprenorphine). In 107 PWID observed over 6 months, retention in treatment was high: 89% in PWID continuing OAT in specialty addiction treatment settings (standard of care) vs 94% in PWID transitioning to PHCs; and 80% among PWID newly initiating OAT in PHCs. Overall, satisfaction with treatment, subjective self-perception of well-being, and trust in physician significantly increased in patients prescribed OAT in PHCs. Among PHC staff, attitudes towards PWID and HIV patients significantly improved over time. OAT can be successfully integrated into primary care in low and middle-income countries and improves outcomes in both patients and clinicians while potentially scaling-up OAT for PWID. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...... and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability...... affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...

  7. An Integrated Approach to “Sustainable Community-Based Tourism”

    Directory of Open Access Journals (Sweden)

    Tek B. Dangi

    2016-05-01

    Full Text Available Two rich knowledge domains have been evolving along parallel pathways in tourism studies: sustainable tourism (ST and community-based tourism (CBT. Within both lie diverse definitions, principles, criteria, critical success factors and benefits sought or outcomes desired, advocated by different stakeholders ranging from quasi-governmental and non-profit organizations to public-private sector and academic interests. This poses significant challenges to those interested in theory building, research and practice in the sustainable development and management of tourism. The paper builds on a previous article published in Sustainability by presenting an integrated framework based on a comprehensive, in-depth review and analysis of the tourism-related literature. The study reveals not just common ground and differences that might be anticipated, but also important sustainability dimensions that are lagging or require much greater attention, such as equity, justice, ethical and governance issues. A preliminary framework of “sustainable community-based tourism” (SCBT is forwarded that attempts to bridge the disparate literature on ST and CBT. Critical directions forward are offered to progress research and sustainability-oriented practices towards more effective development and management of tourism in the 21st century.

  8. Reconstruction of biological networks based on life science data integration.

    Science.gov (United States)

    Kormeier, Benjamin; Hippe, Klaus; Arrigo, Patrizio; Töpel, Thoralf; Janowski, Sebastian; Hofestädt, Ralf

    2010-10-27

    For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we present the applications BioDWH, an integration toolkit for building life science data warehouses; CardioVINEdb, an information system for biological data on cardiovascular disease; and VANESA, a network editor for the modeling and simulation of biological networks. Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease-related gene-regulated biological network is also presented.

  9. Analysis of thermal-plastic response of shells of revolution by numerical integration

    International Nuclear Information System (INIS)

    Leonard, J.W.

    1975-01-01

    A numerical method based on the numerical integration of the governing shell equations has been shown, for elastic cases, to be more efficient than the finite element method when applied to shells of revolution. In the numerical integration method, the governing differential equations of motion are converted into a set of initial-value problems. Each initial-value problem is integrated numerically between meridional boundary points, and the solutions are recombined so as to satisfy the boundary conditions. For large-deflection elasto-plastic behavior, the equations are nonlinear and, hence, are recombined in an iterative manner using the Newton-Raphson procedure. Suppression techniques are incorporated in order to eliminate extraneous solutions within the numerical integration procedure. The Reissner-Meissner shell theory for shells of revolution is adopted to account for large-deflection and higher-order rotation effects. The computer modelling of the equations is quite general in that specific shell segment geometries, e.g. cylindrical, spherical, toroidal and conical segments, and any combinations thereof, can be handled easily. The elasto-plastic constitutive relations adopted are in accordance with currently recommended constitutive equations for inelastic design analysis of FFTF components. The von Mises yield criterion and associated flow rule are used, and the kinematic hardening law is followed. Examples are considered in which stainless steels common to LMFBR applications are used
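
    The integrate-and-recombine procedure described above is essentially a shooting method, sketched here on a scalar test problem: the nonlinear BVP y'' = 1.5y², y(0) = 4, y(1) = 1 (exact solution y = 4/(1+x)², so y'(0) = -8) stands in for the shell equations, and a secant iteration plays the role of the Newton-Raphson recombination.

```python
# Shooting method: recast the BVP as an initial-value problem with an
# unknown initial slope s, march it across the interval with RK4, and
# iterate on s until the far boundary condition y(1) = 1 is satisfied.
def integrate(s, n=200):
    """RK4 march of y' = v, v' = 1.5*y**2 from x=0 with y=4, v=s; return y(1)."""
    h = 1.0 / n
    y, v = 4.0, s
    f = lambda y, v: (v, 1.5 * y * y)
    for _ in range(n):
        k1y, k1v = f(y, v)
        k2y, k2v = f(y + 0.5*h*k1y, v + 0.5*h*k1v)
        k3y, k3v = f(y + 0.5*h*k2y, v + 0.5*h*k2v)
        k4y, k4v = f(y + h*k3y, v + h*k3v)
        y += h * (k1y + 2*k2y + 2*k3y + k4y) / 6
        v += h * (k1v + 2*k2v + 2*k3v + k4v) / 6
    return y

def shoot(target=1.0, s0=-5.0, s1=-10.0, tol=1e-10):
    """Secant iteration on the initial slope s until y(1) = target."""
    f0, f1 = integrate(s0) - target, integrate(s1) - target
    for _ in range(50):
        s2 = s1 - f1 * (s1 - s0) / (f1 - f0)
        s0, f0, s1 = s1, f1, s2
        f1 = integrate(s1) - target
        if abs(f1) < tol:
            break
    return s1

s = shoot()
print(s)  # converges near the exact initial slope y'(0) = -8
```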

  10. Sextant: an expert system for transient analysis of nuclear reactors and integral test facilities

    International Nuclear Information System (INIS)

    Barbet, N.; Dumas, M.; Mihelich, G.

    1987-01-01

    Expert systems provide a new way of dealing with the computer-aided management of nuclear plants by combining several knowledge bases and reasoning modes together with a set of numerical models for real-time analysis of transients. New development tools are required, together with metaknowledge bases handling temporal hypothetical reasoning and planning. They have to be efficient and robust because, during a transient, neither measurements, nor models, nor scenarios can be held as absolute references. SEXTANT is a general-purpose physical analyzer intended to provide a pattern and avoid duplication of general tools and knowledge bases for similar applications. It combines several knowledge bases concerning measurements, models and the qualitative behavior of PWRs with a mechanism of conjecture-refutation and a set of simplified models matching the current physical state. A prototype is under assessment through the analysis of integral test facility transients. For its development, SEXTANT requires a powerful shell. SPIRAL is such a toolkit, oriented towards online analysis of complex processes and already used in several applications

  11. International market integration for natural gas? A cointegration analysis of prices in Europe, North America and Japan

    International Nuclear Information System (INIS)

    Siliverstovs, Boriss; L'Hegaret, Guillaume; Neumann, Anne; Hirschlausen, Christian von

    2005-01-01

    This paper investigates the degree of integration of natural gas markets in Europe, North America and Japan in the period between the early 1990s and 2004. The relationship between international gas market prices, and their relation to the oil price, is explored through principal components analysis and the Johansen likelihood-based cointegration procedure. Both show a high level of natural gas market integration within Europe, between the European and Japanese markets, as well as within the North American market. At the same time, the obtained results suggest that the European (respectively, Japanese) and the North American markets were not integrated. (Author)
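
    A minimal sketch of the Engle-Granger idea behind such cointegration tests, on synthetic data standing in for the gas price series: regress one series on the other, then check whether the residual reverts to its mean (here via its AR(1) coefficient). A real test, like the Johansen procedure used in the paper, compares a likelihood or ADF statistic against critical values, which this sketch omits.

```python
import random

# Two synthetic "price" series sharing a stochastic trend, so they are
# cointegrated by construction; the seed makes the run deterministic.
random.seed(0)
T = 500
trend = 0.0
x, y = [], []
for _ in range(T):
    trend += random.gauss(0, 1)                    # shared random walk
    x.append(trend)
    y.append(2.0 * trend + random.gauss(0, 0.5))   # y ~ 2x + stationary noise

# Step 1: OLS of y on x, beta = cov(x, y) / var(x)
mx, my = sum(x) / T, sum(y) / T
beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
       sum((a - mx) ** 2 for a in x)
resid = [b - beta * a for a, b in zip(x, y)]
mr = sum(resid) / T
resid = [r - mr for r in resid]

# Step 2: AR(1) coefficient of the residual; well below 1 => mean reversion,
# i.e. evidence consistent with cointegration.
phi = sum(resid[t] * resid[t - 1] for t in range(1, T)) / \
      sum(r * r for r in resid[:-1])
print(round(beta, 3), round(phi, 3))
```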

  12. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using mill...

  13. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods of integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method of integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes
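
    For background, the standard Riemann-Liouville operators of non-integral order α are the usual starting point for such methods (the paper's exact operators may differ):

```latex
% Riemann--Liouville fractional integral of order \alpha > 0:
I^{\alpha} f(t) \;=\; \frac{1}{\Gamma(\alpha)} \int_{0}^{t} (t-\tau)^{\alpha-1} f(\tau)\, d\tau .

% Corresponding fractional derivative of order \alpha, with n = \lceil \alpha \rceil:
D^{\alpha} f(t) \;=\; \frac{d^{n}}{dt^{n}} \, I^{\,n-\alpha} f(t)
\;=\; \frac{1}{\Gamma(n-\alpha)} \frac{d^{n}}{dt^{n}}
      \int_{0}^{t} (t-\tau)^{n-\alpha-1} f(\tau)\, d\tau .
```

For integer α these reduce to ordinary iterated integration and differentiation; the kernel (t-τ)^{α-1} is what encodes the "memory" of past factor values that the abstract refers to.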

  14. Analysis of thevenin equivalent network of a distribution system for solar integration studies

    DEFF Research Database (Denmark)

    Yang, Guangya; Mattesen, Majken; Kjaer, Søren Bækhøj

    2012-01-01

    generations and expected to play a significant role in the future sustainable energy system. Currently one of the main issues for solar integration is the voltage regulation problem in the LV grid, due to the small X/R ratios. Hence, the voltage control techniques developed for the MV and HV networks may need...... to be further evaluated before being applied to the LV grid. For the inverter voltage control design, it is useful to develop a realistic Thevenin equivalent model of the grid to ease the analysis. In this paper, case studies are performed based on the analysis of a realistic distribution network for the design...
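
    One simple way to fit such a Thevenin equivalent, sketched here with illustrative numbers rather than the paper's network data: given two measured operating points (V1, I1) and (V2, I2) at the point of connection, V = E_th - Z_th·I determines both parameters. A full treatment would use complex phasors; this sketch works with magnitudes only.

```python
# Fit a Thevenin equivalent (E_th, Z_th) to two operating points, using
# V = E_th - Z_th * I. The voltages/currents below are illustrative.
def thevenin(v1, i1, v2, i2):
    z_th = (v1 - v2) / (i2 - i1)   # ohms (magnitude-only sketch)
    e_th = v1 + z_th * i1          # open-circuit (no-load) voltage
    return e_th, z_th

e_th, z_th = thevenin(230.0, 10.0, 228.0, 20.0)
print(e_th, z_th)  # ~232 V open-circuit voltage, ~0.2 ohm source impedance
```

With Z_th in hand, the voltage rise caused by a given PV injection at that node can be estimated directly, which is the point of building the equivalent for inverter control design.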

  15. Imagery Integration Team

    Science.gov (United States)

    Calhoun, Tracy; Melendrez, Dave

    2014-01-01

    The Human Exploration Science Office (KX) provides leadership for NASA's Imagery Integration (Integration 2) Team, an affiliation of experts in the use of engineering-class imagery intended to monitor the performance of launch vehicles and crewed spacecraft in flight. Typical engineering imagery assessments include studying and characterizing the liftoff and ascent debris environments; launch vehicle and propulsion element performance; in-flight activities; and entry, landing, and recovery operations. Integration 2 support has been provided not only for U.S. Government spaceflight (e.g., Space Shuttle, Ares I-X) but also for commercial launch providers, such as Space Exploration Technologies Corporation (SpaceX) and Orbital Sciences Corporation, servicing the International Space Station. The NASA Integration 2 Team is composed of imagery integration specialists from JSC, the Marshall Space Flight Center (MSFC), and the Kennedy Space Center (KSC), who have access to a vast pool of experience and capabilities related to program integration, deployment and management of imagery assets, imagery data management, and photogrammetric analysis. The Integration 2 team is currently providing integration services to commercial demonstration flights, Exploration Flight Test-1 (EFT-1), and the Space Launch System (SLS)-based Exploration Missions (EM)-1 and EM-2. EM-2 will be the first attempt to fly a piloted mission with the Orion spacecraft. The Integration 2 Team provides the customer (both commercial and Government) with access to a wide array of imagery options - ground-based, airborne, seaborne, or vehicle-based - that are available through the Government and commercial vendors. The team guides the customer in assembling the appropriate complement of imagery acquisition assets at the customer's facilities, minimizing costs associated with market research and the risk of purchasing inadequate assets. 
The NASA Integration 2 capability simplifies the process of securing one

  16. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    The GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework for GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  17. An integrated approach for integrated intelligent instrumentation and control system (I³CS)

    Energy Technology Data Exchange (ETDEWEB)

    Jung, C H; Kim, J T; Kwon, K C [Korea Atomic Energy Research Inst., Yusong, Taejon (Korea, Republic of)]

    1997-07-01

    To guarantee public safety, nuclear power plants should be designed to reduce the operator intervention that results in operating human errors, to identify the process states in transients, and to aid operators in decision-making and guide their actions. For this purpose, the MMIS (Man-Machine Interface System) in NPPs should follow an integrated top-down approach tightly focused on function-based task analysis, incorporating advanced digital technology, operator support functions, and so on. The advanced I and C research team at KAERI has embarked on developing an Integrated Intelligent Instrumentation and Control System (I³CS) for Korea's next-generation nuclear power plants. I³CS bases its integrated top-down approach on function-based task analysis, modern digital technology, standardization and simplification, availability and reliability, and protection of investment. (author). 4 refs, 6 figs.

  18. Development of integrity evaluation technology for pressurized components in nuclear power plant and IT based integrity evaluation system

    International Nuclear Information System (INIS)

    Kim, Young Jin; Choi, Jae Boong; Shim, Do Jun

    2004-02-01

    The objective of this research is to develop an efficient integrity evaluation technology and to investigate the applicability of newly-developed technologies, such as an internet-based cyber platform, to Nuclear Power Plant (NPP) components. The development of an efficient structural integrity evaluation system is necessary for the safe operation of NPPs as operating periods increase. Moreover, material test data as well as emerging structural integrity assessment technologies are also needed for the evaluation of aged components. The following five topics are covered in this project: development of a wall-thinning evaluation program for nuclear piping; development of structural integrity evaluation criteria for steam generator tubes with cracks of various shapes; development of a fatigue life evaluation system for major components of NPPs; integration of the internet-based cyber platform with the integrity evaluation program for primary components of NPPs; and effects of aging on the strength of dissimilar welds

  19. Development of integrity evaluation technology for pressurized components in nuclear power plant and IT based integrity evaluation system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Choi, Jae Boong; Shim, Do Jun [Sungkyunkwan Univ., Seoul (Korea, Republic of)] (and others)

    2004-02-15

    The objective of this research is to develop an efficient integrity evaluation technology and to investigate the applicability of newly-developed technologies, such as an internet-based cyber platform, to Nuclear Power Plant (NPP) components. The development of an efficient structural integrity evaluation system is necessary for the safe operation of NPPs as operating periods increase. Moreover, material test data as well as emerging structural integrity assessment technologies are also needed for the evaluation of aged components. The following five topics are covered in this project: development of a wall-thinning evaluation program for nuclear piping; development of structural integrity evaluation criteria for steam generator tubes with cracks of various shapes; development of a fatigue life evaluation system for major components of NPPs; integration of the internet-based cyber platform with the integrity evaluation program for primary components of NPPs; and effects of aging on the strength of dissimilar welds.

  20. A 40 GHz fully integrated circuit with a vector network analyzer and a coplanar-line-based detection area for circulating tumor cell analysis using 65 nm CMOS technology

    Science.gov (United States)

    Nakanishi, Taiki; Matsunaga, Maya; Kobayashi, Atsuki; Nakazato, Kazuo; Niitsu, Kiichi

    2018-03-01

    A 40-GHz fully integrated CMOS-based circuit for circulating tumor cell (CTC) analysis, consisting of an on-chip vector network analyzer (VNA) and a highly sensitive coplanar-line-based detection area, is presented in this paper. In this work, we introduce a fully integrated architecture that eliminates unwanted parasitic effects. The proposed analyzer was designed using 65 nm CMOS technology, and SPICE and MWS simulations were used to validate its operation. The simulations confirmed that the proposed circuit can measure S-parameter shifts resulting from the addition of various types of tumor cells to the detection area, the data for which are provided in a previous study: the |S21| values for HepG2, A549, and HEC-1-A cells are -0.683, -0.580, and -0.623 dB, respectively. Additionally, the measurement demonstrated an S-parameter reduction of -25.7% when a silicone resin was placed on the circuit. Hence, the proposed system is expected to contribute to cancer diagnosis.

  1. An integrated environment of software development and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong

    2005-02-01

    To develop and implement a safety-critical system, the requirements of the system must be analyzed thoroughly during the phases of the software development life cycle, because a single error in the requirements can generate serious software faults. We therefore propose an Integrated Environment (IE) approach that enables easy inspection of requirements by combining requirement traceability with effective use of a formal method. For the V and V tasks of the requirements phase, our approach uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and the analysis of requirements traceability are the most effective methods of software V and V. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear field, as well as in other fields, because of their mathematical nature. We also propose another Integrated Environment (IE) for the design and implementation of safety-critical systems. In this study, a nuclear FED-style design specification and analysis (NuFDS) approach was proposed for PLC-based safety-critical systems. The NuFDS approach is suggested in a straightforward manner for the effective and formal specification and analysis of software designs. Accordingly, the proposed NuFDS approach comprises one technique for specifying the software design and another for analyzing it. In addition, with the NuFDS approach, we can analyze the safety of software on the basis of fault tree synthesis. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Various tools have been needed to make software V and V more convenient.
We therefore developed four kinds of computer-aided software engineering tools that could be used in accordance with the software's life cycle to

  2. Presentation planning using an integrated knowledge base

    Science.gov (United States)

    Arens, Yigal; Miller, Lawrence; Sondheimer, Norman

    1988-01-01

    A description is given of user interface research aimed at bringing together multiple input and output modes in a way that handles mixed mode input (commands, menus, forms, natural language), interacts with a diverse collection of underlying software utilities in a uniform way, and presents the results through a combination of output modes including natural language text, maps, charts and graphs. The system, Integrated Interfaces, derives much of its ability to interact uniformly with the user and the underlying services and to build its presentations, from the information present in a central knowledge base. This knowledge base integrates models of the application domain (Navy ships in the Pacific region, in the current demonstration version); the structure of visual displays and their graphical features; the underlying services (data bases and expert systems); and interface functions. The emphasis is on a presentation planner that uses the knowledge base to produce multi-modal output. There has been a flurry of recent work in user interface management systems. (Several recent examples are listed in the references). Existing work is characterized by an attempt to relieve the software designer of the burden of handcrafting an interface for each application. The work has generally focused on intelligently handling input. This paper deals with the other end of the pipeline - presentations.

  3. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    Directory of Open Access Journals (Sweden)

    Daniel-Petru GHENCEA

    2017-06-01

    Full Text Available The paper proposes a model for predicting spindle behavior with respect to thermal deformations and vibration levels by highlighting and processing the characteristic equations. A model analysis of this kind, for shafts with similar electro-mechanical characteristics, can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms - artificial neural networks - fuzzy logic). The paper presents a prediction method for obtaining valid ranges of values for spindles with similar characteristics, based on data sets measured from a few test spindles, without additional measurements being required. Extracting polynomial functions from the graphs resulting from simultaneous measurements and predicting the dynamics of the two features with a multi-objective criterion is the main advantage of this method.
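
    The "extract polynomial functions from the measured graphs" step can be sketched as an ordinary least-squares quadratic fit via the normal equations; the data below are synthetic points on y = 2 + 3x + 0.5x², not spindle measurements.

```python
# Least-squares fit of y = c0 + c1*x + c2*x^2 through sampled points,
# solving the normal equations A^T A c = A^T y by Gaussian elimination.
def fit_quadratic(xs, ys):
    n = 3
    ata = [[sum(x**(i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x**i for x, y in zip(xs, ys)) for i in range(n)]
    # Forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    # Back substitution
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(ata[r][c] * coef[c] for c in range(r + 1, n))
        coef[r] = (aty[r] - s) / ata[r][r]
    return coef

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 + 3.0 * x + 0.5 * x * x for x in xs]   # synthetic "measured graph"
c0, c1, c2 = fit_quadratic(xs, ys)
print(round(c0, 6), round(c1, 6), round(c2, 6))  # recovers 2.0, 3.0, 0.5
```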

  4. Spherical and plane integral operators for PDEs construction, analysis, and applications

    CERN Document Server

    Sabelfeld, Karl K

    2013-01-01

    The book presents integral formulations for partial differential equations, with the focus on spherical and plane integral operators. The integral relations are obtained for different elliptic and parabolic equations, and both direct and inverse mean value relations are studied. The derived integral equations are used to construct new numerical methods for solving relevant boundary value problems, both deterministic and stochastic based on probabilistic interpretation of the spherical and plane integral operators.

  5. A peaking-regulation-balance-based method for wind & PV power integrated accommodation

    Science.gov (United States)

    Zhang, Jinfang; Li, Nan; Liu, Jun

    2018-02-01

    The rapid development of China’s new energy, now and in the future, should focus on the cooperation of wind and PV power. Based on an analysis of the system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for the comprehensive integrated accommodation analysis of wind and PV power is put forward. From the electric power balance during the night peak load period of a typical day, the wind power installed capacity is determined first; the PV installed capacity can then be figured out from the midday peak load hours, which effectively resolves the uncertainty that arises when traditional methods must determine the combination of wind and solar power simultaneously. The simulation results validate the effectiveness of the proposed method.
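
    One possible reading of the two-step sizing logic, with purely illustrative numbers and capacity-credit factors (none are from the paper): the night peak balance, where PV contributes nothing, bounds the wind capacity; the midday peak hours, net of wind output, then bound the PV capacity.

```python
# Two-step accommodation sizing sketch. All margins and per-unit output
# factors are illustrative assumptions, not values from the paper.
night_margin_mw = 1200.0    # system margin available at the night peak
midday_margin_mw = 2000.0   # system margin available at the midday peak hours
wind_night_output = 0.3     # per-unit wind output credited at the night peak
wind_midday_output = 0.2    # per-unit wind output during midday peak hours
pv_midday_output = 0.8      # per-unit PV output during midday peak hours

# Step 1: PV is zero at night, so the night balance fixes wind capacity.
wind_capacity = night_margin_mw / wind_night_output
# Step 2: midday margin net of wind output fixes PV capacity.
pv_capacity = (midday_margin_mw - wind_capacity * wind_midday_output) / pv_midday_output
print(wind_capacity, pv_capacity)  # roughly 4000 MW wind, 1500 MW PV
```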

  6. Integrating ICT in Agriculture for Knowledge-Based Economy

    African Journals Online (AJOL)

    agriculture-based livelihoods, demands the integration of ICT knowledge with agriculture. .... (CGIAR) shows the vital role of Agricultural development in Rwanda's ... Network, Rwanda National Backbone Project, Regional Communication.

  7. 3-D fracture analysis using a partial-reduced integration scheme

    International Nuclear Information System (INIS)

    Leitch, B.W.

    1987-01-01

    This paper presents details of 3-D elastic-plastic analyses of an axially oriented external surface flaw in an internally pressurized thin-walled cylinder and discusses the variation of the J-integral values around the crack tip. A partial-reduced-integration-penalty method is introduced to minimize this variation of the J-integral near the crack tip. Utilizing 3-D symmetry, a one-eighth segment of a tube containing an elliptically shaped external surface flaw is modelled using 20-noded isoparametric elements. The crack-tip elements are collapsed to form a 1/r stress singularity about the curved crack front. The finite element model is subjected to internal pressure and axial pressure-generated loads. The virtual crack extension method is used to determine linear elastic stress intensity factors from the J-integral results at various points around the crack front. Despite the different material constants and the thinner wall thickness in this analysis, the elastic results compare favourably with those obtained by other researchers. The nonlinear stress-strain behaviour of the tube material is modelled using an incremental theory of plasticity. Variations of the J-integral values around the curved crack front of the 3-D flaw were seen. These variations could not be resolved by neglecting the immediate crack-tip elements' J-integral results in favour of the more remote contour paths, nor smoothed out by averaging all the path results. Numerical incompatibilities in the 20-noded 3-D finite elements used to model the surface flaw were found. A partial-reduced integration scheme, using a combination of full and reduced integration elements, is proposed to determine J-integral results for 3-D fracture analyses. This procedure is applied to the analysis of an external semicircular surface flaw projecting halfway into the tube wall thickness. Examples of the J-integral values, before and after the partial-reduced integration method is employed, are given around the

  8. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/Corba technology.

  9. Graphene based integrated tandem supercapacitors fabricated directly on separators

    KAUST Repository

    Chen, Wei

    2015-04-09

    It is of great importance to fabricate integrated supercapacitors with extended operation voltages as high energy density storage devices. In this work, we develop a novel direct electrode deposition on separator (DEDS) process to fabricate graphene based integrated tandem supercapacitors for the first time. The DEDS process generates compact graphene-polyaniline electrodes directly on the separators to form integrated supercapacitors. The integrated graphene-polyaniline tandem supercapacitors demonstrate ultrahigh volumetric energy density of 52.5 Wh L^(−1) at power density of 6037 W L^(−1) and excellent gravimetric energy density of 26.1 Wh kg^(−1) at power density of 3002 W kg^(−1) with outstanding electrochemical stability for over 10000 cycles. This study shows great promise for the future development of integrated energy storage devices.

  10. A taxonomy of integral reaction path analysis

    Energy Technology Data Exchange (ETDEWEB)

    Grcar, Joseph F.; Day, Marcus S.; Bell, John B.

    2004-12-23

    W. C. Gardiner observed that achieving understanding through combustion modeling is limited by the ability to recognize the implications of what has been computed and to draw conclusions about the elementary steps underlying the reaction mechanism. This difficulty can be overcome in part by making better use of reaction path analysis in the context of multidimensional flame simulations. Following a survey of current practice, an integral reaction flux is formulated in terms of conserved scalars that can be calculated in a fully automated way. Conditional analyses are then introduced, and a taxonomy for bidirectional path analysis is explored. Many examples illustrate the resulting path analysis and uncover some new results about nonpremixed methane-air laminar jets.
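
    The integral reaction flux formulated above is built up from per-reaction rates and stoichiometry. As a hedged illustration (a toy three-step mechanism with assumed rates, not the methane-air chemistry analysed in the paper), the pointwise species fluxes from which such integrals are assembled can be computed as:

```python
import numpy as np

# Hypothetical 3-step toy mechanism (invented for illustration):
#   R1: CH4 + OH  -> CH3  + H2O
#   R2: CH3 + O   -> CH2O + H
#   R3: CH2O + OH -> HCO  + H2O
species = ["CH4", "OH", "CH3", "H2O", "O", "CH2O", "H", "HCO"]
# Stoichiometric matrix S[k, i]: net coefficient of species i in reaction k.
S = np.array([
    [-1, -1,  1,  1,  0,  0,  0,  0],   # R1
    [ 0,  0, -1,  0, -1,  1,  1,  0],   # R2
    [ 0, -1,  0,  1,  0, -1,  0,  1],   # R3
])
rates = np.array([2.0, 1.5, 0.5])       # assumed net reaction rates (mol/m^3/s)

# Net production rate of each species: wdot_i = sum_k S[k, i] * r_k.
wdot = S.T @ rates

# An integral path analysis would integrate such fluxes over the flame;
# here we just report the pointwise carbon flux CH4 -> CH3 carried by R1
# (one carbon atom transferred per R1 event).
carbon_flux_ch4_to_ch3 = rates[0]
```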

  11. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    Science.gov (United States)

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken into account in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy.
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit

  12. Towards a framework for agent-based image analysis of remote-sensing data.

    Science.gov (United States)

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-04-03

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects' properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential for transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which has been extensively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA).

  13. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed, and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment average pressure, evaluated from the pressures at both ends. Nodal mass and energy conservation determined nodal flows and enthalpies, accounting for fluid compression and thermal expansion
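
    The segment-averaged momentum integral described above, in which the momentum equation is integrated over several linked nodes so that a single flow rate governs the whole segment, can be sketched in simplified form. The following is an assumed, minimal illustration for incompressible single-phase flow with invented geometry, not the MINET implementation:

```python
# Minimal sketch of a segment momentum integral: sum the inertia (L/A) and
# wall friction contributions of all nodes in the segment, then advance the
# single segment mass flow rate W with an explicit Euler step.
def step_segment_flow(W, p_in, p_out, nodes, dt):
    """Advance segment mass flow W (kg/s) by one explicit Euler step.

    nodes: list of dicts with length L (m), flow area A (m^2), friction
           factor f, hydraulic diameter D (m), and density rho (kg/m^3).
    """
    inertia = sum(n["L"] / n["A"] for n in nodes)          # sum of L_i / A_i
    friction = sum(
        n["f"] * n["L"] / n["D"] * W * abs(W) / (2.0 * n["rho"] * n["A"] ** 2)
        for n in nodes
    )
    dW_dt = (p_in - p_out - friction) / inertia
    return W + dt * dW_dt

# Invented geometry: four identical nodes linked into one segment.
node = {"L": 1.0, "A": 0.01, "f": 0.02, "D": 0.1, "rho": 800.0}
W = 5.0
for _ in range(100):
    W = step_segment_flow(W, p_in=2.0e5, p_out=1.0e5, nodes=[node] * 4, dt=1e-3)
```

The flow accelerates from its initial value toward the steady state at which friction balances the imposed pressure difference.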

  14. A comparison of sequential and information-based methods for determining the co-integration rank in heteroskedastic VAR models

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Angelis, Luca De; Rahbek, Anders

    2015-01-01

    In this article, we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular, we compare the efficacy of the most widely used information criteria, such as Akaike Information Criterion....... The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms......-based method to over-estimate the co-integration rank in relatively small sample sizes....
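
    An information-criterion rank choice of the kind compared in the article can be illustrated with a self-contained BIC-style selection on simulated data. This is a minimal sketch (Johansen-type eigenvalues without deterministic terms or lagged differences), not the authors' Monte Carlo design:

```python
import numpy as np

rng = np.random.default_rng(0)
T, p = 2000, 2

# Simulate a bivariate system with co-integration rank 1:
# x is a random walk and y tracks x, so (y - x) is stationary.
x = np.cumsum(rng.normal(size=T + 1))
y = x + rng.normal(size=T + 1)
data = np.column_stack([y, x])

R0 = np.diff(data, axis=0)          # first differences
R1 = data[:-1]                      # lagged levels
S00 = R0.T @ R0 / T
S11 = R1.T @ R1 / T
S01 = R0.T @ R1 / T

# Johansen-type eigenvalues: solve |lam*S11 - S10 S00^{-1} S01| = 0.
M = np.linalg.solve(S11, S01.T @ np.linalg.solve(S00, S01))
lams = np.sort(np.linalg.eigvals(M).real)[::-1]

# BIC-style selection: up to constants, -2*loglik(r) is
# T * sum_{i<=r} ln(1 - lam_i); the penalty counts r*(2p - r) parameters.
def bic(r):
    return T * np.sum(np.log(1.0 - lams[:r])) + np.log(T) * r * (2 * p - r)

rank = min(range(p + 1), key=bic)
```

With a strongly co-integrated simulated system, the criterion recovers rank 1.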

  15. Data integration for plant genomics--exemplars from the integration of Arabidopsis thaliana databases.

    Science.gov (United States)

    Lysenko, Artem; Hindle, Matthew Morritt; Taubert, Jan; Saqi, Mansoor; Rawlings, Christopher John

    2009-11-01

    The development of a systems-based approach to problems in plant sciences requires integration of existing information resources. However, the available information is currently often incomplete and dispersed across many sources, and the syntactic and semantic heterogeneity of the data is a challenge for integration. In this article, we discuss strategies for data integration and we use a graph-based integration method (Ondex) to illustrate some of these challenges with reference to two example problems concerning integration of (i) metabolic pathway and (ii) protein interaction data for Arabidopsis thaliana. We quantify the degree of overlap for three commonly used pathway and protein interaction information sources. For pathways, we find that the AraCyc database contains the widest coverage of enzyme reactions and for protein interactions we find that the IntAct database provides the largest unique contribution to the integrated dataset. For both examples, however, we observe a relatively small amount of data common to all three sources. Analysis and visual exploration of the integrated networks was used to identify a number of practical issues relating to the interpretation of these datasets. We demonstrate the utility of these approaches to the analysis of groups of coexpressed genes from an individual microarray experiment, in the context of pathway information and for the combination of coexpression data with an integrated protein interaction network.
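
    The overlap quantification described above reduces to set operations over shared identifiers. A toy sketch with invented reaction identifiers and source names (not the actual contents of the databases compared in the study):

```python
# Hypothetical reaction identifiers from three pathway sources.
source_a = {"R1", "R2", "R3", "R4", "R5"}
source_b = {"R2", "R3", "R6"}
source_c = {"R3", "R5", "R7"}

# Data common to all three sources, and the unique contribution of source A.
common_all = source_a & source_b & source_c
unique_a = source_a - source_b - source_c

# Coverage (number of reactions) per source.
coverage = {name: len(s) for name, s in
            [("A", source_a), ("B", source_b), ("C", source_c)]}
```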

  16. Integration, warehousing, and analysis strategies of Omics data.

    Science.gov (United States)

    Gedela, Srinubabu

    2011-01-01

    "-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts next to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need umpteen results at the various levels of cellular organization by utilizing different experimental designs, data analysis strategies and methodologies. Here comes the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies to merge Omics data by semantic data carriers, discusses controlled vocabularies as eXtensible Markup Languages (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.

  17. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    Science.gov (United States)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
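
    The MCDM evaluation step mentioned above can be illustrated with a simple weighted-sum sketch. The weights, scores, and the resulting winner below are invented for illustration and are not taken from the case study:

```python
# Toy multi-criteria decision-making (MCDM) comparison of two candidate
# architectures over mission performance, risk, and cost.
criteria = ["mission_performance", "risk", "cost"]
weights = {"mission_performance": 0.5, "risk": 0.3, "cost": 0.2}

# Normalized scores in [0, 1]; higher is better (risk and cost are
# already inverted so that a high score means low risk / low cost).
scores = {
    "distributed": {"mission_performance": 0.9, "risk": 0.8, "cost": 0.5},
    "monolithic": {"mission_performance": 0.7, "risk": 0.4, "cost": 0.9},
}

totals = {
    arch: sum(weights[c] * s[c] for c in criteria)
    for arch, s in scores.items()
}
best = max(totals, key=totals.get)
```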

  18. Permanent Magnet Eddy Current Loss Analysis of a Novel Motor Integrated Permanent Magnet Gear

    DEFF Research Database (Denmark)

    Zhang, Yuqiu; Lu, Kaiyuan; Ye, Yunyue

    2012-01-01

    In this paper, a new motor integrated permanent magnet gear (MIPMG) is discussed. The focus is on eddy current loss analysis associated with permanent magnets (PMs). A convenient model of MIPMG is provided based on a 2-D field-motion coupled time-stepping finite element method for transient eddy current analysis. The model takes the eddy current effect of PMs into account in determination of the magnetic field in the air-gap and in the magnet regions. The eddy current losses generated in the magnets are properly interpreted. Design improvements for reducing the eddy current losses are suggested...

  19. Magnetic field integral equation analysis of surface plasmon scattering by rectangular dielectric channel discontinuities.

    Science.gov (United States)

    Chremmos, Ioannis

    2010-01-01

    The scattering of a surface plasmon polariton (SPP) by a rectangular dielectric channel discontinuity is analyzed through a rigorous magnetic field integral equation method. The scattering phenomenon is formulated by means of the magnetic-type scalar integral equation, which is subsequently treated through an entire-domain Galerkin method of moments (MoM), based on a Fourier-series plane wave expansion of the magnetic field inside the discontinuity. The use of Green's function Fourier transform allows all integrations over the area and along the boundary of the discontinuity to be performed analytically, resulting in a MoM matrix with entries that are expressed as spectral integrals of closed-form expressions. Complex analysis techniques, such as Cauchy's residue theorem and the saddle-point method, are applied to obtain the amplitudes of the transmitted and reflected SPP modes and the radiated field pattern. Through numerical results, we examine the wavelength selectivity of transmission and reflection against the channel dimensions as well as the sensitivity to changes in the refractive index of the discontinuity, which is useful for sensing applications.
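
    An entire-domain Galerkin MoM of the kind used above can be illustrated on a much simpler 1-D analogue: a Fredholm integral equation discretized with a cosine (Fourier-type) basis spanning the whole domain. This sketch is illustrative only and is not the paper's SPP formulation:

```python
import numpy as np

# Solve  f(x) + int_0^1 K(x, x') f(x') dx' = g(x)  on [0, 1]
# with K(x, x') = exp(-|x - x'|) and g(x) = 1, via entire-domain Galerkin MoM.
N, Q = 16, 200                             # basis size, quadrature points
xq, wq = np.polynomial.legendre.leggauss(Q)
xq = 0.5 * (xq + 1.0)                      # map Gauss nodes to [0, 1]
wq = 0.5 * wq

def basis(n, x):
    return np.cos(n * np.pi * x)           # entire-domain cosine basis

K = np.exp(-np.abs(xq[:, None] - xq[None, :]))
B = np.array([basis(n, xq) for n in range(N)])          # (N, Q)

# Galerkin matrix entries <b_m, b_n> + <b_m, K b_n>, all by quadrature.
gram = (B * wq) @ B.T
Kop = (B * wq) @ K @ (wq[:, None] * B.T)
A = gram + Kop
rhs = (B * wq) @ np.ones(Q)

coeffs = np.linalg.solve(A, rhs)
f = coeffs @ B                             # solution at the quadrature nodes

# Residual check: f + K f should reproduce g = 1.
residual = np.max(np.abs(f + K @ (wq * f) - 1.0))
```

The residual shrinks as the basis grows, which is the practical appeal of entire-domain expansions for smooth fields.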

  20. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 1-Decision Support System.

    Science.gov (United States)

    Davidson, Rachel A; Nozick, Linda K; Wachtendorf, Tricia; Blanton, Brian; Colle, Brian; Kolar, Randall L; DeYoung, Sarah; Dresback, Kendra M; Yi, Wenqi; Yang, Kun; Leonardo, Nicholas

    2018-03-30

    This article introduces a new integrated scenario-based evacuation (ISE) framework to support hurricane evacuation decision making. It explicitly captures the dynamics, uncertainty, and human-natural system interactions that are fundamental to the challenge of hurricane evacuation, but have not been fully captured in previous formal evacuation models. The hazard is represented with an ensemble of probabilistic scenarios, population behavior with a dynamic decision model, and traffic with a dynamic user equilibrium model. The components are integrated in a multistage stochastic programming model that minimizes risk and travel times to provide a tree of evacuation order recommendations and an evaluation of the risk and travel time performance for that solution. The ISE framework recommendations offer an advance in the state of the art because they: (1) are based on an integrated hazard assessment (designed to ultimately include inland flooding), (2) explicitly balance the sometimes competing objectives of minimizing risk and minimizing travel time, (3) offer a well-hedged solution that is robust under the range of ways the hurricane might evolve, and (4) leverage the substantial value of increasing information (or decreasing degree of uncertainty) over the course of a hurricane event. A case study for Hurricane Isabel (2003) in eastern North Carolina is presented to demonstrate how the framework is applied, the type of results it can provide, and how it compares to available methods of a single scenario deterministic analysis and a two-stage stochastic program. © 2018 Society for Risk Analysis.
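
    The scenario-ensemble decision logic can be sketched in miniature: choose an evacuation order time that minimizes expected cost over a set of weighted hurricane scenarios, trading off risk against travel time and early-order disruption. All numbers and cost terms below are invented for illustration and are not from the ISE case study:

```python
# Toy scenario ensemble: (probability, landfall_hour, severity).
scenarios = [
    (0.5, 36, 1.0),
    (0.3, 30, 1.5),
    (0.2, 24, 2.0),
]
RISK_WEIGHT = 10.0                         # trade-off between risk and time

def cost(order_hour, landfall_hour, severity):
    lead = landfall_hour - order_hour      # hours available to evacuate
    risk = severity * max(0.0, 12.0 - lead)    # exposure when lead < 12 h
    travel = max(0.0, 24.0 - lead)             # congestion when ordered late
    disruption = 0.4 * lead                    # cost of ordering very early
    return RISK_WEIGHT * risk + travel + disruption

def expected_cost(order_hour):
    return sum(p * cost(order_hour, lf, sev) for p, lf, sev in scenarios)

# Evaluate candidate order times (every 2 h) and pick the best hedge.
best_order = min(range(0, 25, 2), key=expected_cost)
```

The chosen order time balances the competing objectives across the whole ensemble rather than optimizing for any single scenario, which is the essence of the well-hedged solution described above.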

  1. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  2. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  3. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied