WorldWideScience

Sample records for integrating aeroheating analysis

  1. Inviscid/Boundary-Layer Aeroheating Approach for Integrated Vehicle Design

    Science.gov (United States)

    Lee, Esther; Wurster, Kathryn E.

    2017-01-01

    A typical entry vehicle design depends on the synthesis of many essential subsystems, including thermal protection system (TPS), structures, payload, avionics, and propulsion, among others. The ability to incorporate aerothermodynamic considerations and TPS design into the early design phase is crucial, as both are closely coupled to the vehicle's aerodynamics, shape and mass. In the preliminary design stage, reasonably accurate results with rapid turn-around times for parametric studies and quickly evolving configurations are necessary to steer design decisions. This investigation considers the use of an unstructured 3D inviscid code in conjunction with an integral boundary-layer method; the former providing the flow field solution and the latter the surface heating. Sensitivity studies for Mach number, angle of attack, and altitude examine the feasibility of using this approach to populate a representative entry flight envelope based on a limited set of inviscid solutions. Each inviscid solution is used to generate surface heating over the nearby trajectory space. A subset of a representative entry envelope was explored. Initial results suggest that for Mach numbers ranging from 9-20, a few inviscid solutions could reasonably support surface heating predictions for Mach number variations of +/-2, altitude variations of +/-10 to 20 kft, and angle-of-attack variations of +/-5 deg. Agreement with Navier-Stokes solutions was generally found to be within 10-15% for Mach number and altitude, and 20% for angle of attack. Results of the angle-of-attack sensitivity studies show that smaller increments may be needed for better heating predictions; a smaller angle-of-attack increment than the 5 deg considered in this study is recommended. The approach is well suited for application to conceptual multidisciplinary design and analysis studies where transient aeroheating environments are critical for vehicle TPS and thermal design. Concurrent prediction of aeroheating

  2. Solar Tower Experiments for Radiometric Calibration and Validation of Infrared Imaging Assets and Analysis Tools for Entry Aero-Heating Measurements

    Science.gov (United States)

    Splinter, Scott C.; Daryabeigi, Kamran; Horvath, Thomas J.; Mercer, David C.; Ghanbari, Cheryl M.; Ross, Martin N.; Tietjen, Alan; Schwartz, Richard J.

    2008-01-01

    The NASA Engineering and Safety Center sponsored Hypersonic Thermodynamic Infrared Measurements assessment team has a task to perform radiometric calibration and validation of land-based and airborne infrared imaging assets and tools for remote thermographic imaging. The IR assets and tools will be used for thermographic imaging of the Space Shuttle Orbiter during entry aero-heating to provide flight boundary layer transition thermography data that could be utilized for calibration and validation of empirical and theoretical aero-heating tools. A series of tests at the Sandia National Laboratories National Solar Thermal Test Facility were designed for this task where reflected solar radiation from a field of heliostats was used to heat a 4 foot by 4 foot test panel consisting of LI 900 ceramic tiles located on top of the 200 foot tall Solar Tower. The test panel provided an Orbiter-like entry temperature for the purposes of radiometric calibration and validation. The Solar Tower provided an ideal test bed for this series of radiometric calibration and validation tests because it had the potential to rapidly heat the large test panel to spatially uniform and non-uniform elevated temperatures. Also, the unsheltered-open-air environment of the Solar Tower was conducive to obtaining unobstructed radiometric data by land-based and airborne IR imaging assets. Various thermocouples installed on the test panel and an infrared imager located in close proximity to the test panel were used to obtain surface temperature measurements for evaluation and calibration of the radiometric data from the infrared imaging assets. The overall test environment, test article, test approach, and typical test results are discussed.

  3. CFD on hypersonic flow geometries with aeroheating

    Science.gov (United States)

    Sohail, Muhammad Amjad; Chao, Yan; Hui, Zhang Hui; Ullah, Rizwan

    2012-11-01

    The hypersonic flowfield around a blunted cone and cone-flare exhibits some of the major features of the flows around space vehicles, e.g. a detached bow shock in the stagnation region and the oblique shock wave/boundary layer interaction at the cone-flare junction. The shock wave/boundary layer interaction can produce a region of separated flow. This phenomenon may occur, for example, at the upstream-facing corner formed by a deflected control surface on a hypersonic entry vehicle, where the length of separation has implications for control effectiveness. Computational fluid-dynamics results are presented to show the flowfield around blunted cone and cone-flare configurations in hypersonic flow with separation. This problem is of particular interest since it features most of the aspects of the hypersonic flow around planetary entry vehicles. The region between the cone and the flare is particularly critical with respect to the evaluation of the surface pressure and heat flux with aeroheating. Indeed, flow separation is induced by the shock wave/boundary layer interaction, with subsequent flow reattachment, which can dramatically enhance the surface heat transfer. The exact determination of the extent of the recirculation zone is a particularly delicate task for numerical codes. Laminar and turbulent computations have been carried out using a full Navier-Stokes solver, with freestream conditions provided by experimental data obtained in Mach 6, 8, and 16.34 wind tunnels. The numerical results are compared with the measured pressure and surface heat flux distributions in the wind tunnel and good agreement is found, especially on the length of the recirculation region and the location of shock waves. The critical physics of the entropy layer, boundary layers, shock wave/boundary layer interaction and the flow behind the shock are properly captured and elaborated. Hypersonic flows are characterized by high Mach number and high total enthalpy. An elevated

  4. Error estimation for CFD aeroheating prediction under rarefied flow condition

    Science.gov (United States)

    Jiang, Yazhong; Gao, Zhenxun; Jiang, Chongwen; Lee, Chunhian

    2014-12-01

    Both direct simulation Monte Carlo (DSMC) and computational fluid dynamics (CFD) methods have become widely used for aerodynamic prediction when reentry vehicles experience different flow regimes during flight. The implementation of slip boundary conditions in the traditional CFD method under the Navier-Stokes-Fourier (NSF) framework can extend the validity of this approach further into the transitional regime, with the benefit that much less computational cost is demanded compared to DSMC simulation. Correspondingly, an increasing error arises in aeroheating calculation as the flow becomes more rarefied. To estimate the relative error of heat flux when applying this method to a rarefied flow in the transitional regime, a theoretical derivation is conducted and a dimensionless parameter ɛ is proposed by approximately analyzing the ratio of the second-order term to the first-order term in the heat flux expression of the Burnett equations. DSMC simulation of hypersonic flow over a cylinder in the transitional regime is performed to test the performance of the parameter ɛ, compared with two other parameters, Knρ and Ma·Knρ.
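
    The abstract does not define Knρ; a common form of this kind of continuum-breakdown parameter is the gradient-length-local Knudsen number, Kn = λ|∇ρ|/ρ. The sketch below is an illustration of how such a parameter could be evaluated on a 1-D density profile; the molecular diameter, profile shape, mean free path, and the 0.05 breakdown threshold are all assumed values, not the paper's.

```python
import numpy as np

def mean_free_path(T, p, d=4.17e-10, k_B=1.380649e-23):
    """Hard-sphere mean free path; d is the molecular diameter (N2 assumed)."""
    return k_B * T / (np.sqrt(2.0) * np.pi * d**2 * p)

def kn_rho(rho, x, lam):
    """Gradient-length-local Knudsen number based on density:
    Kn_rho = lambda * |d(rho)/dx| / rho."""
    drho_dx = np.gradient(rho, x)
    return lam * np.abs(drho_dx) / rho

# Synthetic 1-D density profile through a smeared shock layer (illustrative)
x = np.linspace(0.0, 0.1, 200)                                   # m
rho = 1e-4 * (1.0 + 3.0 / (1.0 + np.exp(-(x - 0.05) / 0.005)))   # kg/m^3
lam = 2e-3                                                       # m, assumed

kn = kn_rho(rho, x, lam)
# A frequently used continuum-breakdown heuristic flags cells with Kn_rho > 0.05
breakdown = kn > 0.05
print(kn.max(), breakdown.any())
```

    At this assumed rarefaction level the parameter exceeds the threshold inside the density gradient, which is exactly the region where a CFD aeroheating prediction would be suspect.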

  5. Real-Time Simulation of Aeroheating of the Hyper-X Airplane

    Science.gov (United States)

    Gong, Les

    2005-01-01

    A capability for real-time computational simulation of aeroheating has been developed in support of the Hyper-X program, which is directed toward demonstrating the feasibility of operating an air-breathing ramjet/scramjet engine at Mach 5, Mach 7, and Mach 10. The simulation software will serve as a valuable design tool for initial trajectory studies in which aerodynamic heating is expected to exert a major influence on the design of the Hyper-X airplane; this tool will aid in the selection of materials, sizing of structural skin thicknesses, and selection of components of a thermal-protection system (TPS) for structures that must be insulated against aeroheating.
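
    As a rough illustration of the kind of calculation a real-time aeroheating simulation must integrate along a trajectory (this is not the Hyper-X tool itself), a thin-skin energy balance can be stepped explicitly in time. All material properties and the heating pulse below are assumed values for illustration.

```python
import math

def simulate_skin_temp(q_aero, dt, t_end, thickness,
                       rho=2700.0, cp=900.0, eps=0.8,
                       T0=300.0, sigma=5.670374419e-8):
    """Explicit-Euler integration of a thin-skin energy balance:
       rho*cp*thickness * dT/dt = q_aero(t) - eps*sigma*T^4
    q_aero is a callable returning the aeroheating rate (W/m^2) at time t."""
    T, t = T0, 0.0
    history = [(t, T)]
    while t < t_end:
        q_net = q_aero(t) - eps * sigma * T**4   # convective in, radiative out
        T += dt * q_net / (rho * cp * thickness)
        t += dt
        history.append((t, T))
    return history

# Constant 50 kW/m^2 pulse on a 3 mm aluminum-like skin (illustrative values)
hist = simulate_skin_temp(lambda t: 5.0e4, dt=0.05, t_end=60.0, thickness=0.003)
print(hist[-1])
```

    Stepping a lumped model like this is cheap enough to run faster than real time, which is the property such a trajectory-study tool needs.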

  6. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  7. Aeroheating Test of CEV Entry Vehicle at Turbulent Conditions

    Science.gov (United States)

    Hollis, Brian R.; Berger, Karen T.; Horvath, Thomas J.; Coblish, Joseph J.; Norris, Joseph D.; Lillard, Randolph P.; Kirk, Ben

    2008-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Entry Vehicle has been performed in the Arnold Engineering Development Center Tunnel 9. Data were measured on an approx. 3.5% scale model (0.1778-m/7-inch diam.) of the vehicle using coaxial thermocouples in the Mach 8 and Mach 10 nozzles of Tunnel 9. Runs were performed at free stream Reynolds numbers of 1x10(exp 6)/ft to 20x10(exp 6)/ft in the Mach 10 nozzle and 8x10(exp 6)/ft to 48x10(exp 6)/ft in the Mach 8 nozzle. The test gas in Tunnel 9 is pure N2, which at these operating conditions remains un-dissociated and may be treated as a perfect gas. At these conditions, laminar, transitional, and turbulent flow was produced on the model at Mach 10, and transitional and turbulent conditions were produced on the model at Mach 8. The majority of runs were made on a clean, smooth-surface model configuration and a limited number of runs were made in which inserts with varying boundary-layer trip configurations were used to force the occurrence of transition. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the data for the purpose of helping to define uncertainty margins for the computational method. Data from both the wind tunnel test and the computations are presented herein. Figure 1 shows a schematic of the thermocouple locations on the model and figures 2 and 3 show a photo and schematic of the AEDC Hypervelocity Tunnel 9. Figure 4 shows a typical grid used in the computations. From the comparisons shown in figures 5 through 8 it was concluded that for perfect-gas conditions, the computations could predict either fully-laminar or fully-turbulent flow to within +/-10% of the experimental data. The experimental data showed that transition began on the leeside of the heatshield at a free stream Reynolds number of 9x10(exp 6)/ft in the Mach 10 nozzle and fully-developed turbulent flow was produced at 20x10(exp 6)/ft. In the Mach 8

  8. Features of the Upgraded Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software

    Science.gov (United States)

    Mason, Michelle L.; Rufer, Shann J.

    2016-01-01

    The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) software is used at the NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used in the design of thermal protection systems for hypersonic vehicles that are exposed to severe aeroheating loads, such as reentry vehicles during descent and landing procedures. This software program originally was written in the PV-WAVE(Registered Trademark) programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the program was migrated to MATLAB(Registered Trademark) syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to perform diagnostic checks of the accuracy of the acquired data during a wind tunnel test, to extract data along a specified multi-segment line following a feature such as a leading edge or a streamline, and to batch process all of the temporal frame data from a wind tunnel run. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy software to validate the program. The absolute differences between the heat transfer data output from the two programs were on the order of 10(exp -5) to 10(exp -7). IHEAT 4.0 replaces the PV-WAVE(Registered Trademark) version as the production software for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.
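
    The one-dimensional, semi-infinite heating reduction mentioned above can be illustrated with the classical constant-flux solution for a semi-infinite solid (a simplification; IHEAT's actual time-history reduction is more elaborate). The substrate properties below are assumed values, not Langley's.

```python
import math

def step_heating_q(dT, t, rho, c, k):
    """Heat flux that produces a surface temperature rise dT after time t on a
    semi-infinite solid under a constant applied flux:
        T_s(t) - T_i = 2*q*sqrt(t) / sqrt(pi*rho*c*k)
    solved here for q."""
    return dT * math.sqrt(math.pi * rho * c * k) / (2.0 * math.sqrt(t))

# Illustrative ceramic-substrate properties (assumed)
rho, c, k = 1700.0, 800.0, 1.5   # kg/m^3, J/(kg K), W/(m K)
q = step_heating_q(dT=40.0, t=2.0, rho=rho, c=c, k=k)   # W/m^2
print(q)
```

    The inverse relation is what lets a measured surface-temperature image (phosphor or thermocouple) be converted into a global heat-flux map.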

  9. An Upgrade of the Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software

    Science.gov (United States)

    Mason, Michelle L.; Rufer, Shann J.

    2015-01-01

    The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) code is used at NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used to design thermal protection systems to mitigate the risks due to the aeroheating loads on hypersonic vehicles, such as re-entry vehicles during descent and landing procedures. This code was originally written in the PV-WAVE programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the code was migrated to MATLAB syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to batch process all of the data from a wind tunnel run, to map the two-dimensional heating distribution to a three-dimensional computer-aided design model of the vehicle to be viewed in Tecplot, and to extract data from a segmented line that follows an interesting feature in the data. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy code to validate the program. The differences between the two codes were on the order of 10(exp -5) to 10(exp -7). IHEAT 4.0 replaces the PV-WAVE version as the production code for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.

  10. Experimental Investigation of Project Orion Crew Exploration Vehicle Aeroheating in AEDC Tunnel 9

    Science.gov (United States)

    Hollis, Brian R.; Horvath, Thomas J.; Berger, Karen T.; Lillard, Randolph P.; Kirk, Benjamin S.; Coblish, Joseph J.; Norris, Joseph D.

    2008-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Entry Vehicle has been performed in the Arnold Engineering Development Center Tunnel 9. The goals of this test were to measure turbulent heating augmentation levels on the heat shield and to obtain high-fidelity heating data for assessment of computational fluid dynamics methods. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the data for the purpose of helping to define uncertainty margins for the computational method. Data from both the wind tunnel test and the computational study are presented herein.

  11. Aero-Heating of Shallow Cavities in Hypersonic Freestream Flow

    Science.gov (United States)

    Everhart, Joel L.; Berger, Karen T.; Merski, N. R., Jr.; Woods, William A.; Hollingsworth, Kevin E.; Hyatt, Andrew; Prabhu, Ramadas K.

    2010-01-01

    The purpose of these experiments and analysis was to augment the heating database and tools used for assessment of impact-induced shallow-cavity damage to the thermal protection system of the Space Shuttle Orbiter. The effect of length and depth on the local heating disturbance of rectangular cavities tested at hypersonic freestream conditions has been globally assessed using the two-color phosphor thermography method. These rapid-response experiments were conducted in the Langley 31-Inch Mach 10 Tunnel and were initiated immediately prior to the launch of STS-114, the initial flight in the Space Shuttle Return-To-Flight Program, and continued during the first week of the mission. Previously-designed and numerically-characterized blunted-nose baseline flat plates were used as the test surfaces. Three-dimensional computational predictions of the entire model geometry were used as a check on the design process and the two-dimensional flow assumptions used for the data analysis. The experimental boundary layer state conditions were inferred using the measured heating distributions on a no-cavity test article. Two test plates were developed, each containing 4 equally-spaced spanwise-distributed cavities. The first test plate contained cavities with a constant length-to-depth ratio of 8 with design point depth-to-boundary-layer-thickness ratios of 0.1, 0.2, 0.35, and 0.5. The second test plate contained cavities with a constant design point depth-to-boundary-layer-thickness ratio of 0.35 with length-to-depth ratios of 8, 12, 16, and 20. Cavity design parameters and the test condition matrix were established using the computational predictions. Preliminary results indicate that the floor-averaged Bump Factor (local heating rate nondimensionalized by upstream reference) at the tested conditions is approximately 0.3 with a standard deviation of 0.04 for laminar-in/laminar-out conditions when the cavity length-to-boundary-layer-thickness ratio is between 2.5 and 10 and for
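
    The floor-averaged Bump Factor statistic quoted above is simply the local-to-reference heating ratio averaged over the cavity floor. A minimal sketch, using hypothetical heating values chosen only to mimic the reported ~0.3 average:

```python
import numpy as np

def bump_factor(q_floor, q_ref):
    """Bump Factor: local heating rate nondimensionalized by an upstream
    smooth-surface reference value."""
    return np.asarray(q_floor) / q_ref

# Hypothetical cavity-floor heating samples (W/cm^2) and upstream reference
q_floor = np.array([2.8, 3.1, 2.6, 3.3, 2.9])
q_ref = 10.0

bf = bump_factor(q_floor, q_ref)
print(bf.mean(), bf.std())   # floor-averaged value and scatter
```

    A floor-averaged value well below 1 reflects the sheltering of a shallow open cavity floor relative to the undisturbed surface.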

  12. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of the piping system for nuclear power plants has become larger in scale and in quantity. In addition, higher-quality analysis is nowadays regarded as being of major importance from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a database system in which all information is concentrated. 2. To minimize manual processes in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of the above philosophy, four subsystems were developed. 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information in the database. Piping structural analysis can be performed by using the analysis subsystem. Isometric piping drawings and mode shapes, etc. can be plotted by using the plotting subsystem. A complete analysis report can be produced without manual processing through the report subsystem. (author)

  13. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion of the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  14. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 `integrated sequence analysis` (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term `methodology` denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion of the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  15. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios Ladoukakis

    2014-11-01

    The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e. Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management, and even the storage, of these data critical bottlenecks in the overall analytical endeavor. The complexity is further aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential, for each analytical task, in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, making the utilization of cloud computing infrastructures the sole realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, covering the major sequencing technologies and applications.

  16. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  17. Integrated minicomputer alpha analysis system

    International Nuclear Information System (INIS)

    Vasilik, D.G.; Coy, D.E.; Seamons, M.; Henderson, R.W.; Romero, L.L.; Thomson, D.A.

    1978-01-01

    Approximately 1,000 stack and occupational air samples from plutonium and uranium facilities at LASL are analyzed daily. The concentrations of radionuclides in air are determined by measuring absolute alpha activities of particulates collected on air sample filter media. The Integrated Minicomputer Pulse system (IMPULSE) is an interface between many detectors of extremely simple design and a Digital Equipment Corporation (DEC) PDP-11/04 minicomputer. The detectors are photomultiplier tubes faced with zinc sulfide (ZnS). The average detector background is approximately 0.07 cpm. The IMPULSE system includes two mainframes, each of which can hold up to 64 detectors. The current hardware configuration includes 64 detectors in one mainframe and 40 detectors in the other. Each mainframe contains a minicomputer with 28K words of Random Access Memory. One minicomputer controls the detectors in both mainframes. A second computer was added for fail-safe redundancy and to support other laboratory computer requirements. The main minicomputer includes a dual floppy disk system and a dual DEC 'RK05' disk system for mass storage. The RK05 facilitates report generation and trend analysis. The IMPULSE hardware provides for passage of data from the detectors to the computer, and for passage of status and control information from the computer to the detector stations.

  18. Aeroheating Testing and Predictions for Project Orion CEV at Turbulent Conditions

    Science.gov (United States)

    Hollis, Brian R.; Berger, Karen T.; Horvath, Thomas J.; Coblish, Joseph J.; Norris, Joseph D.; Lillard, Randolph P.; Kirk, Benjamin S.

    2009-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Exploration Vehicle was performed in the Arnold Engineering Development Center Hypervelocity Wind Tunnel No. 9 Mach 8 and Mach 10 nozzles and in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel. Heating data were obtained using a thermocouple-instrumented approx. 0.035-scale model (0.1778-m/7-inch diameter) of the flight vehicle. Runs were performed in the Tunnel 9 Mach 10 nozzle at free stream unit Reynolds numbers of 1x10(exp 6)/ft to 20x10(exp 6)/ft, in the Tunnel 9 Mach 8 nozzle at free stream unit Reynolds numbers of 8x10(exp 6)/ft to 48x10(exp 6)/ft, and in the 20-Inch Mach 6 Air Tunnel at free stream unit Reynolds numbers of 1x10(exp 6)/ft to 7x10(exp 6)/ft. In both facilities, enthalpy levels were low and the test gas (N2 in Tunnel 9 and air in the 20-Inch Mach 6) behaved as a perfect gas. These test conditions produced laminar, transitional and turbulent data in the Tunnel 9 Mach 10 nozzle, transitional and turbulent data in the Tunnel 9 Mach 8 nozzle, and laminar and transitional data in the 20-Inch Mach 6 Air Tunnel. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the experimental data to help define the accuracy of the computational method. In general, it was found that both laminar data and predictions, and turbulent data and predictions, agreed to within the estimated 12% experimental uncertainty. Laminar heating distributions from all three data sets were shown to correlate well and demonstrated Reynolds number independence when expressed in terms of the Stanton number based on adiabatic wall-recovery enthalpy. Transition onset locations on the leeside centerline were determined from the data and correlated in terms of boundary-layer parameters. Finally, turbulent heating augmentation ratios were determined for several body-point locations and correlated in terms of the
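
    The Stanton number based on adiabatic wall-recovery enthalpy referred to above has a standard perfect-gas form. In the sketch below the recovery-factor expressions are the conventional flat-plate ones, and the flow condition numbers are assumed for illustration only, not the test's values.

```python
import math

def stanton(q_w, rho_inf, u_inf, h_inf, h_w, Pr=0.71, turbulent=False):
    """Stanton number based on adiabatic-wall (recovery) enthalpy:
        h_aw = h_inf + r * u_inf**2 / 2
        r    = Pr**(1/2) (laminar) or Pr**(1/3) (turbulent), conventional values
        St   = q_w / (rho_inf * u_inf * (h_aw - h_w))"""
    r = Pr ** (1.0 / 3.0) if turbulent else math.sqrt(Pr)
    h_aw = h_inf + r * u_inf**2 / 2.0
    return q_w / (rho_inf * u_inf * (h_aw - h_w))

# Illustrative cold-hypersonic N2 condition, perfect gas (assumed numbers)
cp = 1040.0                                    # J/(kg K), N2
St = stanton(q_w=2.0e5, rho_inf=0.02, u_inf=1400.0,
             h_inf=cp * 60.0, h_w=cp * 300.0, turbulent=True)
print(St)
```

    Normalizing by the recovery-enthalpy difference rather than a fixed reference is what removes the Reynolds number dependence the abstract describes, since the driving potential for heat transfer scales out.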

  19. Problems in mathematical analysis III integration

    CERN Document Server

    Kaczor, W J

    2003-01-01

    We learn by doing. We learn mathematics by doing problems. This is the third volume of Problems in Mathematical Analysis. The topic here is integration for real functions of one real variable. The first chapter is devoted to the Riemann and the Riemann-Stieltjes integrals. Chapter 2 deals with Lebesgue measure and integration. The authors include some famous, and some not so famous, integral inequalities related to Riemann integration. Many of the problems for Lebesgue integration concern convergence theorems and the interchange of limits and integrals. The book closes with a section on Fourier series, with a concentration on Fourier coefficients of functions from particular classes and on basic theorems for convergence of Fourier series. The book is primarily geared toward students in analysis, as a study aid, for problem-solving seminars, or for tutorials. It is also an excellent resource for instructors who wish to incorporate problems into their lectures. Solutions for the problems are provided in the boo...

  20. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  1. Integrated logistic support analysis system

    International Nuclear Information System (INIS)

    Carnicero Iniguez, E.J.; Garcia de la Sen, R.

    1993-01-01

    Integrating logistic support into a system results in a large volume of information having to be managed, which can only be achieved with the help of computer applications. Both past experience and growing needs in such tasks have led Empresarios Agrupados to undertake an ambitious development project, which is described in this paper. (author)

  2. Analysis of integrated energy systems

    International Nuclear Information System (INIS)

    Matsuhashi, Takaharu; Kaya, Yoichi; Komiyama, Hiroshi; Hayashi, Taketo; Yasukawa, Shigeru.

    1988-01-01

    World attention is now attracted to the concept of the Novel Horizontally Integrated Energy System (NHIES). In NHIES, all fossil fuels are first converted into CO and H2. Potential environmental contaminants such as sulfur are removed during this process. CO turbines are mainly used to generate electric power. Combustion is performed in pure oxygen produced through air separation, making it possible to completely prevent the formation of thermal NOx. Thus, NHIES would release very small amounts of the substances that contribute to acid rain. In this system, the intermediate energy sources CO, H2 and O2 are integrated horizontally. They are combined appropriately to produce a specific form of final energy source. The integration of intermediate energy sources can provide a wide variety of final energy sources, allowing any type of fossil fuel to serve as an alternative to other types of fossil fuel. Another feature of NHIES is the positive use of nuclear fuel to reduce the formation of CO2. Studies are under way in Japan to develop a new concept of integrated energy system. These studies are especially aimed at improved overall efficiency and at the introduction of new liquid fuels that are high in conversion efficiency. Considerations are made on the final form of energy source, robust control, acid fallout, and CO2 reduction. (Nogami, K.)

  3. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across human society a decade ago. They pose a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of people with matched information), which requires integrative analysis in biology and biomedicine and calls for the development of data integration to address the shift from population-guided to individual-guided investigations. Data integration is an effective way to solve complex problems and understand complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches for biological data have two modes: one is the "bottom-up integration" mode with follow-up manual integration, and the other is the "top-down integration" mode with follow-up in silico integration. This paper first summarizes combinatory analysis approaches, to give a candidate protocol for biological experiment design for effective integrative studies on genomics, and then surveys data fusion approaches, to give helpful instruction on computational model development for detecting biological significance; these have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, the problems and future directions are highlighted for integrative analysis of omics big data.

  4. Containment integrity analysis under accidents

    International Nuclear Information System (INIS)

    Lin Chengge; Zhao Ruichang; Liu Zhitao

    2010-01-01

    Containment integrity analyses for current nuclear power plants (NPPs) mainly focus on the internal pressure caused by design basis accidents (DBAs). In addition to the analyses of containment pressure response caused by DBAs, the behavior of the containment during severe accidents (SAs) is also evaluated for the AP1000 NPP. Since conservatism remains in the assumptions, boundary conditions and codes, the margin in the results of containment integrity analyses may be overestimated. As knowledge of the phenomena and processes of the relevant accidents improves, the overrated margin can be appropriately reduced by using best-estimate codes combined with uncertainty methods, which could benefit the containment design and construction of large passive plants (LPP) in China. (authors)

  5. Integrative biological analysis for neuropsychopharmacology.

    Science.gov (United States)

    Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L

    2014-01-01

    Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches--proteomics, transcriptomics, metabolomics, and glycomics--have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.

  6. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  7. Integrative cluster analysis in bioinformatics

    CERN Document Server

    Abu-Jamous, Basel; Nandi, Asoke K

    2015-01-01

    Clustering techniques are increasingly being put to use in the analysis of high-throughput biological datasets. Novel computational techniques to analyse high throughput data in the form of sequences, gene and protein expressions, pathways, and images are becoming vital for understanding diseases and future drug discovery. This book details the complete pathway of cluster analysis, from the basics of molecular biology to the generation of biological knowledge. The book also presents the latest clustering methods and clustering validation, thereby offering the reader a comprehensive review o

  8. Integral data analysis for resonance parameters determination

    International Nuclear Information System (INIS)

    Larson, N.M.; Leal, L.C.; Derrien, H.

    1997-09-01

    Neutron time-of-flight experiments have long been used to determine resonance parameters. Those resonance parameters have then been used in calculations of integral quantities such as Maxwellian averages or resonance integrals, and the results of those calculations in turn have been used as a criterion for acceptability of the resonance analysis. However, the calculations were inadequate because covariances on the parameter values were not included. In this report an effort to correct that deficiency is documented: the R-matrix analysis code SAMMY has been modified to include integral quantities of importance directly within the resonance parameter analysis, and to determine the best fit to both differential (microscopic) and integral (macroscopic) data simultaneously. This modification was implemented because it is expected to have an impact on the intermediate-energy range that is important for criticality safety applications.
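For reference, the integral quantities named above have standard (convention-dependent) definitions; the forms below are textbook sketches, not equations from the report:

```latex
\bar{\sigma}_{\mathrm{Maxw}}(kT)
  = \frac{\displaystyle\int_0^{\infty} \sigma(E)\, E\, e^{-E/kT}\, dE}
         {\displaystyle\int_0^{\infty} E\, e^{-E/kT}\, dE},
\qquad
I = \int_{E_c}^{\infty} \sigma(E)\, \frac{dE}{E},
```

where \(\sigma(E)\) is the pointwise cross section reconstructed from the resonance parameters, the resonance-integral cutoff \(E_c\) is commonly taken as 0.5 eV, and some conventions carry an extra \(2/\sqrt{\pi}\) (Westcott) factor in the Maxwellian average. Fitting these integrals alongside the differential data is what propagates the parameter covariances consistently.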

  9. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001.

  10. International Space Station Configuration Analysis and Integration

    Science.gov (United States)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  11. Integrability of dynamical systems algebra and analysis

    CERN Document Server

    Zhang, Xiang

    2017-01-01

    This is the first book to systematically state the fundamental theory of integrability and its development for ordinary differential equations, with emphasis on the Darboux theory of integrability and local integrability together with their applications. It summarizes the classical results of Darboux integrability and its modern development, together with the related Darboux polynomials and their applications in the reduction of Liouville and elementary integrability, in the center-focus problem, in the weakened Hilbert 16th problem on algebraic limit cycles, and in the global dynamical analysis of some realistic models in fields such as physics, mechanics and biology. Although it can be used as a textbook for graduate students in dynamical systems, it is intended as supplementary reading for graduate students from mathematics, physics, mechanics and engineering in courses related to the qualitative theory, bifurcation theory and the theory of integrability of dynamical systems.

  12. Strategic Analysis of Technology Integration at Allstream

    OpenAIRE

    Brown, Jeff

    2011-01-01

    Innovation has been defined as the combination of invention and commercialization. Invention without commercialization is rarely, if ever, profitable. For the purposes of this paper the definition of innovation will be further expanded into the concept of technology integration. Successful technology integration not only includes new technology introduction, but also the operationalization of the new technology within each business unit of the enterprise. This paper conducts an analysis of Al...

  13. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents methods for integrating the component parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function has three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside each product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product reflecting the relation function structure was actually created.

  14. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents methods for integrating the component parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function has three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside each product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product reflecting the relation function structure was actually created.

  15. An integrated acquisition, display, and analysis system

    International Nuclear Information System (INIS)

    Ahmad, T.; Huckins, R.J.

    1987-01-01

    The design goal of the ND9900/Genuie was to integrate a high performance data acquisition and display subsystem with a state-of-the-art 32-bit supermicrocomputer. This was achieved by integrating a Digital Equipment Corporation MicroVAX II CPU board with acquisition and display controllers via the Q-bus. The result is a tightly coupled processing and analysis system for Pulse Height Analysis and other applications. The system architecture supports distributed processing, so that acquisition and display functions are semi-autonomous, making the VAX concurrently available for application programs.

  16. Abel integral equations analysis and applications

    CERN Document Server

    Gorenflo, Rudolf

    1991-01-01

    In many fields of application of mathematics, progress is crucially dependent on the good flow of information between (i) theoretical mathematicians looking for applications, (ii) mathematicians working in applications in need of theory, and (iii) scientists and engineers applying mathematical models and methods. The intention of this book is to stimulate this flow of information. In the first three chapters (accessible to third year students of mathematics and physics and to mathematically interested engineers) applications of Abel integral equations are surveyed broadly including determination of potentials, stereology, seismic travel times, spectroscopy, optical fibres. In subsequent chapters (requiring some background in functional analysis) mapping properties of Abel integral operators and their relation to other integral transforms in various function spaces are investigated, questions of existence and uniqueness of solutions of linear and nonlinear Abel integral equations are treated, and for equatio...

  17. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... nated "Structure (n, m)" where (n, m) are the force and displacement degrees of ..... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  18. Integrating neural network technology and noise analysis

    International Nuclear Information System (INIS)

    Uhrig, R.E.; Oak Ridge National Lab., TN

    1995-01-01

    The integrated use of neural network and noise analysis technologies offers advantages not available from either technology alone. Applying neural network technology to noise analysis expands the scope of problems where noise analysis is useful, and there are unique ways in which the integration of these technologies can be used productively. The two-sensor technique, in which the responses of two sensors to an unknown driving source are related, is used to demonstrate such integration. The relationship between the power spectral densities (PSDs) of accelerometer signals is derived theoretically using noise analysis to demonstrate its uniqueness. This relationship is modeled from experimental data using a neural network when the system is working properly, and the actual PSD of one sensor is compared with the PSD of that sensor predicted by the neural network using the PSD of the other sensor as an input. A significant deviation between the actual and predicted PSDs indicates that the system is changing (i.e., failing). Experiments carried out on check valves and bearings illustrate the usefulness of the methodology developed. (Author)
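The two-sensor monitoring scheme described above can be sketched numerically. The fragment below is illustrative only: it stands in for the trained neural network with a simple per-frequency gain fitted on healthy data, and all signals, filter coefficients and the 0.3 threshold are invented for the demonstration.

```python
import numpy as np

def psd(x, nfft=256):
    # Welch-style PSD estimate: average windowed periodograms of segments.
    segs = x[: len(x) // nfft * nfft].reshape(-1, nfft)
    spec = np.abs(np.fft.rfft(segs * np.hanning(nfft), axis=1)) ** 2
    return spec.mean(axis=0)

rng = np.random.default_rng(0)
h_a, h_b = [1.0, 0.5, 0.25], [0.8, -0.3, 0.1]   # invented sensor responses

# "Healthy" training data: both sensors driven by the same unknown source.
src = rng.normal(size=200_000)
gain = psd(np.convolve(src, h_b, mode="same")) / psd(np.convolve(src, h_a, mode="same"))

def deviation(sig_a, sig_b):
    # Mean relative mismatch between the measured PSD of sensor B and the
    # PSD predicted from sensor A through the learned per-frequency gain.
    pred = gain * psd(sig_a)
    return float(np.mean(np.abs(psd(sig_b) - pred) / pred))

# Monitoring: new data from a healthy system vs. a degraded sensor-B path.
src2 = rng.normal(size=200_000)
a2 = np.convolve(src2, h_a, mode="same")
d_ok = deviation(a2, np.convolve(src2, h_b, mode="same"))        # stays small
d_bad = deviation(a2, np.convolve(src2, [0.8, 0.6, 0.1], mode="same"))
```

A real application would replace the gain table with a neural network trained on many healthy PSD pairs, but the failure signature is the same: the predicted and measured PSDs diverge when the transmission path changes.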

  19. [Integrated health care organizations: guideline for analysis].

    Science.gov (United States)

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHOs) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing an IHO's performance in relation to its final objectives of improving the efficiency and continuity of health care through an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of payment mechanisms for services); and internal elements (model of governance, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  20. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control-system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time-response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  1. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Abstract. Background: Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description: We have developed an Integrated Genotyping System (IGS) to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion: IGS can model complex phenotypes and contain genotypes from whole genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  2. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance

  3. Integrity Analysis of Damaged Steam Generator Tubes

    International Nuclear Information System (INIS)

    Stanic, D.

    1998-01-01

    The variety of degradation mechanisms affecting steam generator tubes makes steam generators one of the critical components in nuclear power plants. Depending on their nature, degradation mechanisms cause different types of damage. This requires extensive integrity analysis in order to assess crack behavior under operating and accident conditions. The development and application of advanced eddy current techniques for steam generator examination provide good characterization of the damage found. Damage characteristics (shape, orientation and dimensions) may be defined and used for further evaluation of the damage's influence on tube integrity. Alongside experimental and analytical methods, numerical methods are also efficient tools for integrity assessment. Finite element methods provide relatively simple modeling of different types of damage and simulation of various operating conditions. The stress and strain analysis may be performed for the elastic and elasto-plastic states, with good ability for visual presentation of results. Furthermore, fracture mechanics parameters may be calculated. Results obtained by numerical analysis, supplemented with experimental results, are the basis for the definition of alternative plugging criteria, which may significantly reduce the number of plugged tubes. (author)

  4. The integrated microbial genome resource of analysis.

    Science.gov (United States)

    Checcucci, Alice; Mengoni, Alessio

    2015-01-01

    Integrated Microbial Genomes and Metagenomes (IMG) is a biocomputational system that provides information and support for the annotation and comparative analysis of microbial genomes and metagenomes. IMG has been developed by the US Department of Energy (DOE) Joint Genome Institute (JGI). The IMG platform contains both draft and complete genomes, sequenced by the Joint Genome Institute, as well as other publicly available genomes. Genomes of strains belonging to the Archaea, Bacteria, and Eukarya domains are present, as well as those of viruses and plasmids. Here, we describe some essential features of the IMG system and a case study for pangenome analysis.

  5. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Abstract. Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for statistical analysis of genetic data compared with other fields of statistics. It is often a tremendous task for end-users to tailor such systems for particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R (http://www.r-project.org), a free, flexible and platform-independent environment for statistical modelling and graphics, is explored as an integrated system for genetic data analysis. An overview of some packages currently available for the analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available code, it is a feasible, although challenging, task to develop it into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.

  6. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate......, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect......) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts....

  7. Aeroheating Measurement of Apollo Shaped Capsule with Boundary Layer Trip in the Free-piston Shock Tunnel HIEST

    Science.gov (United States)

    Hideyuki, TANNO; Tomoyuki, KOMURO; Kazuo, SATO; Katsuhiro, ITOH; Lillard, Randolph P.; Olejniczak, Joseph

    2013-01-01

    An aeroheating measurement test campaign of an Apollo capsule model with laminar and turbulent boundary layers was performed in the free-piston shock tunnel HIEST at the JAXA Kakuda Space Center. A 250-mm-diameter, 6.4%-scale Apollo CM capsule model made of SUS-304 stainless steel was used in this study. To measure the heat flux distribution, the model was equipped with 88 miniature coaxial Chromel-Constantan thermocouples on its heat-shield surface. To promote boundary layer transition, a boundary layer trip insert with 13 "pizza-box" isolated roughness elements, each 1.27 mm square, was placed 17 mm below the model geometric center. Three boundary layer trip inserts with roughness heights of k = 0.3 mm, 0.6 mm and 0.8 mm were used to identify the appropriate height to induce transition. Heat flux records with and without roughness elements were obtained for a model angle of attack of 28 deg under stagnation enthalpies between H0 = 3.5 MJ/kg and 21 MJ/kg and stagnation pressures between P0 = 14 MPa and 60 MPa. Under these conditions, the Reynolds number based on the model diameter varied from 0.2 to 1.3 million. With roughness elements, the boundary layer became fully turbulent below the H0 = 9 MJ/kg condition. However, the boundary layer remained laminar above the H0 = 13 MJ/kg condition even with the highest roughness elements. An additional experiment was also performed to correct unexpected heat flux augmentation observed above the H0 = 9 MJ/kg condition.
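Coaxial surface thermocouples of this kind yield temperature histories, from which heat flux is commonly recovered by a one-dimensional semi-infinite inversion such as the piecewise-linear Cook-Felderman formula. The sketch below is a generic illustration of that reduction, not HIEST's actual data processing; the material property and flux values are invented order-of-magnitude placeholders.

```python
import numpy as np

def cook_felderman(t, temp, rho_c_k):
    # Piecewise-linear Cook-Felderman inversion: surface heat flux from the
    # surface-temperature history of a one-dimensional semi-infinite solid.
    q = np.zeros_like(temp)
    for n in range(1, len(t)):
        denom = np.sqrt(t[n] - t[1 : n + 1]) + np.sqrt(t[n] - t[:n])
        q[n] = 2.0 * np.sqrt(rho_c_k / np.pi) * np.sum((temp[1 : n + 1] - temp[:n]) / denom)
    return q

# Self-check: a constant flux q0 into a semi-infinite solid produces
# T(t) = 2*q0*sqrt(t / (pi*rho*c*k)); the inversion should recover q0.
rho_c_k = 3.9e6                      # rho*c*k product; illustrative placeholder
q0 = 5.0e5                           # W/m^2, invented for the check
t = np.linspace(0.0, 5e-3, 1001)     # 5 ms of test time
temp = 2.0 * q0 * np.sqrt(t / (np.pi * rho_c_k))

q = cook_felderman(t, temp, rho_c_k)
rel_err = abs(q[-1] - q0) / q0       # small discretization error expected
```

The recovered flux settles near q0 apart from discretization error concentrated at early times, which is why short-duration shock-tunnel data reduction is sensitive to the first few samples after shock arrival.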

  8. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing suggesting that the auditory scene is rapidly organized into distinct streams, and that the integration of sequential elements into perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  9. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among the process, safety systems, and operator actions. As independent TH codes do not model operator actions and the full set of safety systems, they cannot literally simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response, since all the accident elements are in the same model. The advantages of this framework also include realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis).
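    The stochastic bookkeeping half of a dynamic event tree can be sketched in a few lines. The branch points, outcomes, and probabilities below are invented for illustration; in a real DET each branch would also advance a coupled TH simulation:

```python
from itertools import product

# Each branch point: name -> list of (outcome, probability).
# Names and probabilities are illustrative assumptions, not from any plant model.
branch_points = {
    "reactor_trip": [("success", 0.999), ("failure", 0.001)],
    "aux_feedwater": [("success", 0.99), ("failure", 0.01)],
    "operator_depressurize": [("success", 0.95), ("failure", 0.05)],
}

def enumerate_sequences(branch_points):
    """Enumerate every accident sequence and its probability.

    Here we only do the stochastic bookkeeping; a DET would interleave
    the TH code between branchings."""
    names = list(branch_points)
    sequences = []
    for outcomes in product(*(branch_points[n] for n in names)):
        prob = 1.0
        labels = []
        for (label, p), name in zip(outcomes, names):
            prob *= p
            labels.append(f"{name}={label}")
        sequences.append((tuple(labels), prob))
    return sequences

seqs = enumerate_sequences(branch_points)
total = sum(p for _, p in seqs)
print(len(seqs), round(total, 12))  # 8 sequences; probabilities sum to 1.0
```

The exhaustive enumeration shows why, as the abstract notes, defining all combinations by hand becomes cumbersome: the sequence count grows exponentially with the number of branch points.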

  10. Integrating health and environmental impact analysis

    DEFF Research Database (Denmark)

    Reis, S; Morris, G.; Fleming, L. E.

    2015-01-01

    which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose...... while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding...... the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession...

  11. Advancing Alternative Analysis: Integration of Decision Science.

    Science.gov (United States)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: (a) engaging the systematic development and evaluation of decision approaches and tools; (b) using case studies to advance the integration of decision analysis into alternatives analysis; (c) supporting transdisciplinary research; and (d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  12. Structural integrity analysis of a steam turbine

    International Nuclear Information System (INIS)

    Villagarcia, Maria P.

    1997-01-01

    One of the most critical components of a power utility is the rotor of the steam turbine. Catastrophic failures in recent decades have promoted the development of life assessment procedures for rotors. The present study requires knowledge of the operating conditions, component geometry, material properties, component history, and the size, location and nature of existing flaws. The aim of the present work is to obtain a structural integrity analysis procedure for a steam turbine rotor, taking into account the above-mentioned parameters. In this procedure, a thermal stress analysis by finite elements is performed first, in order to obtain the temperature and stress distributions for a subsequent fracture mechanics analysis. The risk of fast fracture due to flaws in the central zone of the rotor is analyzed. The procedure is applied to an operating turbine: the main steam turbine of the Atucha I nuclear power plant. (author)

  13. A taxonomy of integral reaction path analysis

    Energy Technology Data Exchange (ETDEWEB)

    Grcar, Joseph F.; Day, Marcus S.; Bell, John B.

    2004-12-23

    W. C. Gardiner observed that achieving understanding through combustion modeling is limited by the ability to recognize the implications of what has been computed and to draw conclusions about the elementary steps underlying the reaction mechanism. This difficulty can be overcome in part by making better use of reaction path analysis in the context of multidimensional flame simulations. Following a survey of current practice, an integral reaction flux is formulated in terms of conserved scalars that can be calculated in a fully automated way. Conditional analyses are then introduced, and a taxonomy for bidirectional path analysis is explored. Many examples illustrate the resulting path analysis and uncover some new results about nonpremixed methane-air laminar jets.

  14. The ASDEX integrated data analysis system AIDA

    International Nuclear Information System (INIS)

    Grassie, K.; Gruber, O.; Kardaun, O.; Kaufmann, M.; Lackner, K.; Martin, P.; Mast, K.F.; McCarthy, P.J.; Mertens, V.; Pohl, D.; Rang, U.; Wunderlich, R.

    1989-11-01

    For about two years, the ASDEX integrated data analysis system (AIDA), which combines the database (DABA) and the statistical analysis system (SAS), has been successfully in operation. Besides a considerable, but meaningful, reduction of the 'raw' shot data, it offers the advantage of carefully selected and precisely defined datasets, which are easily accessible for informative tabular data overviews (DABA) and multi-shot analysis (SAS). Even rather complicated statistical analyses can be performed efficiently within this system. In this report, we summarise AIDA's main features and give some details on its set-up and on the physical models which have been used for the derivation of the processed data. We also give a short introduction on how to use DABA and SAS. (orig.)

  15. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure.
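    The periodic-orbit-finding step can be illustrated with Newton iteration on the n-times-iterated map. The sketch below uses the Chirikov standard map rather than the beamline/MXYZPTLK lattice described in the abstract; the map, parameter value, and starting guess are assumptions made purely to keep the example self-contained:

```python
import math

K = 0.9  # standard-map stochasticity parameter (illustrative value)

def std_map(x, p):
    """One iteration of the Chirikov standard map (no modulo, so Newton can run on it)."""
    p_new = p + K * math.sin(x)
    x_new = x + p_new
    return x_new, p_new

def map_n(x, p, n):
    for _ in range(n):
        x, p = std_map(x, p)
    return x, p

def find_periodic_orbit(x0, p0, period, tol=1e-12, h=1e-7):
    """Newton iteration on F(z) = M^n(z) - z with a finite-difference Jacobian."""
    x, p = x0, p0
    for _ in range(50):
        fx, fp = map_n(x, p, period)
        r1, r2 = fx - x, fp - p
        if abs(r1) + abs(r2) < tol:
            break
        # 2x2 finite-difference Jacobian of F
        fx1, fp1 = map_n(x + h, p, period)
        fx2, fp2 = map_n(x, p + h, period)
        a = (fx1 - fx) / h - 1.0; b = (fx2 - fx) / h
        c = (fp1 - fp) / h;       d = (fp2 - fp) / h - 1.0
        det = a * d - b * c
        # Newton step: solve J * dz = -r with Cramer's rule
        x -= (d * r1 - b * r2) / det
        p -= (a * r2 - c * r1) / det
    return x, p

x, p = find_periodic_orbit(3.0, 0.1, period=1)
print(f"x = {x:.6f}")  # converges to the fixed point x = pi, p = 0
```

The same Newton-on-the-iterated-map idea generalizes to higher-period orbits by raising `period`, which is presumably how a tracker locates, say, 20th-order orbits in a lattice map.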

  16. The Integral: A Crux for Analysis

    CERN Document Server

    Krantz, Steven G

    2011-01-01

    This book treats all of the most commonly used theories of the integral. After motivating the idea of integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had ca

  17. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; here, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solvers and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, one belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab
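    The domain decomposition idea can be illustrated on a toy problem. The sketch below runs multiplicative Schwarz with two overlapping subdomains on a 1D Poisson equation, each subdomain solved directly by the Thomas algorithm; it is a serial stand-in for what the parallel equation solvers do concurrently, with problem size and overlap chosen arbitrarily:

```python
def thomas(sub, diag, sup, rhs):
    """Direct tridiagonal solve (Thomas algorithm) for one subdomain."""
    n = len(rhs)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

n, h = 19, 1.0 / 20
f = [1.0] * n                      # -u'' = 1, u(0) = u(1) = 0
u = [0.0] * n                      # global iterate
dom1 = range(0, 12)                # two overlapping subdomains
dom2 = range(8, 19)                # (nodes 8..11 shared)

def solve_subdomain(nodes, u):
    k = len(nodes)
    sub = [-1.0] * k; diag = [2.0] * k; sup = [-1.0] * k
    rhs = [h * h * f[j] for j in nodes]
    lo, hi = nodes[0], nodes[-1]
    if lo > 0:      rhs[0]  += u[lo - 1]   # Dirichlet data from the neighbour
    if hi < n - 1:  rhs[-1] += u[hi + 1]
    return thomas(sub, diag, sup, rhs)

for _ in range(50):                # multiplicative Schwarz sweeps
    for dom in (dom1, dom2):
        local = solve_subdomain(dom, u)
        for idx, j in enumerate(dom):
            u[j] = local[idx]

exact = [x * (1 - x) / 2 for x in (h * (j + 1) for j in range(n))]
err = max(abs(a - b) for a, b in zip(u, exact))
print(f"max error = {err:.2e}")  # converges to the discrete solution
```

In a genuinely parallel run the two subdomain solves would be dispatched to separate processors; the interface data exchange sketched here is exactly where the communication cost of DDM-style solvers arises.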

  18. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    Parzyck, D.C.

    1980-01-01

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated on an evaluation of impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impact. It is planned that an integrated assessment team examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environment control cost/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined

  19. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
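    The transformation-selection step can be sketched with the classic Box-Cox profile likelihood. This is a simplified stand-in for the paper's model-based method: it uses a grid search rather than proper optimization, the data are invented, and the subsequent PCA step is omitted:

```python
import math

def boxcox(x, lam):
    """Box-Cox transform of a positive observation."""
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def profile_loglik(data, lam):
    """Profile log-likelihood of lambda under a normal model for the
    transformed data (mean and variance profiled out), including the
    Jacobian term of the transformation."""
    n = len(data)
    y = [boxcox(x, lam) for x in data]
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / n
    return -0.5 * n * math.log(var) + (lam - 1) * sum(math.log(x) for x in data)

# Skewed illustrative data (roughly log-normal in shape)
data = [0.5, 0.8, 1.1, 1.6, 2.4, 3.9, 6.1, 9.8, 15.2, 24.0]

# Grid search over lambda in [-2, 2] stands in for a real optimizer
lams = [i / 100 for i in range(-200, 201)]
best = max(lams, key=lambda lam: profile_loglik(data, lam))
print(f"lambda-hat = {best:.2f}")  # near 0 for multiplicative-looking data
```

Once a transformation is chosen this way, PCA would be applied to the transformed data; the paper's contribution is to estimate the transformation and the PCA model jointly rather than in two separate steps like this.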

  20. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  1. PHIDIAS- Pathogen Host Interaction Data Integration and Analysis

    Indian Academy of Sciences (India)

    PHIDIAS- Pathogen Host Interaction Data Integration and Analysis- allows searching of integrated genome sequences, conserved domains and gene expressions data related to pathogen host interactions in high priority agents for public health and security ...

  2. Migration in Deltas: An Integrated Analysis

    Science.gov (United States)

    Nicholls, Robert J.; Hutton, Craig W.; Lazar, Attila; Adger, W. Neil; Allan, Andrew; Arto, Inaki; Vincent, Katharine; Rahman, Munsur; Salehin, Mashfiqus; Sugata, Hazra; Ghosh, Tuhin; Codjoe, Sam; Appeaning-Addo, Kwasi

    2017-04-01

    Deltas and low-lying coastal regions have long been perceived as vulnerable to global sea-level rise, with the potential for mass displacement of exposed populations. The assumption of mass displacement of populations in deltas requires a comprehensive reassessment in the light of present and future migration in deltas, including the potential role of adaptation to influence these decisions. At present, deltas are subject to multiple drivers of environmental change and often have high population densities as they are accessible and productive ecosystems. Climate change, catchment management, subsidence and land cover change drive environmental change across all deltas. Populations in deltas are also highly mobile, with significant urbanization trends and the growth of large cities and mega-cities within or adjacent to deltas across Asia and Africa. Such migration is driven primarily by economic opportunity, yet environmental change in general, and climate change in particular, are likely to play an increasing direct and indirect role in future migration trends. The policy challenges centre on the role of migration within regional adaptation strategies to climate change; the protection of vulnerable populations; and the future of urban settlements within deltas. This paper reviews current knowledge on migration and adaptation to environmental change to discern specific issues pertinent to delta regions. It develops a new integrated methodology to assess present and future migration in deltas using the Volta delta in Ghana, Mahanadi delta in India and Ganges-Brahmaputra-Meghna delta across India and Bangladesh. The integrated method focuses on: biophysical changes and spatial distribution of vulnerability; demographic changes and migration decision-making using multiple methods and data; macro-economic trends and scenarios in the deltas; and the policies and governance structures that constrain and enable adaptation. The analysis is facilitated by a range of

  3. Integration of risk analysis, land use planning, and cost analysis

    International Nuclear Information System (INIS)

    Rajen, G.; Sanchez, G.

    1994-01-01

    The Department of Energy (DOE) and the Pueblo of San Ildefonso (Pueblo), which is a sovereign Indian tribe, have often been involved in adversarial situations regarding the Los Alamos National Laboratory (LANL). The Pueblo shares a common boundary with the LANL. This paper describes an on-going project that could alter the DOE and the Pueblo's relationship to one of cooperation; and unite the DOE and the Pueblo in a Pollution Prevention/Waste Minimization, and Integrated Risk Analysis and Land Use Planning effort

  4. IMP: Integrated method for power analysis

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    An integrated, easy to use, economical package of microcomputer programs has been developed which can be used by small hydro developers to evaluate potential sites for small scale hydroelectric plants in British Columbia. The programs enable evaluation of sites located far from the nearest stream gauging station, for which streamflow data are not available. For each of the province's 6 hydrologic regions, a streamflow record for one small watershed is provided in the data base. The program can then be used to generate synthetic streamflow records and to compare results obtained by the modelling procedure with the actual data. The program can also be used to explore the significance of modelling parameters and to develop a detailed appreciation for the accuracy which can be obtained under various circumstances. The components of the program are an atmospheric model of precipitation; a watershed model that will generate a continuous series of streamflow data, based on information from the atmospheric model; a flood frequency analysis system that uses site-specific topographic data plus information from the atmospheric model to generate a flood frequency curve; a hydroelectric power simulation program which determines daily energy output for a run-of-river or reservoir storage site based on selected generation facilities and the time series generated in the watershed model; and a graphic analysis package that provides direct visualization of data and modelling results. This report contains a description of the programs, a user guide, the theory behind the model, the modelling methodology, and results from a workshop that reviewed the program package. 32 refs., 16 figs., 18 tabs.
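    The hydroelectric power simulation component reduces, at its core, to the hydropower equation E = ρgQHηt applied to a streamflow series. A sketch with hypothetical flows, head, and efficiency (none of these values are from the IMP package):

```python
# Daily energy from a run-of-river plant: E = rho * g * Q * H * eta * t.
RHO, G = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)

def daily_energy_kwh(flow_m3s, head_m, efficiency):
    """Energy produced in one day (kWh) at a constant daily-average flow."""
    power_w = RHO * G * flow_m3s * head_m * efficiency
    return power_w * 24 / 1000.0   # W over 24 h -> kWh

daily_flows = [2.1, 2.4, 3.0, 2.8, 2.2]   # m^3/s, hypothetical series
head, eta = 25.0, 0.85                    # assumed site head and efficiency
total = sum(daily_energy_kwh(q, head, eta) for q in daily_flows)
print(f"{total:.0f} kWh over {len(daily_flows)} days")
```

In the actual package the daily flow series would come from the watershed model driven by the atmospheric model, and a reservoir-storage site would additionally track storage between days.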

  5. K West integrated water treatment system subproject safety analysis document

    International Nuclear Information System (INIS)

    SEMMENS, L.S.

    1999-01-01

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System

  6. K West integrated water treatment system subproject safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    SEMMENS, L.S.

    1999-02-24

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  7. Integrating health and environmental impact analysis.

    Science.gov (United States)

    Reis, S; Morris, G; Fleming, L E; Beck, S; Taylor, T; White, M; Depledge, M H; Steinle, S; Sabel, C E; Cowie, H; Hurley, F; Dick, J McP; Smith, R I; Austen, M

    2015-10-01

    Scientific investigations have progressively refined our understanding of the influence of the environment on human health, and the many adverse impacts that human activities exert on the environment, from the local to the planetary level. Nonetheless, throughout the modern public health era, health has been pursued as though our lives and lifestyles are disconnected from ecosystems and their component organisms. The inadequacy of the societal and public health response to obesity, health inequities, and especially global environmental and climate change now calls for an ecological approach which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose a new conceptual model, the ecosystems-enriched Drivers, Pressures, State, Exposure, Effects, Actions or 'eDPSEEA' model, to address this shortcoming. The model recognizes convergence between the concept of ecosystems services which provides a human health and well-being slant to the value of ecosystems while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession. It will require outreach to political and other stakeholders including a currently largely disengaged general public. 
The need for an effective and robust science-policy interface has

  8. Noise analysis of switched integrator preamplifiers

    International Nuclear Information System (INIS)

    Sun Hongbo; Li Yulan; Zhu Weibin

    2004-01-01

    The main noise sources of switched integrator preamplifiers are discussed, and their noise performance is presented based on combined PSpice simulations and experiments. Practical methods for reducing preamplifier noise in two different integrator modes are then provided. (authors)
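    One noise contribution that is commonly significant in switched integrator circuits, the kT/C (reset) noise sampled onto the integration capacitor, is easy to quantify. A sketch with illustrative capacitor values (the abstract does not enumerate its noise sources, and the paper's PSpice models are not reproduced here):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def ktc_noise_voltage(c_farads, temp_k=300.0):
    """RMS reset (kT/C) noise voltage left on the integration capacitor
    each time the reset switch opens: v_n = sqrt(kT/C)."""
    return math.sqrt(K_B * temp_k / c_farads)

for c_pf in (1.0, 10.0, 100.0):
    vn = ktc_noise_voltage(c_pf * 1e-12)
    print(f"C = {c_pf:5.1f} pF -> v_n = {vn * 1e6:6.1f} uV rms")
```

The trade-off is visible directly: a larger integration capacitor lowers the reset noise voltage (by sqrt(C)) but also lowers the charge-to-voltage gain of the integrator.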

  9. Social Ecological Model Analysis for ICT Integration

    Science.gov (United States)

    Zagami, Jason

    2013-01-01

    ICT integration of teacher preparation programmes was undertaken by the Australian Teaching Teachers for the Future (TTF) project in all 39 Australian teacher education institutions and highlighted the need for guidelines to inform systemic ICT integration approaches. A Social Ecological Model (SEM) was used to positively inform integration…

  10. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of

  11. CFD Analysis for Advanced Integrated Head Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Won Ho; Kang, Tae Kyo; Cho, Yeon Ho; Kim, Hyun Min [KEPCO Engineering and Construction Co., Daejeon (Korea, Republic of)

    2016-10-15

    The Integrated Head Assembly (IHA) is permanently installed on the reactor vessel closure head during normal plant operation and refueling. It consists of a number of systems and components such as the head lifting system, seismic support system, Control Element Drive Mechanism (CEDM) cooling system, cable support system, and cooling shroud assemblies. Based on operating experience with the IHA, the need arose to change the current APR1400 IHA design to improve its seismic resistance and to allow convenient maintenance. In this paper, the effects of the design changes were rigorously studied for various sizes of the inlet openings to assure proper cooling of the CEDMs. The system pressure differentials and required flow rate for the CEDM cooling fan were also analyzed over various operating conditions to determine the capacity of the fan. As a part of the design process of the AIHA, the number of air inlets and baffle regions was reduced by simplifying the design of the APR1400 IHA. The design change of the baffle regions was made such that the maximum possible space is occupied inside the IHA cooling shroud shell while avoiding interference with the CEDMs. Thus, only the air inlet opening was studied for the design change to supply sufficient cooling air flow for each CEDM. The size and location of the air inlets in the middle cooling shroud assembly were determined by CFD analyses of the AIHA, and case CFD analyses were performed for different ambient air temperatures and fan operating conditions. The size of the air inlet openings is increased in comparison with the initial AIHA design, and it is confirmed that the cooling air flow rate for each CEDM meets the design requirement of 800 SCFM ± 10% with the increased air inlets.
    In the initial analysis, the fan outlet flow rate was assumed to be 48.3 lbm/s, but the results revealed that a lower fan outlet flow rate is enough to meet the design requirement

  12. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  13. Game analysis of product-service integration

    Directory of Open Access Journals (Sweden)

    Heping Zhong

    2014-10-01

    Purpose: This paper aims to define the value creation mechanism and income distribution strategies of product-service integration in order to promote product-service integration within a firm. Design/methodology/approach: This paper quantitatively investigates the coordination mechanism of product-service integration using game theory, and uses the Shapley value and Equal growth rate methods to further discuss income distribution strategies of product-service integration. Findings: Product-service integration increases the total income of a firm, and the added income decreases as the unit price demand variation coefficient of products and services increases, as the marginal cost of products increases, and as the marginal cost of services increases. Moreover, the findings suggest that income distribution strategies of product-service integration based on either the Shapley value method or the Equal growth rate method can make the product department and service department of a firm both better off and realize a Pareto improvement. The choice of distribution strategy to coordinate the actions between departments depends on which department plays the dominant role in the firm. Generally speaking, for a firm at the center of the market, when the product department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the Shapley value method; when the service department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the Equal growth rate method. Research limitations/implications: This paper makes some strict assumptions, such as complete information, risk neutrality, a linear cost function and so on, and the discussion is limited to the simple relationship between the product department and the service department. Practical implications: Product
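    The Shapley value allocation mentioned above can be computed directly from a characteristic function. The two-department game below is invented for illustration: the stand-alone incomes and the synergy created by integration are assumptions, not the paper's model:

```python
from itertools import permutations

def shapley_values(players, v):
    """Shapley value via averaging marginal contributions over all orderings."""
    values = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            values[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: val / len(perms) for p, val in values.items()}

# Illustrative characteristic function: each department alone earns its
# stand-alone income; integration creates 30 extra units of joint value.
def v(coalition):
    income = {frozenset(): 0.0,
              frozenset({"product"}): 100.0,
              frozenset({"service"}): 60.0,
              frozenset({"product", "service"}): 190.0}
    return income[frozenset(coalition)]

sv = shapley_values(["product", "service"], v)
print(sv)  # {'product': 115.0, 'service': 75.0}
```

With two players the Shapley value simply gives each department its stand-alone income plus half the synergy, which is why both departments are better off than without integration, the Pareto improvement the abstract refers to.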

  14. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

    Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Submitted to the National Institute of Aerospace (NIA). Research report, 20161001 - 20161030. W911NF-16-2-0229. Cedric Justin, Youngjun

  15. Integrating fire management analysis into land management planning

    Science.gov (United States)

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models is required. The design of four levels of fire...

  16. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  17. Semantic web for integrated network analysis in biomedicine.

    Science.gov (United States)

    Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y

    2009-03-01

    The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing the Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy analysis and herb-drug interactions analysis.

  18. Integrated risk analysis of global climate change

    International Nuclear Information System (INIS)

    Shlyakhter, Alexander; Wilson, Richard; Valverde A, L.J. Jr.

    1995-01-01

    This paper discusses several factors that should be considered in integrated risk analyses of global climate change. We begin by describing how the problem of global climate change can be subdivided into largely independent parts that can be linked together in an analytically tractable fashion. Uncertainty plays a central role in integrated risk analyses of global climate change. Accordingly, we consider various aspects of uncertainty as they relate to the climate change problem. We also consider the impacts of these uncertainties on various risk management issues, such as sequential decision strategies, value of information, and problems of interregional and intergenerational equity. (author)

  19. Integrated watershed analysis: adapting to changing times

    Science.gov (United States)

    Gordon H. Reeves

    2013-01-01

    Resource managers are increasingly required to conduct integrated analyses of aquatic and terrestrial ecosystems before undertaking any activities. There are a number of research studies on the impacts of management actions on these ecosystems, as well as a growing body of knowledge about ecological processes that affect them, particularly aquatic ecosystems, which...

  20. Relay Feedback Analysis for Double Integral Plants

    Directory of Open Access Journals (Sweden)

    Zhen Ye

    2011-01-01

    Double integral plants under relay feedback are studied. Complete results on the uniqueness of solutions, existence, and stability of the limit cycles are established using the point transformation method. Analytical expressions are also given for determining the amplitude and period of a limit cycle from the plant parameters.
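
As a rough illustration of such a limit cycle, the sketch below simulates a double integrator y'' = u under an ideal relay u = -d·sign(y) and compares the observed oscillation period with the analytic value T = 4·sqrt(2a/d) for amplitude a. The parameter values are assumptions for illustration; this is a simple time-stepping check, not the paper's point-transformation analysis:

```python
import math

def simulate_relay_double_integrator(d=1.0, y0=1.0, dt=1e-4, t_end=12.0):
    """Double integrator y'' = -d*sign(y) under ideal relay feedback,
    integrated with semi-implicit Euler; records downward zero crossings."""
    y, v, t = y0, 0.0, 0.0
    down_crossings = []
    while t < t_end:
        u = -d if y > 0 else d           # ideal relay output
        v += u * dt                      # update velocity first (symplectic)
        y_new = y + v * dt
        if y > 0 >= y_new:               # downward zero crossing of y
            down_crossings.append(t)
        y = y_new
        t += dt
    return down_crossings

cross = simulate_relay_double_integrator()
period = cross[1] - cross[0]             # downward crossings are one period apart
analytic = 4 * math.sqrt(2 * 1.0 / 1.0)  # T = 4*sqrt(2*a/d) for amplitude a = 1
print(period, analytic)
```

The simulated period agrees with the analytic expression to well under one percent at this step size.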

  1. An integrated algorithm for hypersonic fluid-thermal-structural numerical simulation

    Science.gov (United States)

    Li, Jia-Wei; Wang, Jiang-Feng

    2018-05-01

    In this paper, a fluid-thermal-structural integrated method based on the finite volume method is presented. A unified system of integral equations is developed as the governing equations for the physical processes of aeroheating and structural heat transfer. The whole physical field is discretized using an upwind finite volume method. To demonstrate the method's capability, a numerical simulation of Mach 6.47 flow over a stainless steel cylinder shows good agreement with measured values, and the method dynamically simulates the coupled physical processes. Thus, the integrated algorithm proves to be efficient and reliable.

  2. Integration of video and radiation analysis data

    International Nuclear Information System (INIS)

    Menlove, H.O.; Howell, J.A.; Rodriguez, C.A.; Eccleston, G.W.; Beddingfield, D.; Smith, J.E.; Baumgart, C.W.

    1995-01-01

    For the past several years, the integration of containment and surveillance (C/S) with nondestructive assay (NDA) sensors for monitoring the movement of nuclear material has focused on the hardware and communications protocols in the transmission network. Little progress has been made in methods to utilize the combined C/S and NDA data for safeguards and to reduce the inspector time spent in nuclear facilities. One of the fundamental problems in the integration of the combined data is that the two methods operate in different dimensions: the C/S video data is spatial in nature, whereas the NDA sensors provide radiation levels versus time. The authors have introduced a new method to integrate spatial (digital video) information with time-based (radiation monitoring) information. This technology is based on pattern recognition by neural networks, provides significant capability to analyze complex data, and has the ability to learn and adapt to changing situations. This technique has the potential of significantly reducing the frequency of inspection visits to key facilities without a loss of safeguards effectiveness.

  3. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

    A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables, and (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  4. Analysis of Optimal Operation of an Energy Integrated Distillation Plant

    DEFF Research Database (Denmark)

    Li, Hong Wen; Hansen, C.A.; Gani, Rafiqul

    2003-01-01

    The efficiency of manufacturing systems can be significantly increased through diligent application of control based on mathematical models, thereby enabling tighter integration of decision making with systems operation. In the present paper, analysis of optimal operation of an energy integrated...

  5. Integrated program of using of Probabilistic Safety Analysis in Spain

    International Nuclear Information System (INIS)

    1998-01-01

    Since 25 June 1986, when the CSN (Nuclear Safety Council) approved the Integrated Program of Probabilistic Safety Analysis, this program has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Programme.

  6. Analysis of integrated video and radiation data

    International Nuclear Information System (INIS)

    Howell, J.A.; Menlove, H.O.; Rodriguez, C.A.; Beddingfield, D.; Vasil, A.

    1995-01-01

    We have developed prototype software for a facility-monitoring application that will detect anomalous activity in a nuclear facility. The software, which forms the basis of a simple model, automatically reviews and analyzes integrated safeguards data from continuous unattended monitoring systems. This technology, based on pattern recognition by neural networks, provides significant capability to analyze complex data and has the ability to learn and adapt to changing situations. It is well suited for large automated facilities, reactors, spent-fuel storage facilities, reprocessing plants, and nuclear material storage vaults

  7. Signal integrity analysis on discontinuous microstrip line

    International Nuclear Information System (INIS)

    Qiao, Qingyang; Dai, Yawen; Chen, Zipeng

    2013-01-01

    In high speed PCB design, microstrip lines are used to control impedance; however, discontinuous microstrip lines can cause signal integrity problems. In this paper, we use transmission line theory to study the characteristics of microstrip lines. Research results indicate that discontinuities such as truncation, gaps and size changes result in problems such as radiation, reflection, delay and ground bounce. We model the discontinuities as distributed-parameter circuits and analyse the steady-state response, transient response and phase delay. The transient response causes radiation and voltage jumps.
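
A minimal transmission-line sketch of why an impedance step causes reflections: the voltage reflection coefficient at a discontinuity is Γ = (Z2 − Z1)/(Z2 + Z1). The 50 Ω and 75 Ω values below correspond to a hypothetical microstrip width change and are not taken from the paper:

```python
def reflection_coefficient(z1, z2):
    """Voltage reflection coefficient at a step from impedance z1 to z2."""
    return (z2 - z1) / (z2 + z1)

# A width change on a microstrip shifts its characteristic impedance;
# the 50-to-75 ohm step here is illustrative only.
gamma = reflection_coefficient(50.0, 75.0)
vswr = (1 + abs(gamma)) / (1 - abs(gamma))   # standing-wave ratio on the line
print(gamma, vswr)  # 20% of the incident voltage is reflected; VSWR = 1.5
```

A matched line (z1 == z2) gives Γ = 0 and VSWR = 1, i.e. no reflection at the junction.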

  8. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  9. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Introduction: Integrated care can not only fix fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how 'integrated care' has developed. There is a need for a systematic bibliometric analysis of the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy '(delivery of health care, integrated [MeSH Terms]) OR integrated care [Title/Abstract]' without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis with the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering in SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) the documents appeared in 1646 different journals, of which 28 were core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains: the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. The International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provide researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.

  10. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  11. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T-test results showed that Prosopis africana seed trade is profitable and ...

  12. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi; Huang, Jianhua Z.; Hu, Jianhua

    2015-01-01

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior
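
The baseline that such work builds on — transforming skewed data before a standard PCA, rather than integrating the transformation into the PCA criterion itself as this record proposes — can be sketched as follows. The synthetic log-normal data and the use of numpy are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Skewed synthetic data: log-normal, so a log transform symmetrizes it.
x = rng.lognormal(mean=0.0, sigma=1.0, size=(200, 5))

def pca(data, k=2):
    """Project centered data onto its top-k principal components."""
    centered = data - data.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
    top = eigvecs[:, ::-1][:, :k]               # top-k eigenvectors
    return centered @ top, eigvals[::-1][:k]

scores_raw, var_raw = pca(x)                    # PCA on the raw, skewed data
scores_log, var_log = pca(np.log(x))            # transform applied before PCA
print(scores_log.shape)  # (200, 2)
```

Integrating the transformation into the PCA fit, as the record describes, replaces this fixed two-step pipeline with a jointly estimated one.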

  13. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

    A cost benefit analysis of plant wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility to develop a focused long term implementation strategy that will yield significant near term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and yields accurate, verifiable results. Included in this paper is a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied. Their uses of the cost-benefit analysis are also described

  14. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is required. This paper presents the Data Extraction, Recording and Analysis Tool (DERAT) for analyzing the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  15. Integrated systems analysis of the PIUS reactor

    Energy Technology Data Exchange (ETDEWEB)

    Fullwood, F.; Kroeger, P.; Higgins, J. [Brookhaven National Lab., Upton, NY (United States)] [and others]

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP) to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions.

  16. Integrated systems analysis of the PIUS reactor

    International Nuclear Information System (INIS)

    Fullwood, F.; Kroeger, P.; Higgins, J.

    1993-11-01

    Results are presented of a systems failure analysis of the PIUS plant systems that are used during normal reactor operation and postulated accidents. This study was performed to provide the NRC with an understanding of the behavior of the plant. The study applied two diverse failure identification methods, Failure Modes Effects & Criticality Analysis (FMECA) and Hazards & Operability (HAZOP), to the plant systems, supported by several deterministic analyses. Conventional PRA methods were also used along with a scheme for classifying events by initiator frequency and combinations of failures. Principal results of this study are: (a) an extensive listing of potential event sequences, grouped in categories that can be used by the NRC, (b) identification of support systems that are important to safety, and (c) identification of key operator actions

  17. Integrative data analysis of male reproductive disorders

    DEFF Research Database (Denmark)

    Edsgard, Stefan Daniel

    The aim of this thesis is the identification of the molecular basis of male reproductive disorders, with a special focus on testicular cancer. To this end, clinical samples were characterized by microarray-based transcription and genomic variation assays, and molecular entities were identified by computational analysis of such data in conjunction with data from publicly available repositories. This thesis presents an introduction to disease genetics and molecular systems biology, followed by four studies that each provide detailed clues to the etiology of male reproductive disorders. Finally, a fifth study illustrates... We analysed genome-wide association data with respect to copy number variation and show that the aggregated effect of rare variants can influence the risk for testicular cancer. Paper V provides an example of the application of RNA-Seq for expression analysis of a species with an unsequenced genome. We analysed the plant...

  18. An integrated platform for biomolecule interaction analysis

    Science.gov (United States)

    Jan, Chia-Ming; Tsai, Pei-I.; Chou, Shin-Ting; Lee, Shu-Sheng; Lee, Chih-Kung

    2013-02-01

    We developed a new metrology platform which can detect real-time changes in both a phase-interrogation mode and an intensity mode of SPR (surface plasmon resonance). We integrated an SPR and an ellipsometer into a biosensor chip platform to create a new biomolecular interaction measurement mechanism. We applied a conductive ITO (indium tin oxide) film to the biosensor platform chip to expand the dynamic range and improve measurement accuracy. Suitable values of the conductive film thickness and the applied voltage were found to enhance performance. A circularly polarized ellipsometry configuration was incorporated into the newly developed platform to measure the label-free interactions of recombinant human C-reactive protein (CRP) with an immobilized biomolecule target, monoclonal human CRP antibody, at various concentrations. CRP was chosen because it is a cardiovascular risk biomarker and an acute phase reactant, as well as a specific prognostic indicator for inflammation. We found that the sensitivity of phase-interrogation SPR is predominantly dependent on the optimization of the sample incidence angle. The effect of the ITO layer's effective index under DC and AC operation, as well as an optimal modulation, were experimentally examined and discussed. Our experimental results showed that the modulated dynamic range for phase detection was 10E-2 RIU based on the current effect and 10E-4 RIU based on the potential effect, while a sensitivity of 0.55 (°/RIU) was found by angular interrogation. The newly developed metrology platform was characterized as having higher sensitivity and a smaller dynamic range when compared to a traditional full-field measurement system.

  19. An analysis of 3D particle path integration algorithms

    International Nuclear Information System (INIS)

    Darmofal, D.L.; Haimes, R.

    1996-01-01

    Several techniques for the numerical integration of particle paths in steady and unsteady vector (velocity) fields are analyzed. Most of the analysis applies to unsteady vector fields; however, some results apply to steady vector field integration. Multistep, multistage, and some hybrid schemes are considered. It is shown that due to initialization errors, many unsteady particle path integration schemes are limited to third-order accuracy in time. Multistage schemes require at least three times more internal data storage than multistep schemes of equal order. However, for timesteps within the stability bounds, multistage schemes are generally more accurate. A linearized analysis shows that the stability of these integration algorithms is determined by the eigenvalues of the local velocity tensor. Thus, the accuracy and stability of the methods are interpreted with concepts typically used in critical point theory. This paper shows how integration schemes can lead to erroneous classification of critical points when the timestep is finite and fixed. For steady velocity fields, we demonstrate that timesteps outside of the relative stability region can lead to similar integration errors. From this analysis, guidelines for accurate timestep sizing are suggested for both steady and unsteady flows. In particular, using simulation data for the unsteady flow around a tapered cylinder, we show that accurate particle path integration requires timesteps which are at most on the order of the physical timescale of the flow
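
The misclassification of critical points described above is easy to reproduce: in the steady circular field v = (-y, x), which has a center-type critical point at the origin, a forward Euler particle trace spirals outward (making the center look like a repelling focus), while a fourth-order Runge-Kutta multistage trace stays on the streamline. This is a generic sketch, not the paper's test case:

```python
import math

def velocity(p):
    """Steady 2D vector field with a center-type critical point at the origin."""
    x, y = p
    return (-y, x)

def rk4_step(p, dt):
    """One classical 4th-order Runge-Kutta (multistage) step."""
    def add(a, b, s):
        return (a[0] + s * b[0], a[1] + s * b[1])
    k1 = velocity(p)
    k2 = velocity(add(p, k1, dt / 2))
    k3 = velocity(add(p, k2, dt / 2))
    k4 = velocity(add(p, k3, dt))
    return (p[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            p[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def euler_step(p, dt):
    """One forward Euler step, for comparison."""
    vx, vy = velocity(p)
    return (p[0] + dt * vx, p[1] + dt * vy)

dt, steps = 0.01, int(2 * math.pi / 0.01)    # roughly one full revolution
p_rk4 = p_euler = (1.0, 0.0)                 # start on the unit circle
for _ in range(steps):
    p_rk4 = rk4_step(p_rk4, dt)
    p_euler = euler_step(p_euler, dt)
print(math.hypot(*p_rk4), math.hypot(*p_euler))  # RK4 stays near radius 1
```

After one revolution the Euler radius has grown by a few percent, so its path classifies the center as unstable; the RK4 path remains on the streamline to high accuracy.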

  20. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors

  1. Integrative Genomic Analysis of Complex traits

    DEFF Research Database (Denmark)

    Ehsani, Ali Reza

    In the last decade, rapid development in biotechnologies has made it possible to extract extensive information about practically all levels of biological organization. An ever-increasing number of studies are reporting multilayered datasets on the entire DNA sequence, transcription, protein expression, and metabolite abundance of more and more populations in a multitude of environments. However, a solid model for including all of this complex information in one analysis, to disentangle genetic variation and the underlying genetic architecture of complex traits and diseases, has not yet been...

  2. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative input for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and a demonstration of how to evaluate and understand the worth of research and development, both to JPL and to other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profit from its present-day sales of photovoltaic equipment.

  3. Integrative analysis of metabolomics and transcriptomics data

    DEFF Research Database (Denmark)

    Brink-Jensen, Kasper; Bak, Søren; Jørgensen, Kirsten

    2013-01-01

    The abundance of high-dimensional measurements in the form of gene expression and mass spectroscopy calls for models to elucidate the underlying biological system. For widely studied organisms like yeast, it is possible to incorporate prior knowledge from a variety of databases, an approach used ... (LC-MS) measurements from the same samples, to identify genes controlling the production of metabolites. Due to the high dimensionality of both LC-MS and DNA microarray data, dimension reduction and variable selection are key elements of the analysis. Our proposed approach starts by identifying the basis functions ("building blocks") that constitute the output from a mass spectrometry experiment. Subsequently, the weights of these basis functions are related to the observations from the corresponding gene expression data in order to identify which genes are associated with specific patterns seen in the metabolite data...

  4. Vertically Integrated Seismological Analysis II : Inference

    Science.gov (United States)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / [π(x)q(x′ | x)]). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for
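
The M-H acceptance rule above can be sketched with a minimal random-walk sampler on a toy one-dimensional target (a standard normal standing in for π(x)); with a symmetric proposal, the q terms in the acceptance ratio cancel. This is purely illustrative and is not the seismic event model:

```python
import math
import random

random.seed(42)

def log_target(x):
    """Log-density of a toy posterior: standard normal stand-in for pi(x)."""
    return -0.5 * x * x

def metropolis_hastings(n_samples, step=1.0):
    """Random-walk M-H: symmetric proposal, so q(x|x')/q(x'|x) = 1."""
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(random.random()) < log_alpha:  # accept w.p. min(1, alpha)
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # near 0 and 1 for a standard normal target
```

The birth/death/split/merge moves in the record are trans-dimensional generalizations of the same accept/reject rule, where the q terms no longer cancel.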

  5. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

    Inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. The performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes, as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal-hydraulic separate effect tests and integral effect tests was developed for thermal-hydraulic model development and system code validation. Safety characteristics as well as safety issues of the integral reactor have been identified during the study, which will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back into the SMART conceptual design. The performance and safety analysis code systems, as well as the preliminary safety analysis methodology developed in this study, will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  6. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, and social and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  7. A network analysis of leadership theory : the infancy of integration.

    OpenAIRE

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  8. Integrated analysis of oxide nuclear fuel sintering

    International Nuclear Information System (INIS)

    Baranov, V.; Kuzmin, R.; Tenishev, A.; Timoshin, I.; Khlunov, A.; Ivanov, A.; Petrov, I.

    2011-01-01

    Dilatometric and thermal-gravimetric investigations have been carried out for the sintering process of oxide nuclear fuel in a gaseous Ar - 8% H2 atmosphere at temperatures up to 1600 °C. The pressed compacts were fabricated under real production conditions of the OAO MSZ with application of two different technologies, the so-called 'dry' and 'wet' technologies. Grain size growth after heating to different temperatures was observed. In order to investigate the effects produced by the rate of heating on the properties of sintered fuel pellets, the heating rates were varied from 1 to 8 °C per minute. The time of isothermal hold at the maximal temperature (1600 °C) was about 8 hours. Real production conditions were imitated. The results showed that the sintering process differs between fuel pellets produced by the two technologies. The samples sintered under different heating rates were studied by scanning electron microscopy to determine the mean grain size. A simulation of the heating profile for industrial furnaces was performed to reduce the beam cycles and estimate the effects of variation of the isothermal hold temperatures. Based on these data, an optimization of the sintering conditions was performed in the operations of OAO MSZ. (authors)

  9. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi eFukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, combined with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  10. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed and tested in the MINET computer code. The method was developed to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer, who integrated the momentum equation over several linked nodes, called a segment, and used a segment-average pressure evaluated from the pressures at both ends. Nodal mass and energy conservation determine nodal flows and enthalpies, accounting for fluid compression and thermal expansion.
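    The segment-average momentum integration described in this record can be sketched in a few lines. This is a minimal illustration, not MINET itself: the node lengths, flow areas, friction coefficient, density, and end pressures are invented for the example.

```python
# One integrated momentum equation per multi-node segment, with the
# end-pressure difference driving a single segment mass flow W.
# All geometry and property values below are illustrative assumptions.

def segment_flow_derivative(W, p_in, p_out, lengths, areas, k_fric, rho):
    """dW/dt (kg/s per s) for a segment: momentum integrated over linked nodes."""
    inertia = sum(L / A for L, A in zip(lengths, areas))  # sum of L_i/A_i, 1/m
    friction = k_fric * W * abs(W) / rho                  # lumped loss, Pa
    return (p_in - p_out - friction) / inertia

# Explicit Euler march of the segment flow toward its steady state
W, dt = 10.0, 1.0e-4
for _ in range(100_000):
    W += dt * segment_flow_derivative(W, p_in=5.0e5, p_out=4.0e5,
                                      lengths=[2.0, 3.0, 1.5],
                                      areas=[0.01, 0.012, 0.01],
                                      k_fric=800.0, rho=750.0)
# W settles where the 1.0e5 Pa drop balances friction (about 306 kg/s)
```

The point of the segment formulation is that only one momentum unknown is carried per segment, while mass and energy balances remain nodal.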

  11. Study on integrated design and analysis platform of NPP

    International Nuclear Information System (INIS)

    Lu Dongsen; Gao Zuying; Zhou Zhiwei

    2001-01-01

    Much calculation software has been developed for nuclear system design and safety analysis, such as structural design software, fuel design and management software, thermal-hydraulic analysis software, and severe accident simulation software. This study integrates this software into a single platform and develops visual modeling tools for Retran and NGFM90. The platform also provides a distributed calculation method for coupled calculations between different codes. The study will improve the design and analysis of NPPs

  12. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an overall framework for estimating the failure probability of pipelines based on: the results of deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected-zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via a Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach to estimating the integrated failure probability combines several different analyses, yielding: the critical crack length and depth, the failure probability of the defected-zone thickness, and the dependence of the failure probability on the age of the natural gas transmission pipeline. A model uncertainty analysis and an uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by a Bayesian method.
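    The Bayesian step described in this record can be sketched with a conjugate Gamma-Poisson update: a failure rate suggested by the structural analysis serves as the prior and is updated with observed failure counts. All numbers below are illustrative assumptions, not the Lithuanian network data.

```python
# Conjugate Gamma-Poisson update of a pipeline failure rate (per km-year).
# Prior parameters and observations are invented for the example.

def gamma_poisson_update(alpha, beta, failures, exposure):
    """Posterior Gamma parameters for a failure rate, given
    Poisson-distributed failure counts over an exposure (km-years)."""
    return alpha + failures, beta + exposure

# Prior centered on 1e-4 failures per km-year (mean = alpha / beta)
alpha0, beta0 = 2.0, 2.0e4
# Observed: 3 failures over 5000 km-years of pipeline operation
alpha1, beta1 = gamma_poisson_update(alpha0, beta0, failures=3, exposure=5.0e3)
posterior_mean = alpha1 / beta1   # updated rate estimate: 5 / 25000 = 2e-4
```

The posterior blends the analysis-based prior with the service record, which is the sense in which the failure data are "involved into the model" here.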

  13. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations

  14. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  15. Integrative sparse principal component analysis of gene expression data.

    Science.gov (United States)

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multi-dataset techniques, and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
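    The underlying idea of sparse loadings can be sketched without the grouped, contrasted penalties of iSPCA itself: a single sparse loading vector estimated by power iteration with L1 soft-thresholding. The data sizes and penalty level below are illustrative assumptions.

```python
import numpy as np

# Sparse first principal loading via alternating updates with
# soft-thresholding (a common SPCA heuristic, not the iSPCA algorithm).

def sparse_loading(X, lam, n_iter=200):
    """First sparse principal loading of X (samples x genes)."""
    v = np.linalg.svd(X, full_matrices=False)[2][0]   # dense PCA start
    for _ in range(n_iter):
        u = X @ v
        u /= np.linalg.norm(u)
        w = X.T @ u
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)  # soft-threshold
        nrm = np.linalg.norm(w)
        if nrm == 0.0:
            break
        v = w / nrm
    return v

# Toy expression data: 20 samples, 10 genes, signal only in genes 0-2
rng = np.random.default_rng(1)
scores = rng.standard_normal((20, 1))
X = np.hstack([scores @ np.ones((1, 3)), 0.1 * rng.standard_normal((20, 7))])
v = sparse_loading(X, lam=1.0)   # loadings concentrate on the 3 signal genes
```

The L1 penalty zeroes out loadings of noise genes, which is what makes the component interpretable; iSPCA adds group and contrasted penalties on top of this idea to share structure across datasets.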

  16. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring Systems (TIMS) are an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to communication and positioning failures. To guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors (in the GNSS communication process and the on-board communication process). Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
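    The kind of model PRISM checks can be illustrated with a toy discrete-time Markov chain: each time slot a GNSS position report is received with some probability, and integrity detection is declared failed after k consecutive lost reports. The chain structure, reception probability, threshold, and slot count are all invented for this sketch.

```python
# Toy DTMC: state i = current run of consecutive lost reports;
# state k is an absorbing "detection failure" state.
# All parameters are illustrative assumptions, not the paper's model.

def prob_detection_failure(p_recv, k, n_slots):
    """Probability of ever reaching k consecutive lost reports in n slots."""
    dist = [1.0] + [0.0] * k
    for _ in range(n_slots):
        nxt = [0.0] * (k + 1)
        for i in range(k):
            nxt[0] += dist[i] * p_recv              # received: run resets
            nxt[i + 1] += dist[i] * (1.0 - p_recv)  # lost: run grows
        nxt[k] += dist[k]                           # absorbing failure state
        dist = nxt
    return dist[k]

# One report per second, 1% loss probability, one hour of operation
risk = prob_detection_failure(p_recv=0.99, k=3, n_slots=3600)
```

PRISM performs this style of transient probability computation symbolically from a model description; the hand-rolled iteration above just makes the underlying arithmetic visible.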

  17. A Collaborative Analysis Tool for Integrated Hypersonic Aerodynamics, Thermal Protection Systems, and RBCC Engine Performance for Single Stage to Orbit Vehicles

    Science.gov (United States)

    Stanley, Thomas Troy; Alexander, Reginald; Landrum, Brian

    2000-01-01

    engine model. HYFIM performs the aerodynamic analysis of forebodies and inlet characteristics of RBCC-powered SSTO launch vehicles. HYFIM is applicable to the analysis of the ramjet/scramjet engine operating modes (Mach 3-12), and provides estimates of parameters such as air capture area, shock-on-lip Mach number, design Mach number, compression ratio, etc., based on a basic geometry routine for modeling axisymmetric cone and 2-D wedge geometries. HYFIM also estimates the variation of shock-layer properties normal to the forebody surface. The thermal protection system (TPS) is directly linked to determination of the vehicle moldline and the shaping of the trajectory. Thermal protection systems must maintain the structural integrity of the vehicle, mitigating heat transfer to the structure while remaining lightweight. Herein lies the interdependency: as the vehicle's speed increases, the TPS requirements increase, and as TPS mass increases, the effect on the propulsion system and all other systems is compounded. The ability to analyze the vehicle forebody and engine inlet is therefore critical to designing the RBCC vehicle. To adequately determine insulation masses for an RBCC vehicle, the hypersonic aerodynamic environment and aeroheating loads must be calculated and the TPS thicknesses computed for the entire vehicle. To accomplish this, an ascent or reentry trajectory is obtained using the computer code Program to Optimize Simulated Trajectories (POST). The trajectory is then used to calculate the convective heat rates at several locations on the vehicle using the Miniature Version of the JA70 Aerodynamic Heating Computer Program (MINIVER). Once the heat rates are defined for each body point on the vehicle, the insulation thicknesses required to keep the vehicle within structural limits are calculated using Systems Improved Numerical Differencing Analyzer (SINDA) models.
If the TPS masses are too heavy for the performance of the vehicle
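    The POST/MINIVER/SINDA chain itself cannot be reproduced here, but the trajectory-to-heat-rate step can be sketched with the Sutton-Graves stagnation-point correlation (Earth constant, SI units, heat rate in W/m^2). The trajectory points and the 1 m nose radius are invented for the example.

```python
import math

# Sutton-Graves stagnation-point convective heating as a stand-in for
# the MINIVER heating step. Trajectory points are notional, not POST output.

def stagnation_heat_rate(rho, V, R_n):
    """Convective stagnation-point heating, W/m^2
    (rho in kg/m^3, velocity V in m/s, nose radius R_n in m)."""
    return 1.7415e-4 * math.sqrt(rho / R_n) * V ** 3

# (time s, freestream density kg/m^3, velocity m/s) of a notional entry
traj = [(0.0, 1.0e-5, 7500.0), (60.0, 1.0e-4, 7200.0), (120.0, 5.0e-4, 6500.0)]
q = [stagnation_heat_rate(rho, V, R_n=1.0) for _, rho, V in traj]

# Integrated heat load (J/m^2) by the trapezoidal rule, the kind of
# quantity a TPS sizing model would then work from
heat_load = sum(0.5 * (q0 + q1) * (t1 - t0)
                for (t0, _, _), (t1, _, _), q0, q1
                in zip(traj, traj[1:], q, q[1:]))
```

In the full pipeline the heat rate would be evaluated at many body points per trajectory point, and the integrated load drives the SINDA insulation-thickness calculation.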

  18. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the results of a scale economy analysis for integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new-generation design solutions and building on the large experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect-cycle reactor with some distinctive and characteristic features that greatly simplify the reactor and also contribute to a high level of safety: an integrated primary cooling system, self-pressurization, primary cooling by natural circulation, and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors performed with the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive with the cheapest Argentinean electricity option at powers larger than 200 MWe. Due to reactor pressure vessel construction limits, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation; with forced circulation, 300 MWe can be achieved. (author)

  19. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  20. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  1. INS integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  2. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  3. Analysis and Modeling of Integrated Magnetics for LLC resonant Converters

    DEFF Research Database (Denmark)

    Li, Mingxiao; Ouyang, Ziwei; Zhao, Bin

    2017-01-01

    Shunt-inserted transformers are widely used to obtain high leakage inductance. This paper investigates this method in depth to make it applicable to integrating the resonant inductor of LLC resonant converters. The analysis and model of magnetizing inductance and leakage inductance for shunt... transformers can provide a significant difference. The way to obtain the desirable magnetizing and leakage inductance values for LLC resonant converters is simplified by the creation of air gaps together with a magnetic shunt. The calculation and relation are validated by finite element analysis (FEA) simulations...

  4. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  5. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied

  6. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using mill...

  7. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if it is suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  8. Multi-criteria decision analysis integrated with GIS for radio ...

    African Journals Online (AJOL)

    Multi-criteria decision analysis integrated with GIS for radio astronomical observatory site selection in peninsular of Malaysia. R Umar, Z.Z. Abidin, Z.A. Ibrahim, M.K.A. Kamarudin, S.N. Hazmin, A Endut, H Juahir ...

  9. Integrated analysis for genotypic adaptation in rice | Das | African ...

    African Journals Online (AJOL)

    Integrated analysis for genotypic adaptation in rice. S Das, RC Misra, MC Pattnaik, SK Sinha. Abstract. Development of varieties with high yield potential coupled with wide adaptability is an important plant breeding objective. The presence of genotype by environment (GxE) interaction plays a crucial role in determining the ...

  10. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2000-01-01

    of the phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to integration of design and control issues. The model analysis is highlighted through examples involving...... processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved....

  11. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    of Saccharomyces cerevisiae whole genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis......DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in depth analysis of the vast amount of transcription data requires integration with data...... from several heterogeneous data Sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis...

  12. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named, CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of the end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storages in the Grid, which is implemented and commissioned over the world’s largest computing Grid infrastructure, Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  13. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    Full Text Available As a new formulation in structural analysis, the Integrated Force Method (IFM) has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A stochastic sensitivity analysis formulation of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite elements and stochastic design sensitivity are almost identical.
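    The perturbation method mentioned in this record can be illustrated on a system far simpler than IFM: two parallel springs sharing a load P, where the member force F1 = k1/(k1+k2) * P depends on uncertain stiffnesses. The first-order variance estimate is then checked against direct Monte Carlo, as the paper does. All numbers are illustrative assumptions.

```python
import math
import random

# Two parallel springs; member force F1 = k1/(k1+k2) * P.
# Stiffness means and standard deviations are invented for the example.
P, k1, k2 = 1000.0, 2.0e5, 3.0e5
s1, s2 = 1.0e4, 1.5e4

# First-order perturbation: Var(F1) ~ (dF1/dk1 * s1)^2 + (dF1/dk2 * s2)^2
dF_dk1 = P * k2 / (k1 + k2) ** 2
dF_dk2 = -P * k1 / (k1 + k2) ** 2
sigma_pert = math.hypot(dF_dk1 * s1, dF_dk2 * s2)

# Direct Monte Carlo check of the same standard deviation
random.seed(0)
def force_sample():
    a, b = random.gauss(k1, s1), random.gauss(k2, s2)
    return P * a / (a + b)
vals = [force_sample() for _ in range(200_000)]
mean = sum(vals) / len(vals)
sigma_mc = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
```

For small coefficients of variation the two estimates agree closely, which is the sense in which the perturbation formulation is "substantiated with direct Monte Carlo simulations."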

  14. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistical results from simulation of the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logical structure of CPN models, which makes it reliable and a possible basis for structured analysis of discrete event systems. A radar model example illustrates the application of this approach, and the results are trustworthy.

  15. Integrative Analysis of Metabolic Models – from Structure to Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de [Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Gatersleben (Germany); Schreiber, Falk [Monash University, Melbourne, VIC (Australia); Martin-Luther-University Halle-Wittenberg, Halle (Germany)

    2015-01-26

    The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The Systems Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied to the integrative analysis of the crop plant potato.
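    A minimal example of the structural side of such an analysis: the steady-state flux modes of a metabolic model are the null space of its stoichiometric matrix S (solutions of S v = 0). The toy 3-metabolite, 4-reaction linear pathway below is invented for illustration and has nothing to do with the potato model.

```python
import numpy as np

# Stoichiometric matrix of a linear chain A -> B -> C with uptake v1
# and export v4 (rows = metabolites, columns = reactions).
#              v1    v2    v3    v4
S = np.array([[1.0, -1.0,  0.0,  0.0],   # A: made by v1, used by v2
              [0.0,  1.0, -1.0,  0.0],   # B
              [0.0,  0.0,  1.0, -1.0]])  # C

# Null-space basis via SVD: right singular vectors beyond the rank
_, s, Vt = np.linalg.svd(S)
rank = int(np.sum(s > 1e-10))
modes = Vt[rank:]   # each row is a steady-state flux distribution
```

For a linear chain the single mode carries equal flux through every step; richer networks yield several modes, which is where pathway-level structural analysis starts.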

  16. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  17. Containment integrity analysis with SAMPSON/DCRA module

    International Nuclear Information System (INIS)

    Hosoda, Seigo; Shirakawa, Noriyuki; Naitoh, Masanori

    2006-01-01

    The integrity of PWR containment under a severe accident is analyzed using the debris-concrete reaction analysis (DCRA) module. If the core fuel melts through the pressure vessel and the debris accumulates in the reactor cavity at the lower part of the containment, its temperature continues to rise due to decay heat and the debris ablates the concrete floor. Our analyses showed that, provided cooling water is injected into the containment cavity and the amount of debris is limited to 30% of the core fuel, the debris can be cooled and frozen, so that the integrity of the containment holds. (author)

  18. Plant-wide integrated equipment monitoring and analysis system

    International Nuclear Information System (INIS)

    Morimoto, C.N.; Hunter, T.A.; Chiang, S.C.

    2004-01-01

    A nuclear power plant equipment monitoring system monitors plant equipment and reports deteriorating equipment conditions. The more advanced equipment monitoring systems can also provide information for understanding the symptoms and diagnosing the root cause of a problem. Maximizing the equipment availability and minimizing or eliminating consequential damages are the ultimate goals of equipment monitoring systems. GE Integrated Equipment Monitoring System (GEIEMS) is designed as an integrated intelligent monitoring and analysis system for plant-wide application for BWR plants. This approach reduces system maintenance efforts and equipment monitoring costs and provides information for integrated planning. This paper describes GEIEMS and how the current system is being upgraded to meet General Electric's vision for plant-wide decision support. (author)

  19. STINGRAY: system for integrated genomic resources and analysis.

    Science.gov (United States)

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

    The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system can be faster with Sanger data, since large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  20. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    Science.gov (United States)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
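
    As a rough illustration of the network formulation such conjugate solvers share (this is not GFSSP or MSC.SINDA code; every value and name below is an invented example), a lumped-node transient conduction model with a convective link to a fluid node can be marched explicitly:

```python
def march(t_nodes, t_fluid, G, C, dt, steps):
    """Explicit update of a thermal network: C * dT_i/dt = sum_j G*(T_j - T_i).
    Node 0 is convectively linked to a constant-temperature fluid node."""
    T = list(t_nodes)
    for _ in range(steps):
        new = T[:]
        for i in range(len(T)):
            flux = 0.0
            if i > 0:                      # conduction to the previous node
                flux += G * (T[i - 1] - T[i])
            if i < len(T) - 1:             # conduction to the next node
                flux += G * (T[i + 1] - T[i])
            if i == 0:                     # convective link to the fluid
                flux += G * (t_fluid - T[i])
            new[i] = T[i] + dt * flux / C  # explicit (forward Euler) step
        T = new
    return T

# Chilldown of a warm 5-node bar by a cold fluid (illustrative units only)
temps = march([300.0] * 5, 80.0, G=1.0, C=1.0, dt=0.1, steps=2000)
```

    With the step size kept below the explicit stability limit, the cascade relaxes toward the fluid temperature, loosely mimicking the chilldown transient mentioned in the abstract.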

  1. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. This report summarizes the conclusions, analytical processes, and analytical results of the analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, in support of WIPP Technical Assessment Team (TAT) activities to determine, to the extent feasible, the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and the release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. It integrates and summarizes the results contained in three separate reports, and draws conclusions based on those results: Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003; Pacific Northwest National Laboratory, December 2014. Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory, December 2014. Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory, January 2015. This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  2. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with a limited increase in budget. The Columbia report criticized NASA for a lack of insight and technical integration impacting its ability to provide safety. The Aldridge report advocates that NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program, a small team of engineers at MSFC was asked to propose a process for performing a system-level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process that focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all-encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system-level technical integration is discussed in detail, and definitions are proposed to clarify this discussion and the general systems integration dialog.

  3. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  4. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  5. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  6. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  7. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  8. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. Most existing integrative analyses assume the homogeneity model, which postulates that different datasets share the same set of markers, and several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival; this model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach, which has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
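
    As a hedged sketch of the penalty at work (a generic univariate MCP thresholding rule for a standardized design, not the authors' group coordinate descent code), the coordinate update induced by MCP can be written as:

```python
import math

def mcp_threshold(z, lam, gamma=3.0):
    """Univariate MCP solution: argmin_b 0.5*(z - b)**2 + MCP(b; lam, gamma).
    Small signals are zeroed, moderate ones shrunk, large ones left unbiased."""
    if abs(z) <= lam:
        return 0.0                                   # zeroed, like the lasso
    if abs(z) <= gamma * lam:
        # soft-thresholded, then rescaled by 1/(1 - 1/gamma)
        return math.copysign((abs(z) - lam) / (1.0 - 1.0 / gamma), z)
    return z                                         # no shrinkage at all
```

    Unlike the lasso, MCP applies no shrinkage beyond gamma*lam, which is the source of its reduced estimation bias; the sparse group version applies this idea both to individual coefficients and to norms of coefficient groups.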

  9. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of Information and Intelligent Systems Engineering, Hiroshima (Japan)]

    2004-07-01

    This paper focuses on the optimal analysis of world-wide recycling activities associated with managing the logistics and production activities in global manufacturing whose activities stretch across national boundaries. Globally integrated logistics and recycling strategies consist of the home country and two free trading economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly cost, tax rates, local content rules and regulations. Moreover an optimal analysis of globally integrated value-chain was developed by applying simulation optimization technique as a decision-making tool. The simulation model was developed and analyzed by using ProModel packages, and the results help to identify some of the appropriate conditions required to make well-performed logistics and recycling plans in world-wide collaborated manufacturing environment. (orig.)

  10. Integration, warehousing, and analysis strategies of Omics data.

    Science.gov (United States)

    Gedela, Srinubabu

    2011-01-01

    "-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next-generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts in addition to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need a multitude of results at the various levels of cellular organization, obtained by utilizing different experimental designs, data analysis strategies, and methodologies. This creates the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies for merging Omics data via semantic data carriers, discusses controlled vocabularies such as those expressed in eXtensible Markup Language (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.

  11. LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory

    Science.gov (United States)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  12. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics) to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
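
    The simplest of the reviewed approaches, rank-based correlation between a metabolite profile and a taxon abundance profile, can be sketched as follows (a minimal Spearman implementation without tie handling; the data in the test are invented):

```python
def rank(xs):
    """1-based ranks of a sequence (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rk, i in enumerate(order):
        r[i] = rk + 1.0
    return r

def spearman(x, y):
    """Spearman rho as the Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Rank correlation is favored in microbiome work because abundance data are compositional and far from normally distributed; in practice a library routine with tie handling and multiple-testing correction would be used instead.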

  13. Sensitivity Analysis Based on Markovian Integration by Parts Formula

    Directory of Open Access Journals (Sweden)

    Yongsheng Hang

    2017-10-01

    Sensitivity analysis is widely applied in financial risk management and engineering; it describes the variations brought about by changes in parameters. Since the integration-by-parts technique for Markov chains has been well developed in recent years, in this paper we apply it to the computation of sensitivity and show closed-form expressions for two commonly used continuous-time Markovian models. By comparison, we conclude that our approach outperforms the existing technique for computing sensitivity on Markovian models.
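
    For intuition about what such a sensitivity is (this is not the paper's integration-by-parts formula; it is the simplest possible example, a two-state continuous-time Markov chain with invented rates), a closed-form derivative can be checked against a finite difference:

```python
def stationary_pi1(lam, mu):
    """P(state 1) for a two-state CTMC with rate 0->1 = lam and 1->0 = mu."""
    return lam / (lam + mu)

def sensitivity_fd(lam, mu, h=1e-6):
    """Central finite-difference sensitivity of pi1 with respect to lam."""
    return (stationary_pi1(lam + h, mu) - stationary_pi1(lam - h, mu)) / (2 * h)

def sensitivity_exact(lam, mu):
    """Closed-form d(pi1)/d(lam) = mu / (lam + mu)**2."""
    return mu / (lam + mu) ** 2
```

    The appeal of closed-form (or integration-by-parts) expressions is that they avoid the bias and variance trade-offs of finite-difference estimates, which is the comparison the paper draws.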

  14. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    OpenAIRE

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  15. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
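
    The heat-cascade (problem table) step of pinch analysis mentioned in the abstract can be sketched as follows; this is a generic textbook procedure, not the authors' model, and the stream data in the test are invented (a real SMR/WGS flowsheet has many more streams):

```python
def problem_table(streams, dt_min=10.0):
    """Minimum hot/cold utility via the problem-table heat cascade.
    streams: list of (kind, T_supply, T_target, CP) with kind 'hot' or 'cold',
    CP the heat-capacity flowrate. Temperatures are shifted by dt_min/2."""
    shift = dt_min / 2.0
    temps, shifted = set(), []
    for kind, ts, tt, cp in streams:
        s = -shift if kind == 'hot' else shift   # hot down, cold up
        a, b = ts + s, tt + s
        shifted.append((kind, a, b, cp))
        temps.update((a, b))
    bounds = sorted(temps, reverse=True)
    cascade, q = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for kind, a, b, cp in shifted:
            top, bot = max(a, b), min(a, b)
            if bot <= lo and top >= hi:          # stream spans this interval
                net += cp * (hi - lo) * (1 if kind == 'hot' else -1)
        q += net
        cascade.append(q)
    hot_utility = max(0.0, -min(cascade))        # lift cascade to be feasible
    cold_utility = cascade[-1] + hot_utility
    return hot_utility, cold_utility
```

    The most negative point of the uncorrected cascade locates the pinch; lifting the cascade by that amount gives the minimum hot utility, and the remainder at the bottom is the minimum cold utility.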

  16. Building-integrated renewable energy policy analysis in China

    Institute of Scientific and Technical Information of China (English)

    姚春妮; 郝斌

    2009-01-01

    With the rapid development of renewable energy worldwide, and with the aim of adjusting its energy structure, the Ministry of Construction of China plans to promote the large-scale application of renewable energy in buildings. To support sound policy-making, this work first applies a cost-benefit analysis to three technologies: building-integrated solar hot water (BISHW) systems, building-integrated photovoltaic (BIPV) technology, and groundwater heat pumps (GWHP). By selecting a representative city from each climate region, the analysis yields different results, and correspondingly different policy suggestions, for the different climate regions of China. On this basis, the Ministry of Construction (MOC) and the Ministry of Finance of China (MOF) jointly launched the Building-integrated Renewable Energy Demonstration Projects (BIREDP) in 2006. In the demonstration projects, renewable energy replaces traditional energy to supply domestic hot water, electricity, air-conditioning, and heating. Carrying out the demonstration projects has expanded the related renewable energy market, and more and more companies and local governments are taking the opportunity to promote the large-scale application of renewable energy in buildings.

  17. Lectures on functional analysis and the Lebesgue integral

    CERN Document Server

    Komornik, Vilmos

    2016-01-01

    This textbook, based on three series of lectures held by the author at the University of Strasbourg, presents functional analysis in a non-traditional way by generalizing elementary theorems of plane geometry to spaces of arbitrary dimension. This approach leads naturally to the basic notions and theorems. Most results are illustrated by the small ℓp spaces. The Lebesgue integral, meanwhile, is treated via the direct approach of Frigyes Riesz, whose constructive definition of measurable functions leads to optimal, clear-cut versions of the classical theorems of Fubini-Tonelli and Radon-Nikodým. Lectures on Functional Analysis and the Lebesgue Integral presents the most important topics for students, with short, elegant proofs. The exposition style follows the Hungarian mathematical tradition of Paul Erdős and others. The order of the first two parts, functional analysis and the Lebesgue integral, may be reversed. In the third and final part they are combined to study various spaces of continuous and integ...

  18. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Directory of Open Access Journals (Sweden)

    Valeria Toffoli

    2013-12-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsules. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  19. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Science.gov (United States)

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  20. Structural integrity analysis of an INPP building under external loading

    International Nuclear Information System (INIS)

    Dundulis, G.; Karalevicius, R.; Uspuras, E.; Kulak, R.F.; Marchertas, A.

    2005-01-01

    After the terrorist attacks in New York and Washington, D.C. using civil airplanes, the evaluation of civil airplane crashes into civil and NPP structures has become very important. The interception of many terrorist communications reveals that the use of commandeered commercial aircraft is still a major part of their plans for destruction. An aircraft crash or other flying object in the territory of the Ignalina Nuclear Power Plant (INPP) represents a concern to the plant. Aircraft traveling at high velocity have a destructive potential. An aircraft crash may damage the roof and walls of buildings, pipelines, electric motors, cases of power supplies, power cables of electricity transmission, and other elements and systems which are important for safety. Therefore, the evaluation of the structural response to an aircraft crash is important and was selected for analysis. The structural integrity analysis of the effects of an aircraft crash on an NPP building structure is the subject of this paper. The finite element method was used for the structural analysis of a typical Ignalina NPP building. The structural integrity analysis was performed for a portion of the ALS using the dynamic loading of an aircraft crash impact model. The computer code NEPTUNE was used for this analysis. The local effects caused by the impact of the aircraft's engine on the building wall were evaluated independently by using an empirical formula. (authors)

  1. Penalized differential pathway analysis of integrative oncogenomics studies.

    Science.gov (United States)

    van Wieringen, Wessel N; van de Wiel, Mark A

    2014-04-01

    Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving cancer patients. The two molecular levels involved are linked through the central dogma of molecular biology. DNA copy number aberrations abound in the cancer cell. Here we investigate how these aberrations affect gene expression levels within a pathway using observational integrative genomics data of cancer patients. In particular, we aim to identify differential edges between regulatory networks of two groups involving these molecular levels. Motivated by the rate equations, the regulatory mechanism between DNA copy number aberrations and gene expression levels within a pathway is modeled by a simultaneous-equations model, for the one- and two-group case. The latter facilitates the identification of differential interactions between the two groups. Model parameters are estimated by penalized least squares using the lasso (L1) penalty to obtain a sparse pathway topology. Simulations show that the inclusion of DNA copy number data benefits the discovery of gene-gene interactions. In addition, the simulations reveal that cis-effects tend to be over-estimated in a univariate (single gene) analysis. In the application to real data from integrative oncogenomic studies we show that inclusion of prior information on the regulatory network architecture benefits the reproducibility of all edges. Furthermore, analyses of the TP53 and TGFb signaling pathways between ER+ and ER- samples from an integrative genomics breast cancer study identify reproducible differential regulatory patterns that corroborate existing literature.
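
    The lasso estimation step can be illustrated generically (plain coordinate descent for standardized predictors; this is not the authors' simultaneous-equations code, and the toy data in the test are invented):

```python
def soft(z, t):
    """Soft-thresholding operator, the lasso coordinate update."""
    return (z - t) if z > t else (z + t) if z < -t else 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent lasso; assumes columns scaled so (1/n) x_j.x_j = 1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residuals, leaving coordinate j out of the fit
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n)) / n
            beta[j] = soft(zj, lam)       # shrink-and-select update
    return beta
```

    The soft-thresholding step is what drives small edge weights exactly to zero, yielding the sparse pathway topology the abstract describes.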

  2. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  3. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
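    A minimal sketch of the multi-layer interdependency idea (the entities and edges below are invented for illustration; the paper's actual common operational picture is far richer): a breadth-first traversal over a dependency graph yields the set of entities potentially impacted by one compromised node.

```python
from collections import deque

# hypothetical multi-layer dependency graph: an edge points from an entity
# to the entities that depend on it
deps = {
    "router1": ["web-srv", "db-srv"],       # network layer -> hosts
    "web-srv": ["crm-app"],                 # host layer -> applications
    "db-srv": ["crm-app", "billing-app"],
    "crm-app": [],
    "billing-app": [],
}

def impacted(graph, compromised):
    """All entities reachable from the compromised one (potential impact set)."""
    seen, queue = {compromised}, deque([compromised])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

hit = impacted(deps, "router1")   # a network-layer compromise reaches both apps
```

    Replacing the deterministic edges with conditional probabilities turns this into the Bayesian network analysis the paper proposes for handling uncertainty.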

  4. Measure and integral an introduction to real analysis

    CERN Document Server

    Wheeden, Richard L

    2015-01-01

    Now considered a classic text on the topic, Measure and Integral: An Introduction to Real Analysis provides an introduction to real analysis by first developing the theory of measure and integration in the simple setting of Euclidean space, and then presenting a more general treatment based on abstract notions characterized by axioms and with less geometric content. Published nearly forty years after the first edition, this long-awaited Second Edition also: studies the Fourier transform of functions in the spaces L1, L2, and Lp, 1 < p < 2; shows the Hilbert transform to be a bounded operator on L2, as an application of the L2 theory of the Fourier transform in the one-dimensional case; covers fractional integration and some topics related to mean oscillation properties of functions, such as the classes of Hölder continuous functions and the space of functions of bounded mean oscillation; and derives a subrepresentation formula, which in higher dimensions plays a role roughly similar to the one played by the fundamental theor...
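    As a numerical illustration of one result the book proves — the Hilbert transform is bounded on L2 (in fact an isometry on mean-zero functions) — here is a discrete periodic analogue built from the Fourier multiplier −i·sgn(k). This is a sketch of the standard construction, not material from the book:

```python
import numpy as np

def hilbert_transform(f):
    """Discrete periodic Hilbert transform: multiply by -i*sgn(k) in frequency."""
    n = len(f)
    k = np.fft.fftfreq(n, d=1.0 / n)     # integer frequencies, positive then negative
    return np.real(np.fft.ifft(-1j * np.sign(k) * np.fft.fft(f)))

rng = np.random.default_rng(1)
f = rng.normal(size=255)                 # odd length avoids the unpaired Nyquist bin
f -= f.mean()                            # project out the constant (k = 0) mode
Hf = hilbert_transform(f)
```

    Because the multiplier has modulus 1 away from k = 0, the discrete L2 norm of a mean-zero signal is preserved exactly, and applying the transform twice returns −f, mirroring the continuous theory.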

  5. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
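    For readers unfamiliar with the outer penalty used in the record, here is a small sketch of the MCP and its one-dimensional thresholding rule, written from the general penalization literature rather than taken from the article itself:

```python
import numpy as np

def mcp(t, lam, gamma=3.0):
    """Minimax concave penalty for a scalar coefficient t."""
    t = np.abs(t)
    quad = lam * t - t ** 2 / (2 * gamma)   # concave part for |t| <= gamma*lam
    flat = gamma * lam ** 2 / 2             # constant beyond gamma*lam
    return np.where(t <= gamma * lam, quad, flat)

def mcp_threshold(z, lam, gamma=3.0):
    """Minimizer of 0.5*(z - b)^2 + MCP(b) in one dimension."""
    if abs(z) <= lam:
        return 0.0                           # small signals are zeroed out
    if abs(z) <= gamma * lam:
        return np.sign(z) * (abs(z) - lam) / (1 - 1 / gamma)
    return z                                 # large signals left unshrunk (unbiased)
```

    Unlike the lasso, the MCP stops shrinking once a coefficient is large, which is why it is attractive for marker selection where bias on strong signals is undesirable.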

  6. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed, based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology used to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
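    At its core, a dimensional variation analysis is a statistical stack-up of many toleranced contributors. A deliberately simplified Monte Carlo sketch follows; the contributor values and the 3-sigma normal convention are assumptions for illustration, not ITER data or the 3DCS method:

```python
import random

def stack_up(tolerances, n_trials=100_000, seed=42):
    """Monte Carlo tolerance stack-up: each contributor varies normally with
    sigma = tol/3 (a common 3-sigma convention); returns the 3-sigma band of
    the resulting assembly gap."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(rng.gauss(0.0, tol / 3.0) for tol in tolerances))
    mean = sum(totals) / n_trials
    var = sum((t - mean) ** 2 for t in totals) / n_trials
    return 3.0 * var ** 0.5

# hypothetical chain of four contributors (mm): manufacture, weld, assembly, survey
band = stack_up([2.0, 1.5, 1.0, 0.5])
```

    For independent normal contributors the result approaches the root-sum-square of the individual tolerances (about 2.74 mm here), well below the 5.0 mm worst-case sum — the statistical margin such models are built to quantify.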

  7. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed, based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology used to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  8. Technology integrated teaching in Malaysian schools: GIS, a SWOT analysis

    Directory of Open Access Journals (Sweden)

    Habibah Lateh, vasugiammai muniandy

    2011-08-01

    … articles and proceeding papers. Research has been continuously done on integrating GIS into the Geography syllabus. This article describes and discusses the barriers to and opportunities for implementing GIS in schools, with a particular focus on how GIS could enhance the teaching and learning of geography. The purpose of the study is to determine the effectiveness of GIS in enhancing students' interest towards the subject. Barriers that might limit the implementation of GIS in schools are also briefly discussed in this article, as are the capabilities of GIS in schools and teaching with GIS. A SWOT analysis is used to identify the strengths, weaknesses, opportunities and threats of integrating GIS in Malaysian schools. A content analysis was performed using articles from local and international publications regarding technology integration and GIS; conference proceedings were also analyzed. This content analysis included 35 articles selected from ICT and GIS publications in Malaysia and abroad, and was done in order to identify the barriers to introducing GIS in schools in Malaysia. The future of GIS in Malaysian schools is addressed in the conclusion.

  9. Analysis on working pressure selection of ACME integral test facility

    International Nuclear Information System (INIS)

    Chen Lian; Chang Huajian; Li Yuquan; Ye Zishen; Qin Benke

    2011-01-01

    An integral effects test facility, the advanced core cooling mechanism experiment (ACME) facility, was designed to verify the performance of the passive safety system and to validate the safety analysis codes of a pressurized water reactor power plant. Three test facilities for the AP1000 design were introduced and reviewed, and the problems resulting from the different working pressures of those facilities were analyzed. A detailed description is then presented of the working-pressure selection for the ACME facility and its characteristics, and the approach to establishing the desired initial test conditions is discussed. The selected working pressure of 9.3 MPa, which covers the operating range of almost all of the important passive safety systems, enables the ACME to simulate LOCAs with the same pressure and property similitude as the prototype. It is expected that the ACME design will be an advanced core cooling integral test facility design. (authors)

  10. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on the application of the symplectic integrator to numerical fluid analysis. For this purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example of HPD applications, namely the behavior of incompressible inviscid fluid, is solved. In order to improve the spatial accuracy of HPD, it is combined with CIVA, a highly accurate interpolation method, but the combined method suffers from the problem that the invariants of the system are not conserved in a long-time computation. To solve this problem, symplectic time integrators are introduced and their effectiveness is confirmed by numerical analyses. (author)
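    The property the paper exploits — long-time conservation of invariants under a symplectic time integrator — can be seen on the simplest Hamiltonian system. This is a sketch on the harmonic oscillator, not the HPD/CIVA method itself:

```python
def simulate(steps=10_000, dt=0.01, symplectic=True):
    """Harmonic oscillator H = (p^2 + q^2)/2, integrated with time step dt."""
    q, p = 1.0, 0.0
    for _ in range(steps):
        if symplectic:
            p -= dt * q            # semi-implicit Euler: kick with the current q,
            q += dt * p            # then drift with the *updated* p
        else:
            q_new = q + dt * p     # explicit Euler: both updates use old values
            p -= dt * q
            q = q_new
    return 0.5 * (p * p + q * q)   # energy after the run (initially 0.5)

e_symp = simulate(symplectic=True)     # stays near the initial energy
e_euler = simulate(symplectic=False)   # grows by a factor (1 + dt^2) per step
```

    The non-symplectic scheme inflates the energy by roughly e ≈ 2.7 over these 10,000 steps, while the symplectic one keeps it bounded near 0.5 — the same qualitative gap the paper reports for its fluid invariants.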

  11. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types, required data manipulation capabilities, and operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility they provide for different data types and their extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages.

  12. Vertically integrated analysis of human DNA. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, M.

    1997-10-01

    This project has been oriented toward improving the vertical integration of the sequential steps associated with the large-scale analysis of human DNA. The central focus has been on an approach to the preparation of "sequence-ready" maps, referred to as multiple-complete-digest (MCD) mapping, primarily directed at cosmid clones. MCD mapping relies on simple experimental steps, supported by advanced image-analysis and map-assembly software, to produce extremely accurate restriction-site and clone-overlap maps. We believe that MCD mapping is one of the few high-resolution mapping systems that has the potential for high-level automation. Successful automation of this process would be a landmark event in genome analysis. Once automated, the approach could be extended to other higher organisms, paving the way for cost-effective sequencing of these genomes. Critically, MCD mapping has the potential to provide built-in quality control for sequencing accuracy and to make possible a highly integrated end product even if there are large numbers of discontinuities in the actual sequence.

  13. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for studying the dynamic operational characteristics. This allows the integration of an existing drive system into new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  14. Solid waste integrated cost analysis model: 1991 project year report

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The purpose of the City of Houston's 1991 Solid Waste Integrated Cost Analysis Model (SWICAM) project was to continue the development of a computerized cost analysis model. This model is to provide solid waste managers with a tool to evaluate the dollar cost of real or hypothetical solid waste management choices. Those choices have become complicated by the implementation of Subtitle D of the Resource Conservation and Recovery Act (RCRA) and the EPA's integrated approach to managing municipal solid waste; that is, minimize generation, maximize recycling, reduce volume (incinerate), and then bury (landfill) only the remainder. Implementation of an integrated solid waste management system involving all or some of the options of recycling, waste-to-energy, composting, and landfilling is extremely complicated. Factors such as hauling distances, markets and prices for recyclables, and the costs and benefits of transfer stations and material recovery facilities must all be considered. A jurisdiction must determine the cost impacts of implementing a number of possibilities for managing, handling, processing, and disposing of waste. SWICAM employs a single Lotus 1-2-3 spreadsheet to enable a jurisdiction to predict or assess the costs of its waste management system. It allows users to select their own process flow for waste material and to manipulate the model to include as few or as many options as they choose. The model then calculates the estimated cost of the choices selected. The user can change the model to include or exclude waste stream components until the mix of choices suits them. Graphs can be produced as a visual communication aid in presenting the results of the cost analysis. SWICAM also allows future cost projections to be made.
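    A spreadsheet model of this kind reduces to a mass-balance-times-unit-cost calculation over a chosen process flow. A toy sketch in Python follows; all unit costs, revenues, and flow fractions are invented for illustration (SWICAM itself was a Lotus spreadsheet):

```python
# hypothetical per-ton costs ($/ton) and revenues for each management option
costs = {"recycling": 35.0, "composting": 30.0, "incineration": 60.0, "landfill": 25.0}
revenue = {"recycling": 20.0}   # assumed resale value of recyclables

def system_cost(tons, flow):
    """Net annual cost of a chosen process flow; flow fractions must sum to 1."""
    assert abs(sum(flow.values()) - 1.0) < 1e-9
    total = 0.0
    for option, frac in flow.items():
        total += tons * frac * (costs[option] - revenue.get(option, 0.0))
    return total

# one candidate process flow for a 100,000 ton/year waste stream
plan = {"recycling": 0.25, "composting": 0.15, "incineration": 0.20, "landfill": 0.40}
annual = system_cost(100_000, plan)
```

    Re-running `system_cost` with alternative flow dictionaries mirrors the model's intended use: comparing the dollar cost of different mixes of recycling, composting, incineration, and landfilling.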

  15. An Integrated Analysis of Changes in Water Stress in Europe

    DEFF Research Database (Denmark)

    Henrichs, T.; Lehner, B.; Alcamo, J.

    2002-01-01

    Future changes in water availability with climate change and changes in water use due to socio-economic development are set to occur in parallel. In an integrated analysis we bring together these aspects of global change in a consistent manner and analyse the water stress situation in Europe. We find that today high water stress exists in one-fifth of the European river basin area. Under a scenario projection, increases in water use throughout Eastern Europe are accompanied by decreases in water availability in most of Southern Europe; combining these trends leads to a marked increase in water stress...

  16. Case for integral core-disruptive accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.; Bell, C.R.

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included

  17. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar... The emphasis of this paper is on the signal-to-noise ratio of the detection and its relation to the sensitivity. Two absorbance cells with optical path lengths of 100 μm and 1000 μm were characterized and compared in terms of sensitivity, limit of detection and effective path length for measurements...

  18. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    Science.gov (United States)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  19. Integrated omics analysis of specialized metabolism in medicinal plants.

    Science.gov (United States)

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

    Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important to identify or develop reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  20. Integrative analysis of the mitochondrial proteome in yeast.

    Directory of Open Access Journals (Sweden)

    Holger Prokisch

    2004-06-01

    Full Text Available In this study yeast mitochondria were used as a model system to apply, evaluate, and integrate different genomic approaches to define the proteins of an organelle. Liquid chromatography mass spectrometry applied to purified mitochondria identified 546 proteins. By expression analysis and comparison to other proteome studies, we demonstrate that the proteomic approach identifies primarily highly abundant proteins. By expanding our evaluation to other types of genomic approaches, including systematic deletion phenotype screening, expression profiling, subcellular localization studies, protein interaction analyses, and computational predictions, we show that an integration of approaches moves beyond the limitations of any single approach. We report the success of each approach by benchmarking it against a reference set of known mitochondrial proteins, and predict approximately 700 proteins associated with the mitochondrial organelle from the integration of 22 datasets. We show that a combination of complementary approaches like deletion phenotype screening and mass spectrometry can identify over 75% of the known mitochondrial proteome. These findings have implications for choosing optimal genome-wide approaches for the study of other cellular systems, including organelles and pathways in various species. Furthermore, our systematic identification of genes involved in mitochondrial function and biogenesis in yeast expands the candidate genes available for mapping Mendelian and complex mitochondrial disorders in humans.
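    Benchmarking predicted proteins against a reference set, as this study does for each approach and for the integrated prediction, is a precision/recall calculation over sets. A toy sketch with invented protein names (not data from the study):

```python
def precision_recall(predicted, reference):
    """Benchmark a predicted protein set against a curated reference set."""
    tp = len(predicted & reference)          # true positives: in both sets
    return tp / len(predicted), tp / len(reference)

# toy stand-ins for predicted vs. curated mitochondrial proteins
predicted = {"ATP1", "COX4", "TOM20", "ACT1"}    # ACT1 plays a false positive here
reference = {"ATP1", "COX4", "TOM20", "TIM23"}   # TIM23 plays a missed protein
prec, rec = precision_recall(predicted, reference)
```

    Integrating several datasets amounts to combining such predicted sets (by union, intersection, or weighted voting) and re-scoring against the same reference, which is how the study quantifies the gain from integration.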

  1. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed for meeting these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultra-violet, Fourier transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization (FID) configuration to provide validated screening analysis for total extractable hydrocarbons within ca. 5--10 min, as well as a full qualitative/quantitative analysis in 25--30 min. Data analysis using optional expert system and neural networks software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments

  2. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212
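    In the absence of censoring, the AFT model adopted in the record reduces to linear regression on log survival time. A minimal simulated sketch (coefficients and dimensions invented; the article's compound-penalized, multi-subtype estimator is considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 3
X = rng.normal(size=(n, p))                  # stand-in SNP covariates
beta = np.array([0.8, 0.0, -0.5])            # only covariates 1 and 3 matter
log_t = X @ beta + 0.1 * rng.normal(size=n)  # AFT: log survival time is linear in X
t = np.exp(log_t)                            # observed (uncensored) survival times

# with no censoring, the AFT model is least squares on log(T)
beta_hat = np.linalg.lstsq(X, np.log(t), rcond=None)[0]
```

    Censoring is what forces the weighted or rank-based estimators used in practice; penalization then enters on top of this regression to select SNPs, as in the article's heterogeneity model.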

  3. Integrated Data Analysis (IDCA) Program - PETN Class 4 Standard

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-08-01

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of PETN Class 4. The PETN was found to have: 1) an impact sensitivity (DH50) range of 6 to 12 cm, 2) a BAM friction sensitivity (F50) range of 7 to 11 kg and a TIL (0/10) of 3.7 to 7.2 kg, 3) an ABL friction sensitivity threshold of 5 psig or less at 8 fps, 4) an ABL ESD sensitivity threshold of 0.031 to 0.326 J/g, and 5) thermal behavior showing an endothermic feature with Tmin ≈ 141 °C and an exothermic feature with Tmax ≈ 205 °C.

  4. Integrated Data Collection Analysis (IDCA) Program — Ammonium Nitrate

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Redstone Arsenal, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-05-17

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of ammonium nitrate (AN). AN was tested, in most cases, both as received from the manufacturer and dried/sieved. The participants found the AN to be: 1) insensitive in Type 12A impact testing (although with a wide range of values), 2) completely insensitive in BAM friction testing, 3) less sensitive than the RDX standard in ABL friction testing, 4) less sensitive than RDX in ABL ESD testing, and 5) less sensitive than RDX and PETN in DSC thermal analyses.

  5. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former, most studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. To do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform of Spanish indirect taxation, with a large increase in energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approach. (author)

  6. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former, most studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. To do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform of Spanish indirect taxation, with a large increase in energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approach. (author)

  7. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

Decisions to invest in oil- and gas-field acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters, ultimately identifying the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and timing schedules within the framework of various deal structures.
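The spreadsheet-plus-Monte-Carlo workflow the abstract describes can be sketched as follows. Every distribution, range, and cost figure below is illustrative only, not from the paper, and the model collapses the full-field simulation into a single annual cash-flow draw:

```python
import random

def simulate_npv(n_trials=10_000, discount_rate=0.10, years=10, seed=42):
    """Monte Carlo sketch of a deal valuation: sample uncertain
    production, price, and cost assumptions, discount the resulting
    cash flows, and report percentiles of the NPV distribution."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_trials):
        production = rng.triangular(0.8, 1.5, 1.0)  # MMbbl/yr (hypothetical)
        price = rng.uniform(40.0, 80.0)             # $/bbl (hypothetical)
        opex = rng.uniform(10.0, 20.0)              # $/bbl (hypothetical)
        capex = rng.uniform(150.0, 250.0)           # $MM up-front (hypothetical)
        npv = -capex + sum(
            production * (price - opex) / (1.0 + discount_rate) ** (t + 1)
            for t in range(years))
        npvs.append(npv)
    npvs.sort()
    return {"p10": npvs[int(0.10 * n_trials)],
            "p50": npvs[int(0.50 * n_trials)],
            "p90": npvs[int(0.90 * n_trials)]}
```

Comparing the P10/P50/P90 spread across alternative deal structures is what surfaces the economic volatility the authors refer to.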

  8. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

INTEGRATING PAVEMENT CRACK DETECTION AND ANALYSIS USING AUTONOMOUS UNMANNED AERIAL VEHICLE IMAGERY (AFIT-ENV-MS-15-M-195). ...protection in the United States. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED.

  9. Sensitivity Analysis of the Integrated Medical Model for ISS Programs

    Science.gov (United States)

    Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.

    2016-01-01

Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates sensitivity using the partial correlation of the ranks of the generated input values with each generated output value; the correlation is termed partial because adjustments are made for the linear effects of all the other input values when calculating the correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, rather than the values themselves, both methods accommodate the nonlinear relationships of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. These efforts are an integral
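For the simplest case of one input and one confounder, the PRCC computation described above reduces to a first-order partial correlation on rank-transformed data. The sketch below is a simplified stand-in for the IMM team's procedure (tie handling and the multi-input regression adjustment are omitted):

```python
import math

def ranks(values):
    """Rank transform (1 = smallest); ties are not handled, for brevity."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def pearson(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def prcc(x, y, z):
    """Partial rank correlation of input x with output y, controlling
    for a single other input z (first-order partial correlation applied
    to ranks)."""
    rx, ry, rz = ranks(x), ranks(y), ranks(z)
    r_xy, r_xz, r_yz = pearson(rx, ry), pearson(rx, rz), pearson(ry, rz)
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))
```

Because only ranks enter the calculation, a strongly nonlinear but monotone input-output relationship still yields a PRCC near 1, which is why the method suits models like the IMM.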

  10. Multi - band Persistent Scatterer Interferometry data integration for landslide analysis

    Science.gov (United States)

    Bianchini, Silvia; Mateos, Rosa; Mora, Oscar; García, Inma; Sánchez, Ciscu; Sanabria, Margarita; López, Maite; Mulas, Joaquin; Hernández, Mario; Herrera, Gerardo

    2013-04-01

We present a methodology to perform a geomorphological assessment of ground movements over wide areas by improving Persistent Scatterer Interferometry (PSI) analysis for landslide studies. The procedure relies on the integrated use of multi-band EO data acquired by different satellite sensors in different time intervals to provide a detailed investigation of ground displacements. The methodology, through the cross-comparison and integration of PS data in different microwave bands (ALOS in L-band, ERS1/2 and ENVISAT in C-band, COSMO-SkyMed in X-band), is applied to the Tramontana Range in the northwestern part of Mallorca island (Spain), which has been extensively affected by mass movements over time, especially in recent years. We increase the degree of confidence in the available interferometric data and homogenize all PS targets by implementing and classifying them according to common criteria. PSI results are then combined with geo-thematic data and pre-existing landslide inventories of the study area in order to improve the landslide database, providing additional information on the detected ground displacements. The results of this methodology are used to elaborate landslide activity maps, permitting heterogeneous PS data to be jointly exploited for landslide analysis at regional scale. Moreover, from a geomorphological perspective, the proposed approach exploits the implemented PS data to achieve a reliable spatial analysis of movement rates, whether related to specific landslide phenomena or to other natural processes, in order to produce ground-motion activity maps over a wide area.

  11. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomena. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and the performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis would require the application of suites of separate computer codes, each of which would treat only a narrow subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies, based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite.

  12. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches, was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Feature Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
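Of the three feature selectors named, Nearest Shrunken Centroids lends itself to a compact sketch. The implementation below is simplified and illustrative: the published method's s0 offset and cross-validated choice of the shrinkage threshold are replaced by constants.

```python
import math
from collections import defaultdict

def shrunken_centroid_features(X, labels, delta=1.0):
    """Simplified Nearest-Shrunken-Centroids feature selection: each
    class centroid's standardized deviation from the overall centroid
    is soft-thresholded by `delta`; a feature survives if any class
    retains a nonzero shrunken score."""
    n, p = len(X), len(X[0])
    overall = [sum(row[j] for row in X) / n for j in range(p)]
    groups = defaultdict(list)
    for row, lab in zip(X, labels):
        groups[lab].append(row)
    # pooled within-class standard deviation per feature (+ small fudge)
    s = []
    for j in range(p):
        ss = 0.0
        for rows in groups.values():
            mj = sum(r[j] for r in rows) / len(rows)
            ss += sum((r[j] - mj) ** 2 for r in rows)
        s.append(math.sqrt(ss / (n - len(groups))) + 1e-6)
    selected = set()
    for rows in groups.values():
        nk = len(rows)
        mk = math.sqrt(1.0 / nk - 1.0 / n)
        for j in range(p):
            centroid = sum(r[j] for r in rows) / nk
            d = (centroid - overall[j]) / (mk * s[j])
            d_shrunk = math.copysign(max(abs(d) - delta, 0.0), d)
            if d_shrunk != 0.0:
                selected.add(j)
    return sorted(selected)
```

Features whose shrunken score is zero for every class drop out entirely, which is how NSC whittles thousands of proteins down to a small candidate panel.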

  13. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  14. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  15. The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation

    International Nuclear Information System (INIS)

    COLLOPY, M.T.

    1999-01-01

In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach describing the concept of an integrated safety management program that incorporates hazard and safety analysis to address a multitude of hazards affecting the public, workers, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, the worker, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job level, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, such inconsistencies may create confusion in the DOE process for granting authorization of the work. In accordance with

  16. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

Full Text Available The present paper describes the main features and an application to a real Nuclear Power Plant (NPP) of an Integrated Software Environment (in the following referred to as "platform") developed at University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
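The final step of the methodology, comparing KI at the crack tip with KIc, can be illustrated with a textbook stand-in for the weight-function calculation: the handbook solution for an edge crack in a semi-infinite plate under uniform tension (not the platform's actual weight functions or plant data):

```python
import math

def edge_crack_ki(sigma_mpa, a_m):
    """Handbook solution for a surface (edge) crack in a semi-infinite
    plate under uniform tension: K_I = 1.1215 * sigma * sqrt(pi * a).
    Units: MPa and m in, MPa*sqrt(m) out."""
    return 1.1215 * sigma_mpa * math.sqrt(math.pi * a_m)

def fracture_check(sigma_mpa, a_m, k_ic):
    """Acceptance check mirroring the last step of the PTS chain:
    the computed K_I must stay below the critical value K_Ic."""
    k_i = edge_crack_ki(sigma_mpa, a_m)
    return k_i < k_ic, k_i
```

For example, a 10 mm crack under 200 MPa uniform stress gives KI of roughly 40 MPa*sqrt(m), which would pass against a KIc of 100 MPa*sqrt(m) but fail against an embrittled value of 30.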

  17. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  18. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  19. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to determine its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is its key control component. The working principle of the IV is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with that at room temperature and the pressure fluctuation period is longer than at room temperature. Furthermore, the variation of pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with increasing valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. These results lay the basis for the vibration reduction analysis of the CRHDS.
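As an order-of-magnitude stand-in for the loop-system model (not the paper's actual equations), the classic Joukowsky relation links a rapid change in flow velocity at a valve to the peak pressure transient it induces:

```python
def joukowsky_surge(rho_kg_m3, wave_speed_m_s, dv_m_s):
    """Joukowsky water-hammer bound on the pressure rise (Pa) caused by
    an instantaneous velocity change dv in a liquid column:
    dP = rho * c * dv, where c is the pressure-wave speed in the pipe."""
    return rho_kg_m3 * wave_speed_m_s * dv_m_s
```

For water (1000 kg/m3) with a 1200 m/s wave speed, a 2 m/s velocity change yields a 2.4 MPa surge, which shows why valve opening time and loop geometry, the parameters varied in the study above, dominate the transient.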

  20. Performance Analysis of a Photovoltaic-Thermal Integrated System

    International Nuclear Information System (INIS)

    Radziemska, E.

    2009-01-01

The present commercial photovoltaic solar cells (PV) convert solar energy into electricity with a relatively low efficiency, less than 20%. More than 80% of the absorbed solar energy is dumped to the surroundings again after photovoltaic conversion. Hybrid PV/T systems consist of PV modules coupled with heat extraction devices. The PV/T collectors generate electric power and heat simultaneously. Stabilizing the temperature of photovoltaic modules at a low level is highly desirable to obtain an efficiency increase. A total efficiency of 60-80% can be achieved with the whole PV/T system provided that the T system is operated near ambient temperature. The value of the low-T heat energy is typically much smaller than the value of the PV electricity. The PV/T systems can exist in many designs, but the most common models use water or air as a working fluid. Efficiency is the most valuable parameter for the economic analysis. It has substantial meaning in the case of installations with great nominal power, such as air-cooled Building Integrated Photovoltaic Systems (BIPV). In this paper the performance analysis of a hybrid PV/T system is presented: an energetic analysis as well as an exergetic analysis. Exergy is always destroyed when a process involves a temperature change. This destruction is proportional to the entropy increase of the system together with its surroundings; the destroyed exergy has been called anergy. Exergy analysis identifies the location, the magnitude, and the sources of thermodynamic inefficiencies in a system. This information, which cannot be provided by other means (e.g., an energy analysis), is very useful for the improvement and cost-effectiveness of the system. Calculations were carried out for the tested water-cooled ASE-100-DGL-SM Solarwatt module.
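The distinction the abstract draws between energy and exergy analysis can be made concrete with a small calculation in which collector heat is discounted by a Carnot factor; all numbers in the usage example are illustrative, not measurements from the tested module:

```python
def pvt_efficiencies(electrical_w, thermal_w, t_out_k, t_amb_k, irradiance_w):
    """First-law vs second-law comparison for a PV/T collector (per unit
    area): energy efficiency counts heat at face value, while exergy
    efficiency discounts heat by the Carnot factor 1 - T_amb/T_out,
    since electricity is pure exergy but near-ambient heat is not."""
    energy_eff = (electrical_w + thermal_w) / irradiance_w
    carnot = 1.0 - t_amb_k / t_out_k
    exergy_eff = (electrical_w + thermal_w * carnot) / irradiance_w
    return energy_eff, exergy_eff
```

For example, 120 W/m2 of electricity and 500 W/m2 of heat delivered at 320 K against a 300 K ambient, under 1000 W/m2 irradiance, give an energy efficiency of 0.62 but an exergy efficiency of only about 0.15, reflecting the low thermodynamic value of near-ambient heat noted in the abstract.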

  1. Performance Analysis of a Photovoltaic-Thermal Integrated System

    Directory of Open Access Journals (Sweden)

    Ewa Radziemska

    2009-01-01

Full Text Available The present commercial photovoltaic solar cells (PV) convert solar energy into electricity with a relatively low efficiency, less than 20%. More than 80% of the absorbed solar energy is dumped to the surroundings again after photovoltaic conversion. Hybrid PV/T systems consist of PV modules coupled with heat extraction devices. The PV/T collectors generate electric power and heat simultaneously. Stabilizing the temperature of photovoltaic modules at a low level is highly desirable to obtain an efficiency increase. A total efficiency of 60-80% can be achieved with the whole PV/T system provided that the T system is operated near ambient temperature. The value of the low-T heat energy is typically much smaller than the value of the PV electricity. The PV/T systems can exist in many designs, but the most common models use water or air as a working fluid. Efficiency is the most valuable parameter for the economic analysis. It has substantial meaning in the case of installations with great nominal power, such as air-cooled Building Integrated Photovoltaic Systems (BIPV). In this paper the performance analysis of a hybrid PV/T system is presented: an energetic analysis as well as an exergetic analysis. Exergy is always destroyed when a process involves a temperature change. This destruction is proportional to the entropy increase of the system together with its surroundings; the destroyed exergy has been called anergy. Exergy analysis identifies the location, the magnitude, and the sources of thermodynamic inefficiencies in a system. This information, which cannot be provided by other means (e.g., an energy analysis), is very useful for the improvement and cost-effectiveness of the system. Calculations were carried out for the tested water-cooled ASE-100-DGL-SM Solarwatt module.

  2. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    Science.gov (United States)

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.
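PEA's Positive Samples Only Learning addresses the standard positive-unlabeled (PU) setting. As an illustrative stand-in for its algorithm (not PEA's actual code), the calibration step of the classic Elkan-Noto approach can be sketched as follows: a classifier trained to separate labeled positives from unlabeled data estimates g(x) = p(labeled | x), and dividing by c = E[g(x) | x positive], estimated on held-out positives, recovers p(positive | x):

```python
def elkan_noto_adjust(scores_unlabeled, scores_heldout_positives):
    """PU-learning calibration sketch (Elkan & Noto, 2008): rescale the
    labeled-vs-unlabeled classifier scores g(x) by the constant
    c = mean g(x) over held-out known positives, clamping at 1.0,
    to approximate p(positive | x) for the unlabeled examples."""
    c = sum(scores_heldout_positives) / len(scores_heldout_positives)
    return [min(s / c, 1.0) for s in scores_unlabeled]
```

This is what lets a predictor be trained from known CMR sites alone, without a curated set of non-CMR negatives.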

  3. HTGR-INTEGRATED COAL TO LIQUIDS PRODUCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Anastasia M Gandrik; Rick A Wood

    2010-10-01

    As part of the DOE’s Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to “shift” the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: • 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal

  4. HTGR-Integrated Coal To Liquids Production Analysis

    International Nuclear Information System (INIS)

    Gandrik, Anastasia M.; Wood, Rick A.

    2010-01-01

    As part of the DOE's Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to 'shift' the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: (1) 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal consumption by 66%.

  5. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.
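
    The DNA-plus-RNA rescue idea can be sketched with a toy filter (the function name, thresholds, and read-count inputs below are hypothetical illustrations, not RADIA's actual algorithm):

```python
# Illustrative sketch of DNA+RNA integrated somatic calling.
# Thresholds and the rescue rule are hypothetical, not RADIA's.

def call_somatic(dna_alt_reads, dna_depth, rna_alt_reads, rna_depth,
                 min_af=0.04, min_support=2):
    """Call a variant somatic if DNA support alone suffices, or if weak
    DNA evidence is corroborated ('rescued') by the tumor RNA."""
    dna_af = dna_alt_reads / dna_depth if dna_depth else 0.0
    rna_af = rna_alt_reads / rna_depth if rna_depth else 0.0
    if dna_af >= min_af and dna_alt_reads >= min_support:
        return "somatic"
    # Rescue: low DNA allelic frequency, but the RNA shows the same allele.
    if dna_alt_reads >= 1 and rna_af >= min_af and rna_alt_reads >= min_support:
        return "somatic_rna_rescued"
    return "reject"
```

The rescue branch captures the paper's central point: a variant with a single supporting DNA read would normally be filtered out, but matching RNA reads raise its credibility.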

  6. Integrity analysis of an upper guide structure flange

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ki Hyoung; Kang, Sung Sik; Jhung, Myung Jo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-10-15

    The integrity assessment of reactor vessel internals should be conducted in the design process to secure the safety of nuclear power plants. Various loads such as self-weight, seismic load, flow-induced load, and preload are applied to the internals. Therefore, the American Society of Mechanical Engineers (ASME) Code, Section III, defines the stress limit for reactor vessel internals. The present study focused on structural response analyses of the upper guide structure upper flange. The distributions of the stress intensity in the flange body were analyzed under various design load cases during normal operation. The allowable stress intensities along the expected sections of stress concentration were derived from the results of the finite element analysis for evaluating the structural integrity of the flange design. Furthermore, seismic analyses of the upper flange were performed to identify dynamic behavior with respect to the seismic and impact input. The mode superposition and full transient methods were used to perform time–history analyses, and the displacement at the lower end of the flange was obtained. The effect of the damping ratio on the response of the flange was also evaluated, and the acceleration was obtained. The results of elastic and seismic analyses in this study will be used as basic information to judge whether a flange design meets the acceptance criteria.
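
    The damping-ratio effect on a time-history response can be illustrated with the closed-form free vibration of an underdamped single-degree-of-freedom oscillator (a generic textbook model, not the flange finite element model from the paper):

```python
import math

def damped_displacement(t, x0, zeta, omega):
    """Free-vibration response of an underdamped SDOF oscillator with
    x(0) = x0 and v(0) = 0; the exponential envelope decays faster as the
    damping ratio zeta grows."""
    wd = omega * math.sqrt(1.0 - zeta ** 2)   # damped natural frequency
    envelope = math.exp(-zeta * omega * t)
    return envelope * (math.cos(wd * t)
                       + (zeta * omega / wd) * math.sin(wd * t)) * x0
```

Evaluating this at successive damped periods shows how raising the damping ratio shrinks the displacement peaks, which is the trend a damping-ratio sensitivity study probes.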

  7. Development of safety analysis technology for integral reactor

    International Nuclear Information System (INIS)

    Kim, Hee Cheol; Kim, K. K.; Kim, S. H.

    2002-04-01

    A state-of-the-art review of integral reactors was performed to investigate their safety features. The safety and performance of SMART were assessed using the technologies developed during the study. For this purpose, a computer code system and an analysis methodology were developed, and safety and performance analyses of the SMART basic design were carried out for the design basis events and accidents. Experimental facilities were designed for the core flow distribution test and the self-pressurizing pressurizer performance test. Tests on two-phase critical flow with non-condensable gas were completed, and the results were used to assess the critical flow model. A Probabilistic Safety Assessment (PSA) was carried out to evaluate the safety level and to optimize the design by identifying and remedying any weaknesses in the design. A joint study with KINS was carried out to promote the licensing environment. The generic safety issues of integral reactors were identified and solutions were formulated. The economic evaluation of the SMART desalination plant and activities related to process control were also carried out within the scope of the study.

  8. Visual Data Analysis as an Integral Part of Environmental Management

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Joerg; Bethel, E. Wes; Horsman, Jennifer L.; Hubbard, Susan S.; Krishnan, Harinarayan; Romosan, Alexandru; Keating, Elizabeth H.; Monroe, Laura; Strelitz, Richard; Moore, Phil; Taylor, Glenn; Torkian, Ben; Johnson, Timothy C.; Gorton, Ian

    2012-10-01

    The U.S. Department of Energy's (DOE) Office of Environmental Management (DOE/EM) currently supports an effort to understand and predict the fate of nuclear contaminants and their transport in natural and engineered systems. Geologists, hydrologists, physicists and computer scientists are working together to create models of existing nuclear waste sites, to simulate their behavior and to extrapolate it into the future. We use visualization as an integral part of each step in this process. In the first step, visualization is used to verify model setup and to estimate critical parameters. High-performance computing simulations of contaminant transport produce massive amounts of data, which are then analyzed using visualization software specifically designed for parallel processing of large amounts of structured and unstructured data. Finally, simulation results are validated by comparing them to measured current and historical field data. We describe in this article how visual analysis is used as an integral part of the decision-making process in the planning of ongoing and future treatment options for the contaminated nuclear waste sites. Lessons learned from visually analyzing our large-scale simulation runs will also have an impact on deciding on treatment measures for other contaminated sites.

  9. Integral finite element analysis of turntable bearing with flexible rings

    Science.gov (United States)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper suggests a method to calculate the internal load distribution and contact stress of a thrust angular-contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and the plastic deformation of the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. On this basis, the nonlinear contact between the rolling elements and the inner and outer ring raceways is modeled as a nonlinear compression spring, and an integral finite element model of the bearing, including its support structure, is established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the load distribution, inner and outer ring stress, contact stress and other finite element results with traditional bearing theory, which provides guidance for improving slewing bearing design.
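
    The nonlinear-spring idealization can be sketched with a rigid-ring load-distribution model, assuming a Hertzian point-contact law Q = K·δ^1.5 (the stiffness K, ball count, and bisection solver below are illustrative choices, not the paper's FEA):

```python
import math

def ball_loads(F, K=1.0e9, n_balls=12, tol=1e-9):
    """Rigid-ring radial load distribution for a ball bearing: each ball acts
    as a nonlinear spring Q = K * delta**1.5, and the ring displacement
    delta0 is found by bisection so the load components balance F."""
    angles = [2 * math.pi * i / n_balls for i in range(n_balls)]

    def resultant(delta0):
        tot = 0.0
        for psi in angles:
            d = delta0 * math.cos(psi)      # ball compression, if positive
            if d > 0:
                tot += K * d ** 1.5 * math.cos(psi)
        return tot

    lo, hi = 0.0, 1.0
    while resultant(hi) < F:                # grow bracket until it spans F
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if resultant(mid) < F:
            lo = mid
        else:
            hi = mid
    delta0 = 0.5 * (lo + hi)
    return [K * max(delta0 * math.cos(psi), 0.0) ** 1.5 for psi in angles]
```

The ball under the load line carries the peak load, balls outside the loaded zone carry none, and the loads sum to the applied force, which is the equilibrium the paper's FE model resolves with flexible rings instead of a rigid one.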

  10. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  11. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu

    2007-03-01

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  12. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
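
    The core process-mining step, summarizing each case as its ordered screen transitions and counting how often each pattern recurs, can be sketched as follows (the event-log structure and screen names are hypothetical):

```python
from collections import Counter

def transition_patterns(event_log):
    """Summarize each patient case as its ordered sequence of screen
    transitions, then count how often each pattern occurs across cases.
    event_log: {case_id: [screen, screen, ...]} (hypothetical structure)."""
    patterns = Counter()
    for case_id, screens in event_log.items():
        pattern = tuple(zip(screens, screens[1:]))  # consecutive transitions
        patterns[pattern] += 1
    return patterns

# Toy log: two cases share a pattern, one is unique.
log = {
    "case1": ["Chart", "Labs", "Notes"],
    "case2": ["Chart", "Labs", "Notes"],
    "case3": ["Chart", "Orders"],
}
counts = transition_patterns(log)
```

Ranking `counts` by frequency is what surfaces findings like "the 15 most frequent patterns cover 53% of cases."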

  13. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data lie reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method for the soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
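
    A minimal sketch of the non-stationary covariance idea, assuming a Gibbs-type squared-exponential kernel whose length scale varies with position (the actual kernel and geometry used on HL-2A may differ):

```python
import math

def gibbs_kernel(x1, x2, length):
    """Non-stationary squared-exponential (Gibbs) covariance: the length
    scale length(x) varies with position, letting a Gaussian process prior
    adapt to locally smooth or locally structured regions of a profile."""
    l1, l2 = length(x1), length(x2)
    s = l1 * l1 + l2 * l2
    return math.sqrt(2.0 * l1 * l2 / s) * math.exp(-((x1 - x2) ** 2) / s)
```

With a constant `length` function this reduces to the ordinary stationary squared-exponential kernel; a position-dependent `length` shortens the correlation where the emission changes rapidly, which is what improves the reconstructions.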

  14. System integration and control strategy analysis of PEMFC car

    International Nuclear Information System (INIS)

    Sun, L.; Chen, Y.; Liu, Y.; Shi, P.

    2004-01-01

    A new fuel cell car was designed based on the prototype LN2000 hydrogen-oxygen fuel cell car. The new prototype consists of a compact fuel cell engine with a separated fuel cell stack, a nickel metal hydride battery, a motor with a power rating of 30 kW/100 kW and a high-efficiency inverter. Within the powertrain, a two-shift planetary gear transmission was employed. The power performance was greatly improved. The new battery with EMS, the new self-developed fuel cell engine, the motor propulsion system and the electronically controlled transmission make it feasible to control the whole fuel cell car automatically and efficiently with optimization. The paper presents the system integration and the control strategy analysis of the fuel cell car prototype, and can serve as a reference for engineers in the field of fuel cell vehicles. (author)

  15. Urban Integrated Industrial Cogeneration Systems Analysis. Phase II final report

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-01

    Through the Urban Integrated Industrial Cogeneration Systems Analysis (UIICSA), the City of Chicago embarked upon an ambitious effort to identify and measure the overall industrial cogeneration market in the city and to evaluate in detail the most promising market opportunities. This report discusses the background of the work completed during Phase II of the UIICSA and presents the results of economic feasibility studies conducted for three potential cogeneration sites in Chicago. Phase II focused on the feasibility of cogeneration at the three most promising sites: the Stockyards and Calumet industrial areas, and the Ford City commercial/industrial complex. Each feasibility case study considered the energy load requirements of the existing facilities at the site and the potential for attracting and serving new growth in the area. Alternative fuels and technologies, and ownership and financing options were also incorporated into the case studies. Finally, site specific considerations such as development incentives, zoning and building code restrictions and environmental requirements were investigated.

  16. Neutronics analysis for integration of ITER diagnostics port EP10

    Energy Technology Data Exchange (ETDEWEB)

    Colling, Bethany, E-mail: bethany.colling@ccfe.ac.uk [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Department of Engineering, Lancaster University, Lancashire LA1 4YR (United Kingdom); Eade, Tim [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Joyce, Malcolm J. [Department of Engineering, Lancaster University, Lancashire LA1 4YR (United Kingdom); Pampin, Raul; Seyvet, Fabien [Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, 08019 Barcelona (Spain); Turner, Andrew [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Udintsev, Victor [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St. Paul Lez Durance Cedex (France)

    2016-11-01

    Shutdown dose rate calculations have been performed on an integrated ITER C-lite neutronics model with equatorial port 10. A ‘fully shielded’ configuration, optimised for a given set of diagnostic designs (i.e. shielding in all available space within the port plug drawers), results in a shutdown dose rate in the port interspace, from the activation of materials comprising equatorial port 10, in excess of 2000 μSv/h. Achieving dose rates of 100 μSv/h or less, as required in areas where hands-on maintenance can be performed, in the port interspace region will be challenging. A combination of methods will need to be implemented, such as reducing mass and/or the use of reduced activation steel in the port interspace, optimisation of the diagnostic designs and shielding of the port interspace floor. Further analysis is required to test these options and the ongoing design optimisation of the EP10 diagnostic systems.

  17. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerful...

  18. Life-cycle analysis of product integrated polymer solar cells

    DEFF Research Database (Denmark)

    Espinosa Martinez, Nieves; García-Valverde, Rafael; Krebs, Frederik C

    2011-01-01

    A life cycle analysis (LCA) on a product integrated polymer solar module is carried out in this study. These assessments are well-known to be useful in developmental stages of a product in order to identify the bottlenecks for the up-scaling in its production phase for several aspects spanning from economics through design to functionality. An LCA study was performed to quantify the energy use and greenhouse gas (GHG) emissions from electricity use in the manufacture of a light-weight lamp based on a plastic foil, a lithium-polymer battery, a polymer solar cell, printed circuitry, blocking diode, switch and a white light emitting semiconductor diode. The polymer solar cell employed in this prototype presents a power conversion efficiency in the range of 2 to 3% yielding energy payback times (EPBT) in the range of 1.3–2 years. Based on this it is worthwhile to undertake a life-cycle study...
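
    The energy payback time arithmetic behind such an LCA can be sketched as embedded manufacturing energy divided by the energy the module returns per year (the irradiance, area, and performance ratio below are illustrative placeholders, not the paper's inventory data):

```python
def energy_payback_time(embedded_energy_mj, pce, irradiance_kwh_m2_yr=1700,
                        area_m2=1.0, performance_ratio=0.8):
    """Energy payback time in years: embedded (manufacturing) energy divided
    by the energy the module delivers per year. All default inputs are
    illustrative assumptions."""
    annual_output_kwh = irradiance_kwh_m2_yr * area_m2 * pce * performance_ratio
    annual_output_mj = annual_output_kwh * 3.6      # 1 kWh = 3.6 MJ
    return embedded_energy_mj / annual_output_mj
```

Because the output scales linearly with power conversion efficiency, a 2% module pays back its embedded energy 1.5 times slower than a 3% module with the same inventory, which is why the paper reports the EPBT as a range tied to the 2-3% efficiency range.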

  19. Integrated aerosol and thermalhydraulics modelling for CANDU safety analysis

    International Nuclear Information System (INIS)

    McDonald, B.H.; Hanna, B.N.

    1990-08-01

    Analysis of postulated accidents in CANDU reactors that could result in severe fuel damage requires the ability to model the formation of aerosols containing fission product materials and the transport of these aerosols from the fuel, through containment, to any leak to the atmosphere. Best-estimate calculations require intimate coupling and simultaneous solution of all the equations describing the entire range of physical and chemical phenomena involved. The prototype CATHENA/PACE-3D has been developed for integrated calculation of thermalhydraulic and aerosol events in a CANDU reactor during postulated accidents. Examples demonstrate the ability of CATHENA/PACE-3D to produce realistic flow and circulation patterns and reasonable accuracy in solution of two simple fluid-flow test cases for which analytical solutions exist

  20. Strategic Technology Investment Analysis: An Integrated System Approach

    Science.gov (United States)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results are satisfying the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  1. The IMBA suite: integrated modules for bioassay analysis

    Energy Technology Data Exchange (ETDEWEB)

    Birchall, A.; Jarvis, N.S.; Peace, M.S.; Riddell, A.E.; Battersby, W.P

    1998-07-01

    The increasing complexity of models representing the biokinetic behaviour of radionuclides in the body following intake poses problems for people who are required to implement these models. The problem is exacerbated by the current paucity of suitable software. In order to remedy this situation, a collaboration between British Nuclear Fuels, Westlakes Research Institute and the National Radiological Protection Board has started with the aim of producing a suite of modules for estimating intakes and doses from bioassay measurements using the new ICRP models. Each module will have a single purpose (e.g. to calculate respiratory tract deposition) and will interface with other software using data files. The elements to be implemented initially are plutonium, uranium, caesium, iodine and tritium. It is intended to make the software available to other parties under terms yet to be decided. This paper describes the proposed suite of integrated modules for bioassay analysis, IMBA. (author)

  2. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE) FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available Geographic Information Systems have brought greater possibilities to the representation and interpretation of the landscape, as well as to integrated analysis. However, this approach does not dispense with technical and methodological substantiation when moving to the computational universe. This work is grounded in ecodynamics and in empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of multi-criteria analysis and a fuzzy logic model of environmental fragility, taking as a case study the Basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed for a reduction in the influence of subjectivism on the decision criteria, whose factors can be given cartographic expression, respecting the complexity of the integrated landscape.

  3. Functional Module Analysis for Gene Coexpression Networks with Network Integration.

    Science.gov (United States)

    Zhang, Shuqin; Zhao, Hongyu; Ng, Michael K

    2015-01-01

    Network has been a general tool for studying the complex interactions between different genes, proteins, and other small molecules. Module as a fundamental property of many biological networks has been widely studied and many computational methods have been proposed to identify the modules in an individual network. However, in many cases, a single network is insufficient for module analysis due to the noise in the data or the tuning of parameters when building the biological network. The availability of a large amount of biological networks makes network integration study possible. By integrating such networks, more informative modules for some specific disease can be derived from the networks constructed from different tissues, and consistent factors for different diseases can be inferred. In this paper, we have developed an effective method for module identification from multiple networks under different conditions. The problem is formulated as an optimization model, which combines the module identification in each individual network and alignment of the modules from different networks together. An approximation algorithm based on eigenvector computation is proposed. Our method outperforms the existing methods, especially when the underlying modules in multiple networks are different in simulation studies. We also applied our method to two groups of gene coexpression networks for humans, which include one for three different cancers, and one for three tissues from the morbidly obese patients. We identified 13 modules with three complete subgraphs, and 11 modules with two complete subgraphs, respectively. The modules were validated through Gene Ontology enrichment and KEGG pathway enrichment analysis. We also showed that the main functions of most modules for the corresponding disease have been addressed by other researchers, which may provide the theoretical basis for further studying the modules experimentally.
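
    The eigenvector-based step can be sketched on a toy pair of networks (the averaging used for integration and the plain power iteration below are simplified stand-ins for the paper's optimization model):

```python
def leading_eigenvector(matrix, iters=200):
    """Power iteration for the leading eigenvector of a symmetric
    non-negative matrix; the relative magnitudes of its entries give a
    first indication of which nodes form a dense module."""
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Integrate two noisy condition-specific networks by averaging adjacency.
net_a = [[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 0, 0]]
net_b = [[0, 1, 1, 0], [1, 0, 1, 1], [1, 1, 0, 0], [0, 1, 0, 0]]
merged = [[(net_a[i][j] + net_b[i][j]) / 2 for j in range(4)] for i in range(4)]
v = leading_eigenvector(merged)
```

Nodes 0-2 form a triangle in both networks and receive large eigenvector weight in the merged matrix, while node 3, supported by only one network, does not; integrating the networks thus suppresses edges that appear in a single noisy condition.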

  4. The Holistic Integrity Test (HIT) – quantified resilience analysis

    Directory of Open Access Journals (Sweden)

    Dobson Mike

    2016-01-01

    Full Text Available The Holistic Integrity Test (HIT) – Quantified Resilience Analysis. Rising sea levels and wider climate change mean we face an increasing risk from flooding and other natural hazards. Tough economic times make it difficult to economically justify or afford the desired level of engineered risk reduction. Add to this significant uncertainty from a range of future predictions, constantly updated with new science. We therefore need to understand not just how to reduce the risk, but what could happen should above-design-standard events occur. In flood terms this includes not only the direct impacts (damage and loss of life), but the wider cascade impacts to infrastructure systems and the longer-term impacts on the economy and society. However, understanding the "what if" is only the first part of the equation; a range of improvement measures to mitigate such effects need to be identified and implemented. These measures should consider reducing the risk, lessening the consequences, aiding the response, and speeding up the recovery. However, they need to be objectively assessed through quantitative analysis, which underpins them technically and economically. Without such analysis, it cannot be predicted how measures will perform if extreme events occur. It is also vital to consider all possible hazards, as measures for one hazard may hinder the response to another. The Holistic Integrity Test (HIT) uses quantitative system analysis and "HITs" the site, its infrastructure, contained dangers and wider regional system to determine how it copes with a range of severe shock events, Before, During and After the event, whilst also accounting for uncertainty (as illustrated in figure 1). First explained at the TINCE 2014 Nuclear Conference in Paris in terms of a nuclear facility needing to analyse the site in response to post-Fukushima needs, the HIT is, however, universally applicable. The HIT has three key risk reduction goals:

  5. Integration

    DEFF Research Database (Denmark)

    Emerek, Ruth

    2004-01-01

    The contribution discusses the different conceptions of integration in Denmark, and what can be understood by successful integration.

  6. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

    Full Text Available The purpose of the article is to provide a research tool for an initial assessment of whether a company's integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, and its empirical verification. In particular, the research addresses whether the reports meet the goal of improving the quality of information available and covering all factors that influence the organization's ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out for reports prepared by selected companies operating in the utilities sector. Answering questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. As a result of the analysis of selected integrated reports, it was stated that various elements of the report are presented with different levels of accuracy in different reports. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without making a link to other parts of the report – e.g. risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  7. Integrating PROOF Analysis in Cloud and Batch Clusters

    International Nuclear Information System (INIS)

    Rodríguez-Marrero, Ana Y; Fernández-del-Castillo, Enol; López García, Álvaro; Marco de Lucas, Jesús; Matorras Weinig, Francisco; González Caballero, Isidro; Cuesta Noriega, Alberto

    2012-01-01

    High Energy Physics (HEP) analyses are becoming more complex and demanding due to the large amount of data collected by current experiments. The Parallel ROOT Facility (PROOF) provides researchers with an interactive tool to speed up the analysis of huge volumes of data by exploiting parallel processing on both multicore machines and computing clusters. The typical PROOF deployment scenario is a permanent set of cores configured to run the PROOF daemons. However, this approach is incapable of adapting to the dynamic nature of interactive usage. Several initiatives seek to improve the use of computing resources by integrating PROOF with a batch system, such as PROOF on Demand (PoD) or PROOF Cluster. These solutions are currently in production at Universidad de Oviedo and IFCA and are evaluated positively by users. Although they are able to adapt to the computing needs of users, they must comply with the specific configuration, OS and software installed at the batch nodes. Furthermore, they share the machines with other workloads, which may cause disruptions in the interactive service for users. These limitations make PROOF a typical use case for cloud computing. In this work we take advantage of the Cloud Infrastructure at IFCA in order to provide a dynamic PROOF environment where users can control the software configuration of the machines. The PROOF Analysis Framework (PAF) facilitates the development of new analyses and offers transparent access to PROOF resources. Several performance measurements are presented for the different scenarios (PoD, SGE and Cloud), showing a speed improvement closely correlated with the number of cores used.

  8. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  9. Behavior and analysis of an integral abutment bridge.

    Science.gov (United States)

    2013-08-01

    As a result of abutment spalling on the integral abutment bridge over 400 South Street in Salt Lake City, Utah, the Utah Department of Transportation (UDOT) instigated research measures to better understand the behavior of integral abutment bridges. ...

  10. Reverse Engineering Integrated Circuits Using Finite State Machine Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oler, Kiri J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Carl H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-12

    In this paper, we present a methodology for reverse engineering integrated circuits, including a mathematical verification of a scalable algorithm used to generate minimal finite state machine representations of integrated circuits.
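
The paper's own scalable algorithm and its verification are not reproduced in this record. As a classical stand-in for generating minimal state-machine representations, the sketch below implements Moore's partition-refinement DFA minimization; the example automaton is invented for illustration:

```python
def minimize_dfa(states, alphabet, delta, accepting):
    """Minimize a DFA via partition refinement (Moore's algorithm).
    delta maps (state, symbol) -> state."""
    # Start by separating accepting from non-accepting states.
    partition = [p for p in (set(accepting), set(states) - set(accepting)) if p]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states whose transitions land in the same blocks.
            sig = {}
            for s in block:
                key = tuple(next(i for i, b in enumerate(partition)
                                 if delta[(s, a)] in b) for a in alphabet)
                sig.setdefault(key, set()).add(s)
            new_partition.extend(sig.values())
            if len(sig) > 1:
                changed = True
        partition = new_partition
    return partition
```

Each block of the final partition is one state of the minimal machine; states 1 and 2 below are merged because both transition to the accepting state.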

  11. Integrating Biological Perspectives:. a Quantum Leap for Microarray Expression Analysis

    Science.gov (United States)

    Wanke, Dierk; Kilian, Joachim; Bloss, Ulrich; Mangelsen, Elke; Supper, Jochen; Harter, Klaus; Berendzen, Kenneth W.

    2009-02-01

    Biologists and bioinformaticians cope with the analysis of transcript abundance and the extraction of meaningful information from microarray expression data. By exploiting biological information accessible in public databases, we try to extend current knowledge of the plant model organism Arabidopsis thaliana. Here, we give two examples of increasing the quality of information gained from large-scale expression experiments by integrating microarray-unrelated biological information. First, we use Arabidopsis microarray data to demonstrate that expression profiles are usually conserved between orthologous genes of different organisms. In an initial step of the analysis, orthology has to be inferred unambiguously, which then allows comparison of expression profiles between orthologs. We make use of the publicly available microarray expression data of Arabidopsis and barley, Hordeum vulgare. We found a generally positive correlation in expression trajectories between true orthologs, although the two organisms are only distantly related on an evolutionary time scale. Second, the extraction of clusters of co-regulated genes implies similarities in transcriptional regulation via similar cis-regulatory elements (CREs). The inverse approach, in which co-regulated gene clusters are sought by investigating CREs, has generally not been successful. Nonetheless, in some cases the presence of CREs in a defined position, orientation or combination is positively correlated with co-regulated gene clusters. Here, we make use of genes involved in the phenylpropanoid biosynthetic pathway to give one positive example of this approach.
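
The "positive correlation in expression trajectories" between orthologs is typically quantified with the Pearson coefficient over matched conditions. A minimal stdlib sketch (the two expression profiles in the test are invented, not the Arabidopsis/barley data):

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two expression profiles of equal length."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

A coefficient near +1 across conditions is what the authors report for true ortholog pairs; values near 0 or below would argue against conserved regulation.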

  12. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques, particularly the Pinch Analysis method, were applied to the sugarcane industry. Research was performed on harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor fuel grade ethanol and standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used to assess internal heat recovery as well as external utility demand targets, while keeping the lowest economically achievable targets for entropy increase. Efficiency of energy use was evaluated for the plant as found (the base case) as well as for five selected process and/or plant design modifications, always with guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while keeping the raw material supply of the base case; neither proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without reducing cogenerated power. (author)
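
The utility targets behind such a study come from the standard Problem Table Algorithm: shift stream temperatures by ΔTmin/2, balance each temperature interval, and cascade heat downwards. A sketch follows; the four streams and ΔTmin = 10 K in the test are textbook-style illustrative data, not the plant data from this study:

```python
def pinch_targets(hot, cold, dt_min):
    """Problem Table Algorithm: minimum hot/cold utility and shifted pinch
    temperature. hot/cold are lists of (T_supply, T_target, CP), CP in kW/K."""
    # Shift hot streams down and cold streams up by dt_min/2.
    shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, +1) for ts, tt, cp in hot]
    shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, -1) for ts, tt, cp in cold]
    temps = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    # Net heat surplus (+) or deficit (-) in each temperature interval.
    deficits = []
    for hi, lo in zip(temps, temps[1:]):
        net = 0.0
        for ts, tt, cp, sign in shifted:
            if max(ts, tt) >= hi and min(ts, tt) <= lo:  # stream spans interval
                net += sign * cp * (hi - lo)
        deficits.append(net)
    # Cascade the surplus downwards; the most negative point sets Q_hot.
    cascade, h = [0.0], 0.0
    for d in deficits:
        h += d
        cascade.append(h)
    q_hot = -min(cascade)
    q_cold = cascade[-1] + q_hot
    return q_hot, q_cold, temps[cascade.index(min(cascade))]
```

The returned pinch temperature is in the shifted scale; real hot/cold pinch temperatures sit ΔTmin/2 above and below it.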

  13. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test, and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: (1) including one specified sandpaper in impact testing among all the participants, (2) diversifying liquid test methods for selected participants, and (3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  14. Integrated Genomic Analysis of the Ubiquitin Pathway across Cancer Types

    Directory of Open Access Journals (Sweden)

    Zhongqi Ge

    2018-04-01

    Summary: Protein ubiquitination is a dynamic and reversible process of adding single ubiquitin molecules or various ubiquitin chains to target proteins. Here, using multidimensional omic data of 9,125 tumor samples across 33 cancer types from The Cancer Genome Atlas, we perform comprehensive molecular characterization of 929 ubiquitin-related genes and 95 deubiquitinase genes. Among them, we systematically identify top somatic driver candidates, including mutated FBXW7 with cancer-type-specific patterns and amplified MDM2 showing a mutually exclusive pattern with BRAF mutations. Ubiquitin pathway genes tend to be upregulated in cancer mediated by diverse mechanisms. By integrating pan-cancer multiomic data, we identify a group of tumor samples that exhibit worse prognosis. These samples are consistently associated with the upregulation of cell-cycle and DNA repair pathways, characterized by mutated TP53, MYC/TERT amplification, and APC/PTEN deletion. Our analysis highlights the importance of the ubiquitin pathway in cancer development and lays a foundation for developing relevant therapeutic strategies. : Ge et al. analyze a cohort of 9,125 TCGA samples across 33 cancer types to provide a comprehensive characterization of the ubiquitin pathway. They detect somatic driver candidates in the ubiquitin pathway and identify a cluster of patients with poor survival, highlighting the importance of this pathway in cancer development. Keywords: ubiquitin pathway, pan-cancer analysis, The Cancer Genome Atlas, tumor subtype, cancer prognosis, therapeutic targets, biomarker, FBXW7

  15. Penalized differential pathway analysis of integrative oncogenomics studies

    NARCIS (Netherlands)

    van Wieringen, W.N.; van de Wiel, M.A.

    2014-01-01

    Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving

  16. An integrated internal flow analysis for ramjet propulsion system

    Science.gov (United States)

    Hsieh, Shih-Yang

    An integrated numerical analysis has been conducted to study the ramjet internal flowfield. Emphasis is placed on the establishment of a unified numerical scheme and accurate representation of the internal flow development. The theoretical model is based on the complete conservation equations of mass, momentum, energy, and species concentration, with consideration of finite-rate chemical reactions and variable properties. Turbulence closure is achieved using a low-Reynolds-number k-epsilon two-equation model. A new computational procedure capable of treating time-accurate, chemically reacting flows over a wide range of Mach numbers was developed. This numerical scheme allows for a unified treatment of the entire flowfield in a ramjet engine, including both the supersonic inlet and the combustion chamber. The algorithm is based on scaling the pressure terms in the momentum equations and preconditioning the conservation equations to circumvent numerical difficulties at low Mach numbers. The resulting equations are solved using the lower-upper (LU) factorization method in a fully coupled manner, with a flux-differencing upwind TVD scheme incorporated to achieve high-order spatial accuracy. The transient behavior of the modeled system is preserved through the dual time-stepping integration technique. Calculations have been carried out for the flowfield in a typical ramjet engine consisting of an axisymmetric mixed-compression supersonic inlet and a coaxial dump combustor. Distinct shock structures in the forward section of the inlet were clearly captured. The boundary-layer thickening and flow separation behind the terminal shock, due to shock/boundary-layer interactions and the inlet configuration, were observed. The mutual coupling between the inlet and combustor was carefully examined. In particular, strong vortices arising from the inlet shock/acoustic and shock/boundary-layer interactions may convect downstream and affect the combustion

  17. Packaged integrated opto-fluidic solution for harmful fluid analysis

    Science.gov (United States)

    Allenet, T.; Bucci, D.; Geoffray, F.; Canto, F.; Couston, L.; Jardinier, E.; Broquin, J.-E.

    2016-02-01

    Advances in nuclear fuel reprocessing have led to a surging need for novel chemical analysis tools. In this paper, we present as a solution a packaged lab-on-chip approach co-integrating optical and micro-fluidic functions on a glass substrate. A chip was built and packaged to obtain light/fluid interaction so that the device can make spectral measurements based on absorption spectroscopy. The interaction between the analyte solution and light takes place at the boundary between a waveguide and a fluidic micro-channel, thanks to the evanescent part of the waveguide's guided mode that propagates into the fluid. The waveguide was obtained via ion exchange on a glass wafer. The input and output of the waveguides were pigtailed with standard single-mode optical fibers. The micro-scale fluid channel was fabricated with a lithography procedure and hydrofluoric acid wet etching, resulting in a 150+/-8 μm deep channel. The channel was designed with fluidic accesses so that the chip is compatible with commercial fluidic interfaces/chip mounts. This allows analyte fluid in external capillaries to be pumped into the device through micro-pipes, resulting in a fully packaged chip. To produce this co-integrated structure, two substrates were bonded. A study of direct glass wafer-to-wafer molecular bonding was carried out to improve detector sturdiness and durability, and put forward a bonding protocol with a bonding surface energy of γ>2.0 J.m-2. Detector viability was shown by obtaining optical mode measurements and detecting traces of 1.2 M neodymium (Nd) solute in 12+/-1 μL of 0.01 M, pH 2 nitric acid (HNO3) solvent, via an absorption peak specific to neodymium at 795 nm.
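
The absorption measurement rests on the Beer-Lambert law, A = ε·l·c. For an evanescent-wave sensor like this one, l is an effective interaction length rather than the physical channel length; the numbers in the test below are arbitrary illustrations, not the chip's parameters:

```python
def absorbance(molar_absorptivity, path_length_cm, concentration_mol_l):
    """Beer-Lambert law: decadic absorbance A = epsilon * l * c."""
    return molar_absorptivity * path_length_cm * concentration_mol_l

def transmitted_fraction(a):
    """Fraction of light transmitted through the analyte, I/I0 = 10^-A."""
    return 10.0 ** (-a)
```

In practice one measures I/I0 at the 795 nm Nd peak and inverts the relation to recover the concentration c.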

  18. Harmonic analysis in integrated energy system based on compressed sensing

    International Nuclear Information System (INIS)

    Yang, Ting; Pen, Haibo; Wang, Dan; Wang, Zhaoxia

    2016-01-01

    Highlights: • We propose a harmonic/inter-harmonic analysis scheme based on compressed sensing theory. • The sparseness of harmonic signals in electrical power systems is proved. • A ratio formula for the sparsity of fundamental and harmonic components is presented. • The Spectral Projected Gradient with Fundamental Filter reconstruction algorithm is proposed. • SPG-FF enhances the precision of harmonic detection and signal reconstruction. - Abstract: The advent of Integrated Energy Systems has enabled various distributed energy resources to access the system through different power electronic devices, making the harmonic environment more complex. Low-complexity, high-precision harmonic detection and analysis methods are needed to improve power quality. To overcome the large data storage requirements and high compression complexity of sampling under the Nyquist framework, this paper presents a harmonic analysis scheme based on compressed sensing theory. The proposed scheme performs compressive sampling, signal reconstruction and harmonic detection simultaneously. First, the sparsity of harmonic signals in the Discrete Fourier Transform (DFT) basis is numerically calculated, followed by a proof that the necessary conditions for compressed sensing are satisfied. A binary sparse measurement matrix is then leveraged to reduce the storage space of the sampling unit. In the recovery process, a novel reconstruction algorithm, Spectral Projected Gradient with Fundamental Filter (SPG-FF), is proposed to enhance reconstruction precision. An actual microgrid system is used as a simulation example. The experimental results show that the proposed scheme effectively enhances the precision of harmonic and inter-harmonic detection with low computing complexity, and has good
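
The SPG-FF algorithm itself is not given in this record, but its premise, that a power-system waveform is sparse in the DFT basis, is easy to check numerically. The sketch below uses a naive DFT on an invented signal (fundamental plus a weaker 5th harmonic, sampled coherently over one window):

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2), adequate for a sketch)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

n = 64
# Fundamental plus a 0.2-amplitude 5th harmonic over one coherent window.
signal = [math.sin(2 * math.pi * t / n) + 0.2 * math.sin(2 * math.pi * 5 * t / n)
          for t in range(n)]
spectrum = dft(signal)
# Only a handful of bins carry energy: the signal is sparse in the DFT basis.
significant = [k for k, c in enumerate(spectrum) if abs(c) > 1.0]
```

Only 4 of 64 coefficients survive thresholding (bins 1 and 5 plus their conjugate mirrors), which is exactly the sparsity that makes compressed-sensing recovery with far fewer measurements possible.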

  19. [Integrity].

    Science.gov (United States)

    Gómez Rodríguez, Rafael Ángel

    2014-01-01

    To say that someone possesses integrity is to claim that that person's responses to specific situations are largely predictable, and that he or she can judge prudently and act correctly. There is a close interrelationship between integrity and autonomy, and autonomy rests on the deeper moral claim of all humans to integrity of the person. Integrity has two senses of significance for medical ethics: one refers to the integrity of the person in its bodily, psychosocial and intellectual elements; in the second sense, integrity is a virtue. Another facet of integrity of the person is the integrity of the values we cherish and espouse. The physician must be a person of integrity if the integrity of the patient is to be safeguarded. Autonomy has reduced violations in the past, but the character and virtues of the physician are the ultimate safeguard of the patient's autonomy. A very important field in medicine is scientific research. It is the character of the investigator that determines the moral quality of research. The problem arises when legitimate self-interest is replaced by selfishness, particularly when human subjects are involved. The final safeguard of the moral quality of research is the character and conscience of the investigator. Teaching must be relevant in the scientific field, but the most effective way to teach virtue ethics is through the example of a respected scientist.

  20. From organizational integration to clinical integration: analysis of the path between one level of integration to another using official documents

    Science.gov (United States)

    Mandza, Matey; Gagnon, Dominique; Carrier, Sébastien; Belzile, Louise; Demers, Louis

    2010-01-01

    Purpose Services integration comprises organizational, normative, economic, informational and clinical dimensions. Since 2004, the province of Quebec has devoted significant efforts to unifying the governance of the main health and social care organizations in its various territories. Notwithstanding the uniformity of the national plan's prescription, territorial integration modalities vary greatly across the province. Theory This research is based upon a conceptual model of integration comprising six components: inter-organizational partnership, case management, standardized assessment, a single entry point, a standardized service planning tool and a shared clinical file. Methods We conducted an embedded case study in six sites contrasting in their level of integration. All documents prescribing the implementation of integration were retrieved and analyzed. Results and conclusions The analyzed documents demonstrate a growing local appropriation of the current integrative reform. Interestingly, however, no link seems to exist between the quality of local prescriptions and the level of integration achieved in each site. This finding leads us to hypothesize that the variable quality of the operational support offered in implementing these prescriptions is a variable in play.

  1. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, the ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and material degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion.

  2. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    1996-01-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, the ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and material degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion.
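
The probabilistic models discussed at such workshops often reduce, in their simplest form, to estimating the probability that a load exceeds a resistance. A minimal load-resistance Monte Carlo sketch, with both quantities assumed normal (illustrative values, not from the workshop papers):

```python
import math
import random

def failure_probability_mc(mu_load, sd_load, mu_res, sd_res,
                           n=200_000, seed=42):
    """Monte Carlo estimate of P(load S > resistance R), S and R normal."""
    rng = random.Random(seed)
    failures = sum(rng.gauss(mu_load, sd_load) > rng.gauss(mu_res, sd_res)
                   for _ in range(n))
    return failures / n

def failure_probability_exact(mu_load, sd_load, mu_res, sd_res):
    """Closed form: the margin R - S is normal, so P_f = Phi(-beta)
    with reliability index beta = (mu_R - mu_S) / sqrt(sd_S^2 + sd_R^2)."""
    beta = (mu_res - mu_load) / math.hypot(sd_load, sd_res)
    return 0.5 * math.erfc(beta / math.sqrt(2))
```

The closed form exists only because both distributions are normal; with realistic defect-size and toughness distributions, the Monte Carlo route (or FORM/SORM approximations) is what probabilistic fracture mechanics codes actually use.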

  3. Canonical integration and analysis of periodic maps using non-standard analysis and Lie methods

    Energy Technology Data Exchange (ETDEWEB)

    Forest, E.; Berz, M.

    1988-06-01

    We describe a method and a way of thinking which is ideally suited to the study of systems represented by canonical integrators. Starting with the continuous description provided by the Hamiltonian, we replace it by a succession of preferably canonical maps. The power series representation of these maps can be extracted with a computer implementation of the tools of non-standard analysis, and analyzed by the same tools. For a nearly integrable system, we can define a Floquet ring in a way consistent with our needs. Using the finite-time maps, the Floquet ring is defined only at the locations s_i where one perturbs or observes the phase space. At most, the total number of locations equals the total number of steps of the integrator. We can also produce pseudo-Hamiltonians which describe the motion induced by these maps. 15 refs., 1 fig.
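
A concrete instance of replacing a continuous Hamiltonian flow by a succession of canonical maps is the leapfrog (kick-drift-kick) splitting: each step is exactly symplectic, so the energy error stays bounded rather than drifting. The pendulum Hamiltonian below is an illustration, not an example from the paper:

```python
import math

def leapfrog_step(q, p, dt):
    """One canonical (symplectic) map approximating the flow of the
    pendulum Hamiltonian H(q, p) = p^2/2 - cos(q)."""
    p -= 0.5 * dt * math.sin(q)   # half kick: dV/dq = sin(q)
    q += dt * p                   # drift with the updated momentum
    p -= 0.5 * dt * math.sin(q)   # half kick
    return q, p

def energy(q, p):
    """H(q, p) for the pendulum, used to monitor the map's quality."""
    return 0.5 * p * p - math.cos(q)
```

Composing many such maps is the discrete analogue of the paper's "succession of preferably canonical maps": over thousands of steps the energy oscillates within an O(dt^2) band instead of growing secularly.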

  4. Microblogging for Class: An Analysis of Affective, Cognitive, Personal Integrative, and Social Integrative Gratifications

    Science.gov (United States)

    Gant, Camilla; Hadley, Patrick D.

    2014-01-01

    This study shows that undergraduate students can gratify cognitive, affective, social integrative, and personal integrative needs microblogging via a learning management system discussion tool. Moreover, the researchers find that microblogging about news regarding mass media events and issues via Blackboard heightened engagement, expanded…

  5. An example of system integration for RCRA policy analysis

    International Nuclear Information System (INIS)

    Tonn, B.; Goeltz, R.; Schmidt, K.

    1991-01-01

    This paper describes the synthesis of various computer technologies and software systems used on a project to estimate the costs of remediating Solid Waste Management Units (SWMUs) that fall under the corrective action provisions of the Resource Conservation and Recovery Act (RCRA). The project used two databases compiled by Research Triangle Institute (RTI) that contain information on SWMUs, and a PC-based software system called CORA that develops cost estimates for remediating SWMUs. The project team developed rules to categorize every SWMU in the databases by the kinds of technologies required to clean them up. These results were input into CORA, which estimated the costs associated with the technologies. Early on, several computing challenges presented themselves. First, the databases have several hundred thousand records each. Second, the categorization rules could not be written to cover all combinations of variables. Third, CORA is run interactively and the analysis plan called for running CORA tens of thousands of times. Fourth, large data transfers needed to take place between RTI and Oak Ridge National Laboratory. Solutions to these problems required systems integration. SWMU categorization was streamlined by using the Internet, as was the data transfer. SAS was used to create files used by a program called SuperKey that was used to run CORA. Because the analysis plan required the generation of hundreds of thousands of cost estimates, memory management software was needed to allow the portable IBM P70 to do the job. During the course of the project, several other software packages were used, including: SAS System for Personal Computers (SAS/PC), DBase III, LOTUS 1-2-3, PIZAZZ PLUS, LOTUS Freelance Plus, and Word Perfect. Only the comprehensive use of all available hardware and software resources allowed this project to be completed within the time and budget constraints. 5 refs., 3 figs., 3 tabs
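
The rule-based categorization step can be sketched as a simple decision function. The field names, waste types and technology labels below are entirely hypothetical; the project's actual rules are not given in this record:

```python
def categorize_swmu(unit):
    """Toy rule base assigning a remediation-technology category to a
    SWMU record. All fields and categories here are invented for
    illustration; real rules covered far more variable combinations."""
    if unit.get("waste_type") == "solvent" and unit.get("medium") == "groundwater":
        return "pump-and-treat"
    if unit.get("waste_type") == "metals":
        return "excavation"
    if unit.get("medium") == "soil":
        return "soil-vapor-extraction"
    # The paper notes rules could not cover every combination:
    # unmatched records fall through to manual review.
    return "site-specific-review"
```

Running such a function over several hundred thousand database records, then feeding the resulting category counts to a costing tool, mirrors the batch pipeline the project built around CORA.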

  6. The Venetian Ghetto: Semantic Modelling for an Integrated Analysis

    Directory of Open Access Journals (Sweden)

    Alessandra Ferrighi

    2017-12-01

    In the digital era, historians are embracing information technology as a research tool. New technologies offer investigation and interpretation, synthesis and communication tools that are more effective than traditional study methods, as they guarantee a multidisciplinary approach and the integration of analyses. Among the available technologies, the best suited to the study of urban phenomena are databases (DB), Geographic Information Systems (GIS), Building Information Modelling (BIM) and multimedia tools (video, apps) for the dissemination of results. The case study described here concerns the analysis of the part of Venice that changed its appearance from 1516 onwards with the creation of the Jewish Ghetto, an event that would have repercussions throughout Europe, changing the course of history. Our research confirms that the exclusive use of any one of the systems mentioned above (DB, GIS, BIM) makes it possible to manage the complexity of the subject matter only partially. Consequently, it became necessary to analyse the possible interactions between such tools, so as to create a link between an alphanumeric DB and a geographical DB. The combined use of GIS and BIM, which provide 4D time management of objects, turned out to be able to manage information and geometry in an effective and scalable way, providing a starting point for an in-depth mapping of the historical analysis. Software products for digital modelling have changed in nature over time, going from simple viewing tools to simulation tools. The reconstruction of the time phases of the three Ghettos (Nuovo, Vecchio and Nuovissimo) and their visualisation through digital narratives of the history of that specific area of the city, for instance through videos, is making the results of the study accessible to an increasing number of scholars and the general public.

  7. IRRAS, Integrated Reliability and Risk Analysis System for PC

    International Nuclear Information System (INIS)

    Russell, K.D.

    1995-01-01

    1 - Description of program or function: IRRAS4.16 is a program developed to perform the functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA). The program includes functions that allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included are features that allow the analyst to generate reports and displays documenting the results of an analysis. Since this software is a very detailed technical tool, the user should be familiar with PRA concepts and the methods used to perform these analyses. 2 - Method of solution: IRRAS4.16 is written entirely in MODULA-2 and uses an integrated commercial graphics package to interactively construct and edit fault trees. The fault-tree solving methods used are industry-recognized top-down algorithms. For quantification, the program uses standard methods to propagate the failure information through the generated cut sets. 3 - Restrictions on the complexity of the problem: Given the complexity of fault trees and the variety of ways they can be defined, it is difficult to set limits on the size of problem this software can solve. It is, however, capable of solving substantial fault trees thanks to its efficient methods. At this time, the software can efficiently solve problems as large as those handled by software currently used on mainframe computers. Does not include source code.
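
The "industry-recognized top-down algorithms" for fault-tree solving expand gates recursively into cut sets and then prune non-minimal ones. A toy Python sketch of that idea (IRRAS itself is MODULA-2; the tree below is an invented example):

```python
def minimal_cut_sets(node, tree):
    """Top-down expansion of a fault tree into minimal cut sets.
    tree maps a gate name to ('AND' | 'OR', [children]); names that are
    not gates are treated as basic events."""
    if node not in tree:
        return [frozenset([node])]
    op, children = tree[node]
    groups = [minimal_cut_sets(c, tree) for c in children]
    if op == 'OR':
        result = [cs for g in groups for cs in g]   # union of alternatives
    else:  # AND: combine every choice of cut set from each child
        result = [frozenset()]
        for g in groups:
            result = [a | b for a in result for b in g]
    result = list(set(result))  # drop duplicates
    # Keep only minimal sets: discard any proper superset of another set.
    return [c for c in result if not any(o < c for o in result)]

# Hypothetical tree: TOP fails if both subsystems fail; event A is shared.
tree = {
    'TOP': ('AND', ['G1', 'G2']),
    'G1': ('OR', ['A', 'B']),
    'G2': ('OR', ['A', 'C']),
}
```

The shared event A collapses {A, A} to the single-event cut set {A}, and absorbs {A, B} and {A, C}, leaving {A} and {B, C}; quantification then propagates basic-event probabilities through exactly these minimal sets.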

  8. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Directory of Open Access Journals (Sweden)

    Maria Cleofe Giorgino

    2017-11-01

Full Text Available Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have considered the materiality of the information being disclosed. This article aims to address this gap by exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). This tool aims to represent the multidimensional impact of the organization’s activity and assumes materiality as a guiding principle of the report drafting. Adopting the event study methodology, associated with a statistical significance test for categorical data, our results verify that an organization’s release of an IR produces a statistically significant impact on the related share prices. Moreover, the term “integrated” assigned to the reports plays a significant role in the impact on capital markets. Our findings have beneficial implications for both researchers and practitioners, adding new evidence on the usefulness of IR as a corporate disclosure tool and on the effect of an organization’s decision to disclose material information.
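The event-study logic can be sketched with a toy market-model estimation and a cumulative abnormal return (CAR) around a disclosure date. All return series, window lengths, and numbers below are illustrative assumptions, not the paper's data or its exact test.

```python
def market_model(stock, market):
    """OLS fit of stock returns on market returns over the estimation window."""
    n = len(stock)
    mx = sum(market) / n
    my = sum(stock) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(market, stock))
            / sum((x - mx) ** 2 for x in market))
    alpha = my - beta * mx
    return alpha, beta

def abnormal_returns(stock_ev, market_ev, alpha, beta):
    """Observed minus market-model-predicted returns in the event window."""
    return [y - (alpha + beta * x) for x, y in zip(market_ev, stock_ev)]

# estimation window (pre-event) returns, then a two-day event window
est_stock  = [0.010, -0.020, 0.015, 0.000, 0.005]
est_market = [0.008, -0.015, 0.012, 0.001, 0.004]
alpha, beta = market_model(est_stock, est_market)

event_stock  = [0.030, 0.010]
event_market = [0.005, 0.002]
ar = abnormal_returns(event_stock, event_market, alpha, beta)
car = sum(ar)            # cumulative abnormal return over the event window
print(round(car, 4))
```

A positive CAR of this kind is what, aggregated over many firms and tested for significance, supports the paper's conclusion that IR releases move share prices.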

  9. Linking Ayurveda and Western medicine by integrative analysis

    Directory of Open Access Journals (Sweden)

    Fazlin Mohd Fauzi

    2013-01-01

Full Text Available In this article, we discuss our recent work in elucidating the mode-of-action of compounds used in traditional medicine, including Ayurvedic medicine. Using a computational ('in silico') approach, we predict potential targets for Ayurvedic anti-cancer compounds, obtained from the Indian Plant Anticancer Database, given their chemical structures. In our analysis, we observed that: (i) the targets predicted can be connected to cancer pathogenesis, i.e. steroid-5-alpha reductase 1 and 2 and estrogen receptor-β, and (ii) predominantly hormone-dependent cancer targets were predicted for the anti-cancer compounds. Through the use of our in silico target prediction, we conclude that understanding how traditional medicines such as Ayurveda work, by linking them with the 'western' understanding of chemistry and protein targets, can be a fruitful avenue in addition to bridging the gap between the two different schools of thinking. Given that compounds used in Ayurveda have been tested and used for thousands of years (although not in the same manner as in Western medicine), they can potentially be developed into new drugs. Hence, to further advance the case of Ayurvedic medicine, we put forward some suggestions, namely: (a) employing and integrating novel analytical methods, given the advancements of 'omics', and (b) sharing experimental data and clinical results on studies done on Ayurvedic compounds in an easy and accessible way.

  10. Extreme Wave Analysis by Integrating Model and Wave Buoy Data

    Directory of Open Access Journals (Sweden)

    Fabio Dentale

    2018-03-01

Full Text Available Estimating the extreme values of significant wave height (HS), generally described by the HS return period function HS(TR) and by its confidence intervals, is a necessity in many branches of coastal science and engineering. The availability of indirect wave data generated by global and regional wind and wave model chains has brought radical changes to the estimation procedures for such probability distributions: weather and wave modeling systems are routinely run all over the world, and HS time series for each grid point are produced and published after assimilation (analysis) of the ground truth. However, while the sources of such indirect data are numerous, and generally of good quality, many aspects of their procedures are hidden from the users, who cannot evaluate the reliability and the limits of the HS(TR) estimates derived from such data. In order to provide a simple engineering tool to evaluate the probability of extreme sea states as well as the quality of such estimates, we propose here a procedure based on integrating HS time series generated by model chains with those recorded by wave buoys in the same area.
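A common way to obtain HS(TR) from such time series is to fit an extreme-value distribution to annual maxima and invert it at the desired return period. The sketch below uses a method-of-moments Gumbel fit on made-up annual maxima; the paper's actual estimation and confidence-interval procedure may differ.

```python
import math

def gumbel_fit(maxima):
    """Method-of-moments Gumbel parameters (location mu, scale sigma)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    sigma = math.sqrt(6 * var) / math.pi
    mu = mean - 0.5772 * sigma          # 0.5772 ~ Euler-Mascheroni constant
    return mu, sigma

def hs_return(mu, sigma, tr_years):
    """HS exceeded on average once every tr_years (annual-maxima convention)."""
    p = 1.0 - 1.0 / tr_years            # annual non-exceedance probability
    return mu - sigma * math.log(-math.log(p))

# made-up annual-maximum significant wave heights, in metres
maxima = [4.1, 5.3, 4.8, 6.0, 5.1, 4.5, 5.7, 4.9, 5.5, 4.7]
mu, sigma = gumbel_fit(maxima)
hs100 = hs_return(mu, sigma, 100)       # 100-year return value of HS
print(round(hs100, 2))
```

In practice a record this short would give wide confidence intervals, which is exactly why the paper integrates model time series with buoy records.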

  11. MEASURE: An integrated data-analysis and model identification facility

    Science.gov (United States)

    Singh, Jaidip; Iyer, Ravi K.

    1990-01-01

The first phase of the development of MEASURE, an integrated data-analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system is also available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high-density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real-time updates) was also developed. The results provide an understanding of the resource usage in the system under various workload conditions. This work is targeted for a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.
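The state-transition modeling step can be sketched once resource-usage samples have been mapped to cluster states: estimate transition probabilities from the observed state sequence, then derive the mean waiting (holding) time in each state. The state sequence below is illustrative, not MEASURE data.

```python
def transition_matrix(seq, n):
    """Estimate a state-transition probability matrix from a state sequence."""
    counts = [[0] * n for _ in range(n)]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

# cluster labels assigned to successive resource-usage samples (illustrative)
seq = [0, 0, 1, 1, 1, 0, 2, 2, 0, 0, 1]
P = transition_matrix(seq, 3)

# mean holding time in state i, in sample steps: the run length in a
# discrete-time chain is geometric, so its mean is 1 / (1 - P[i][i])
wait = [1.0 / (1.0 - P[i][i]) for i in range(3)]
print([round(w, 2) for w in wait])     # -> [1.67, 3.0, 2.0]
```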

  12. Signal Integrity Analysis in Single and Bundled Carbon Nanotube Interconnects

    International Nuclear Information System (INIS)

    Majumder, M.K.; Pandya, N.D.; Kaushik, B.K.; Manhas, S.K.

    2013-01-01

Carbon nanotubes (CNTs) can be considered an emerging interconnect material in the current nanoscale regime. They are more promising than other interconnect materials such as Al or Cu because of their robustness to electromigration. This research paper aims to address the crosstalk-related issues (signal integrity) in interconnect lines. Different analytical models of single-walled (SWCNT), double-walled (DWCNT), and multiwalled CNTs (MWCNT) are studied to analyze the crosstalk delay at global interconnect lengths. A capacitively coupled three-line bus architecture employing CMOS drivers is used for accurate estimation of crosstalk delay. Each line in the bus architecture is represented by the equivalent RLC model of single and bundled SWCNT, DWCNT, and MWCNT interconnects. Crosstalk delay is observed at the middle line (victim) when it switches in the opposite direction with respect to the other two lines (aggressors). Using the data predicted by ITRS 2012, a comparative analysis of crosstalk delay is performed for bundled SWCNT/DWCNT and single MWCNT interconnects. It is observed that the overall crosstalk delay is improved by 40.92% and 21.37% for single MWCNT in comparison to bundled SWCNT and bundled DWCNT interconnects, respectively.
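A first-order intuition for why opposite-direction aggressor switching is the worst case: the coupling capacitance seen by the victim is Miller-multiplied by roughly 2, inflating its RC delay. The lumped values below are arbitrary placeholders, not CNT or ITRS parameters, and the Elmore metric is far cruder than the RLC models the paper uses.

```python
def elmore_delay(r_total, c_ground, c_couple, miller):
    """0.69*R*C delay of a lumped RC line with a Miller-scaled coupling cap."""
    return 0.69 * r_total * (c_ground + miller * c_couple)

R = 1e3          # total line resistance, ohms (placeholder)
CG = 100e-15     # line-to-ground capacitance, farads (placeholder)
CC = 50e-15      # line-to-line coupling capacitance, farads (placeholder)

quiet    = elmore_delay(R, CG, CC, miller=0.0)  # aggressors held quiet
opposite = elmore_delay(R, CG, CC, miller=2.0)  # aggressors switch opposite
print(round(opposite / quiet, 2))               # -> 2.0
```

With these numbers the opposite-phase delay doubles; in the paper's full RLC analysis the degradation depends on the CNT type and bundle geometry.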

  13. Functional analysis in the study of differential and integral equations

    International Nuclear Information System (INIS)

    Sell, G.R.

    1976-01-01

This paper illustrates the use of functional analysis in the study of differential equations. Our particular starting point, the theory of flows or dynamical systems, originated with the work of H. Poincare, who is the founder of the qualitative theory of ordinary differential equations. In the qualitative theory one tries to describe the behaviour of a solution, or a collection of solutions, without 'solving' the differential equation. As a starting point one assumes the existence, and sometimes the uniqueness, of solutions and then one tries to describe the asymptotic behaviour, as time t→+infinity, of these solutions. We compare the notion of a flow with that of a C0-group of bounded linear operators on a Banach space. We shall show how the concept of a C0-group, or more generally a C0-semigroup, can be used to study the behaviour of solutions of certain differential and integral equations. Our main objective is to show how the concept of a C0-group and especially the notion of weak compactness can be used to prove the existence of an invariant measure for a flow on a compact Hausdorff space. Applications to the theory of ordinary differential equations are included. (author)

  14. iDASH: integrating data for analysis, anonymization, and sharing

    Science.gov (United States)

    Bafna, Vineet; Boxwala, Aziz A; Chapman, Brian E; Chapman, Wendy W; Chaudhuri, Kamalika; Day, Michele E; Farcas, Claudiu; Heintzman, Nathaniel D; Jiang, Xiaoqian; Kim, Hyeoneui; Kim, Jihoon; Matheny, Michael E; Resnic, Frederic S; Vinterbo, Staal A

    2011-01-01

    iDASH (integrating data for analysis, anonymization, and sharing) is the newest National Center for Biomedical Computing funded by the NIH. It focuses on algorithms and tools for sharing data in a privacy-preserving manner. Foundational privacy technology research performed within iDASH is coupled with innovative engineering for collaborative tool development and data-sharing capabilities in a private Health Insurance Portability and Accountability Act (HIPAA)-certified cloud. Driving Biological Projects, which span different biological levels (from molecules to individuals to populations) and focus on various health conditions, help guide research and development within this Center. Furthermore, training and dissemination efforts connect the Center with its stakeholders and educate data owners and data consumers on how to share and use clinical and biological data. Through these various mechanisms, iDASH implements its goal of providing biomedical and behavioral researchers with access to data, software, and a high-performance computing environment, thus enabling them to generate and test new hypotheses. PMID:22081224

  15. Linking Ayurveda and Western medicine by integrative analysis.

    Science.gov (United States)

    Fauzi, Fazlin Mohd; Koutsoukas, Alexios; Lowe, Robert; Joshi, Kalpana; Fan, Tai-Ping; Glen, Robert C; Bender, Andreas

    2013-04-01

In this article, we discuss our recent work in elucidating the mode-of-action of compounds used in traditional medicine, including Ayurvedic medicine. Using a computational ('in silico') approach, we predict potential targets for Ayurvedic anti-cancer compounds, obtained from the Indian Plant Anticancer Database, given their chemical structures. In our analysis, we observed that: (i) the targets predicted can be connected to cancer pathogenesis, i.e. steroid-5-alpha reductase 1 and 2 and estrogen receptor-β, and (ii) predominantly hormone-dependent cancer targets were predicted for the anti-cancer compounds. Through the use of our in silico target prediction, we conclude that understanding how traditional medicines such as Ayurveda work, by linking them with the 'western' understanding of chemistry and protein targets, can be a fruitful avenue in addition to bridging the gap between the two different schools of thinking. Given that compounds used in Ayurveda have been tested and used for thousands of years (although not in the same manner as in Western medicine), they can potentially be developed into new drugs. Hence, to further advance the case of Ayurvedic medicine, we put forward some suggestions, namely: (a) employing and integrating novel analytical methods, given the advancements of 'omics', and (b) sharing experimental data and clinical results on studies done on Ayurvedic compounds in an easy and accessible way.

  16. Thermally-induced voltage alteration for integrated circuit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cole, E.I. Jr.

    2000-06-20

    A thermally-induced voltage alteration (TIVA) apparatus and method are disclosed for analyzing an integrated circuit (IC) either from a device side of the IC or through the IC substrate to locate any open-circuit or short-circuit defects therein. The TIVA apparatus uses constant-current biasing of the IC while scanning a focused laser beam over electrical conductors (i.e. a patterned metallization) in the IC to produce localized heating of the conductors. This localized heating produces a thermoelectric potential due to the Seebeck effect in any conductors with open-circuit defects and a resistance change in any conductors with short-circuit defects, both of which alter the power demand by the IC and thereby change the voltage of a source or power supply providing the constant-current biasing. By measuring the change in the supply voltage and the position of the focused and scanned laser beam over time, any open-circuit or short-circuit defects in the IC can be located and imaged. The TIVA apparatus can be formed in part from a scanning optical microscope, and has applications for qualification testing or failure analysis of ICs.

  17. Bayesian Integrated Data Analysis of Fast-Ion Measurements by Velocity-Space Tomography

    DEFF Research Database (Denmark)

    Salewski, M.; Nocente, M.; Jacobsen, A.S.

    2018-01-01

    Bayesian integrated data analysis combines measurements from different diagnostics to jointly measure plasma parameters of interest such as temperatures, densities, and drift velocities. Integrated data analysis of fast-ion measurements has long been hampered by the complexity of the strongly non...... framework. The implementation for different types of diagnostics as well as the uncertainties are discussed, and we highlight the importance of integrated data analysis of all available detectors....
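The core inversion behind velocity-space tomography can be sketched as a small regularized linear problem: detector signals s = W f are inverted for the fast-ion distribution f, with a Gaussian prior giving a ridge/Tikhonov MAP estimate. The weight matrix, signals, and prior strength below are toy values for a 2-component f, not actual diagnostic data or the paper's formulation.

```python
def map_estimate(W, s, lam):
    """MAP solution of s = W f with Gaussian noise and prior (2 unknowns).

    Solves the normal equations (W^T W + lam*I) f = W^T s for a 2-vector f.
    """
    a = sum(w[0] * w[0] for w in W) + lam
    b = sum(w[0] * w[1] for w in W)
    d = sum(w[1] * w[1] for w in W) + lam
    r0 = sum(w[0] * si for w, si in zip(W, s))
    r1 = sum(w[1] * si for w, si in zip(W, s))
    det = a * d - b * b
    return [(d * r0 - b * r1) / det, (a * r1 - b * r0) / det]

# weight functions of three detectors over two velocity-space bins (toy values)
W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
s = [2.0, 3.0, 5.0]          # noiseless synthetic signals from f = [2, 3]
f = map_estimate(W, s, lam=1e-6)
print([round(x, 2) for x in f])   # -> [2.0, 3.0]
```

With noisy signals and many detectors of different types, the same machinery (with per-diagnostic noise models) is what lets an integrated analysis weight each measurement by its uncertainty.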

  18. An Empirical Analysis of Post-Merger Organizational Integration

    DEFF Research Database (Denmark)

    Smeets, Valerie Anne Rolande; Ierulli, Kathryn; Gibbs, Michael

    2016-01-01

We study post-merger organizational integration using linked employer-employee data. Integration is implemented by reassigning a small number of high-skilled workers, especially in R&D and management. Workforce mixing is concentrated in establishments set up after merger rather than in previously existing establishments. Worker turnover is high after merger, but new hiring yields stable total employment. Target employees have higher turnover and reassignment, particularly if the target firm is small relative to the acquiring firm. These findings may suggest that integration is costly but can be achieved by focusing on key employees; alternatively, reassigning a few key employees is sufficient for achieving integration.

  19. Strategic Mobility 21: Integrated Tracking System Analysis and Concept Design

    National Research Council Canada - National Science Library

    Mallon, Lawrence G; Savacool, Edwin

    2007-01-01

    ... (ITS). This ITS design document identifies the technical and functional requirements for developing, procuring, and integrating components of an ITS capable of supporting an inland regional port, multi...

  20. Living PRAs [probabilistic risk analysis] made easier with IRRAS [Integrated Reliability and Risk Analysis System]

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1989-01-01

The Integrated Reliability and Risk Analysis System (IRRAS) is an integrated PRA software tool that gives the user the ability to create and analyze fault trees and accident sequences using an IBM-compatible microcomputer. This program provides functions that range from graphical fault tree and event tree construction to cut set generation and quantification. IRRAS contains all the capabilities and functions required to create, modify, reduce, and analyze event tree and fault tree models used in the analysis of complex systems and processes. IRRAS uses advanced graphic and analytical techniques to achieve the greatest possible realization of the potential of the microcomputer. When the needs of the user exceed this potential, IRRAS can call upon the power of the mainframe computer. The role of the Idaho National Engineering Laboratory in the IRRAS program is that of software developer and interface to the user community. Version 1.0 of the IRRAS program was released in February 1987 to prove the concept of performing this kind of analysis on microcomputers. This version contained many of the basic features needed for fault tree analysis and was received very well by the PRA community. Since the release of Version 1.0, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version is designated ''IRRAS 2.0''. Version 3.0 will contain all of the features required for efficient event tree and fault tree construction and analysis. 5 refs., 26 figs

  1. Short circuit analysis of distribution system with integration of DG

    DEFF Research Database (Denmark)

    Su, Chi; Liu, Zhou; Chen, Zhe

    2014-01-01

Integration of distributed generation (DG) such as wind turbines into distribution systems is increasing all around the world because of its flexible and environmentally friendly characteristics. However, DG integration may change the pattern of the fault currents in the distribution system and as a result bring challenges to the network protection system. This problem has been frequently discussed in the literature, but mostly considering only the balanced fault situation. This paper presents an investigation of the influence of full-converter-based wind turbine (WT) integration on fault currents during both balanced and unbalanced faults. Major factors such as external grid short circuit power capacity, WT integration location, and connection type of the WT integration transformer are taken into account. In turn, the challenges brought to the protection system in the distribution network are presented...
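The role of the first factor listed, external-grid short-circuit capacity, can be illustrated with the standard relation between short-circuit apparent power and the initial symmetrical three-phase fault current, Ik = Sk / (sqrt(3) * V). The numbers below are illustrative, not from the paper's test network.

```python
import math

def grid_fault_current(sk_mva, v_kv):
    """Initial symmetrical three-phase fault current in kA from the external
    grid's short-circuit capacity Sk (MVA) at nominal voltage V (kV)."""
    return sk_mva / (math.sqrt(3) * v_kv)

ik = grid_fault_current(250.0, 10.0)   # 250 MVA grid behind a 10 kV bus
print(round(ik, 2))                    # kA contributed by the external grid
```

A converter-interfaced WT, by contrast, typically limits its fault-current contribution to near its rated current, which is why its integration can distort the current patterns that conventional overcurrent protection assumes.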

  2. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
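The "conventional interpretation" that AnalyzeHOLE improves upon can be stated in a few lines: apportion relative hydraulic conductivity to each interval in proportion to the flow change across it, normalized by interval thickness. Depths and flows below are invented for illustration; this is the naive method, not AnalyzeHOLE's MODFLOW/PEST workflow.

```python
def relative_k(depths, flows):
    """Relative hydraulic conductivity per interval from a borehole flow log.

    depths: interval boundaries, top to bottom; flows: upward flow measured
    at each boundary. K is taken proportional to flow change per unit depth.
    """
    dq = [flows[i] - flows[i + 1] for i in range(len(flows) - 1)]
    dz = [depths[i + 1] - depths[i] for i in range(len(depths) - 1)]
    k = [q / z for q, z in zip(dq, dz)]
    total = sum(k)
    return [ki / total for ki in k]

depths = [10.0, 20.0, 30.0, 40.0]      # interval boundaries, metres
flows  = [100.0, 70.0, 60.0, 0.0]      # upward flow at each boundary, m3/d
print([round(f, 2) for f in relative_k(depths, flows)])   # -> [0.3, 0.1, 0.6]
```

The abstract's point is that well construction induces vertical flow near the bore, so this proportionality can badly misrank intervals; the simulation-based inversion is the remedy.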

  3. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is

  4. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

The J-2X engine (built by Pratt & Whitney Rocketdyne) in the Upper Stage of the Ares I Crew Launch Vehicle will only start within a certain range of temperature and pressure for its liquid hydrogen and liquid oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB, and runs the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values of each simulation run so that they may be graphed and compared to other values.

  5. A multilayered integrated sensor for three-dimensional, micro total analysis systems

    International Nuclear Information System (INIS)

    Xiao, Jing; Song, Fuchuan; Seo, Sang-Woo

    2013-01-01

    This paper presents a layer-by-layer integration approach of different functional devices and demonstrates a heterogeneously integrated optical sensor featuring a micro-ring resonator and a high-speed thin-film InGaAs-based photodetector co-integrated with a microfluidic droplet generation device. A thin optical device structure allows a seamless integration with other polymer-based devices on a silicon platform. The integrated sensor successfully demonstrates its transient measurement capability of two-phase liquid flow in a microfluidic droplet generation device. The proposed approach represents an important step toward fully integrated micro total analysis systems. (paper)

  6. Integrated safety analysis to operate while constructing Urenco USA

    International Nuclear Information System (INIS)

    Kohrt, Rick; Su, Shiaw-Der; Lehman, Richard

    2013-01-01

    The URENCO USA (UUSA) site in Lea County, New Mexico, USA is authorized by the U.S. Nuclear Regulatory Commission (NRC) for construction and operation of a uranium enrichment facility under 10 CFR 70 (Ref 1). The facility employs the gas centrifuge process to separate natural uranium hexafluoride (UF 6 ) feed material into a product stream enriched up to 5% U-235 and a depleted UF 6 stream containing approximately 0.2 to 0.34% U-235. Initial plant operations, with a limited number of cascades on line, commenced in the second half of 2010. Construction activities continue as each subsequent cascade is commissioned and placed into service. UUSA performed an Integrated Safety Analysis (ISA) to allow the facility to operate while constructing the remainder of the facility. The ISA Team selected the What-If/Checklist method based on guidance in NUREG-1513 (Ref 2) and AIChE Guidelines (Ref 3). Of the three methods recommended for high risk events HAZOP, What-If/Checklist, or Failure Modes and Effects Analysis (FMEA), the What-If/Checklist lends itself best to construction activities. It combines the structure of a checklist with an unstructured 'brainstorming' approach to create a list of specific accident events that could produce an undesirable consequence. The What-If/Checklist for Operate While Constructing divides the UUSA site into seven areas and creates what-if questions for sixteen different construction activities, such as site preparation, external construction cranes, and internal construction lifts. The result is a total of 112 nodes, for which the Operate While Constructing ISA Team created hundreds of what-if questions. For each what-if question the team determined the likelihood, consequences, safeguards, and acceptability of risk. What-if questions with unacceptable risk are the accident sequences and their selected safeguards are the Items Relied on For Safety (IROFS). The final ISA identified four (4) new accident sequences that, unless

  7. A comparison of integrated safety analysis and probabilistic risk assessment

    International Nuclear Information System (INIS)

    Damon, Dennis R.; Mattern, Kevin S.

    2013-01-01

    The U.S. Nuclear Regulatory Commission conducted a comparison of two standard tools for risk informing the regulatory process, namely, the Probabilistic Risk Assessment (PRA) and the Integrated Safety Analysis (ISA). PRA is a calculation of risk metrics, such as Large Early Release Frequency (LERF), and has been used to assess the safety of all commercial power reactors. ISA is an analysis required for fuel cycle facilities (FCFs) licensed to possess potentially critical quantities of special nuclear material. A PRA is usually more detailed and uses more refined models and data than an ISA, in order to obtain reasonable quantitative estimates of risk. PRA is considered fully quantitative, while most ISAs are typically only partially quantitative. The extension of PRA methodology to augment or supplant ISAs in FCFs has long been considered. However, fuel cycle facilities have a wide variety of possible accident consequences, rather than a few surrogates like LERF or core damage as used for reactors. It has been noted that a fuel cycle PRA could be used to better focus attention on the most risk-significant structures, systems, components, and operator actions. ISA and PRA both identify accident sequences; however, their treatment is quite different. ISA's identify accidents that lead to high or intermediate consequences, as defined in 10 Code of Federal Regulations (CFR) 70, and develop a set of Items Relied on For Safety (IROFS) to assure adherence to performance criteria. PRAs identify potential accident scenarios and estimate their frequency and consequences to obtain risk metrics. It is acceptable for ISAs to provide bounding evaluations of accident consequences and likelihoods in order to establish acceptable safety; but PRA applications usually require a reasonable quantitative estimate, and often obtain metrics of uncertainty. This paper provides the background, features, and methodology associated with the PRA and ISA. The differences between the

  8. Building-integrated PV -- Analysis and US market potential

    International Nuclear Information System (INIS)

    Frantzis, L.; Hill, S.; Teagan, P.; Friedman, D.

    1994-01-01

    Arthur D Little, Inc., in conjunction with Solar Design Associates, conducted a study for the US Department of Energy (DOE), Office of Building Technologies (OBT) to determine the market potential for building-integrated photovoltaics (BIPV). This study defines BIPV as two types of applications: (1) where the PV modules are an integral part of the building, often serving as the exterior weathering skin, and (2) the PV modules are mounted on the existing building exterior. Both of these systems are fully integrated with the energy usage of the building and have potential for significant market penetration in the US

  9. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software which are integrated with relevant analysis parameters specific to SCaN assets and SCaN supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, 2) will provide an all-in-one package for various analysis capabilities that normally requires add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.

  10. Development and analysis of a new integrated power and cooling ...

    Indian Academy of Sciences (India)

    It consists of characteristics of Rankine cycle and vapour .... Solar thermal integrated plant material flow details with respect to figure 1 at Tsep .... Design features of parabolic trough collector with vacuum tube at the focal line have been col-.

  11. Integrated proteomic and genomic analysis of colorectal cancer

    Science.gov (United States)

    Investigators who analyzed 95 human colorectal tumor samples have determined how gene alterations identified in previous analyses of the same samples are expressed at the protein level. The integration of proteomic and genomic data, or proteogenomics, pro

  12. Integrated policy analysis of sustainable urban and transportation development

    NARCIS (Netherlands)

    Zhang, J.; Feng, T.; Fujiwara, A.; Fujiwara, A.; Zhang, Junyi

    2013-01-01

    Sustainable urban and transportation development needs to balance economic sustainability, environmental sustainability, and social equity. This study conducts integrated policy analyses by explicitly incorporating these sustainability goals and optimizing the performance of transportation networks.

  13. Strategic Mobility 21: Integrated Tracking System Analysis and Concept Design

    National Research Council Canada - National Science Library

    Mallon, Lawrence G; Savacool, Edwin

    2007-01-01

    .... This design document supports the SM21 efforts in developing a dual-use multi-modal node at the Southern California Logistics Airport in Victorville, CA that will be supported by an Integrated Tracking System...

  14. Economic analysis of the structure, Integration and Performance of ...

    African Journals Online (AJOL)

    The data for the study were collected with the aid of a questionnaire. Statistical tools such as simple descriptive statistics, Gini coefficient, Shepherd-Futrel functional ... Key words: Rice; Market structure; Market integration and Performance.

  15. Critical Analysis of Methods for Integrating Economic and Environmental Indicators

    NARCIS (Netherlands)

    Huguet Ferran, Pau; Heijungs, Reinout; Vogtländer, Joost G.

    2018-01-01

    The application of environmental strategies requires scoring and evaluation methods that provide an integrated vision of the economic and environmental performance of systems. The vector optimisation, ratio and weighted addition of indicators are the three most prevalent techniques for addressing

  16. Wellbore integrity analysis of a natural CO2 producer

    KAUST Repository

    Crow, Walter; Carey, J. William; Gasda, Sarah; Brian Williams, D.; Celia, Michael

    2010-01-01

    integrity, defined as the maintenance of isolation between subsurface intervals. In this report, we investigate a 30-year-old well from a natural CO2 production reservoir using a suite of downhole and laboratory tests to characterize isolation performance

  17. Integral abutment bridges under thermal loading : field monitoring and analysis.

    Science.gov (United States)

    2017-08-01

    Integral abutment bridges (IABs) have gained popularity throughout the United States due to their low construction and maintenance costs. Previous research on IABs has been heavily focused on substructure performance, leaving a need for better unders...

  18. An Empirical Analysis of Post-Merger Organizational Integration

    OpenAIRE

    Smeets, Valerie Anne Rolande; Gibbs, Michael; Ierulli, Kathryn

    2015-01-01

We study post-merger organizational integration using linked employer-employee data. Integration is implemented by reassigning a small number of high-skilled workers, especially in R&D and management. Workforce mixing is concentrated in establishments set up after the merger rather than in previously existing establishments. Worker turnover is high after the merger, but new hiring yields stable total employment. Target employees have higher turnover and reassignment, particularly if the target fi...

  19. Energy efficiency analysis of styrene production by adiabatic ethylbenzene dehydrogenation using exergy analysis and heat integration

    Directory of Open Access Journals (Sweden)

    Ali Emad

    2018-03-01

Styrene is a valuable commodity for the polymer industries. The main route for producing styrene, dehydrogenation of ethylbenzene, consumes a substantial amount of energy because of the use of high-temperature steam. In this work, the process energy requirements and recovery are studied using exergy analysis and Heat Integration (HI) based on the pinch design method. The amount of steam plays a key role in the trade-off between styrene yield and energy savings; therefore, reducing energy demand by optimizing the operating conditions alone is infeasible. Heat integration indicated an insignificant reduction in the net energy demand and exergy losses, but 24% and 34% savings in external heating and cooling duties, respectively. When the required steam is generated by recovering the heat of the hot reactor effluent, a considerable saving in the net energy demand, as well as in the heating and cooling utilities, can be achieved. Moreover, around 68% reduction in exergy destruction is observed.
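The pinch design method named above rests on the problem-table algorithm: shift stream temperatures by ΔTmin/2, cascade the interval heat surpluses, and read the minimum hot and cold utility targets off the cascade. A minimal sketch, using a classic four-stream textbook example rather than the styrene plant's stream data:

```python
def problem_table(streams, dt_min=10.0):
    """Pinch-analysis problem table: returns (Q_hot_min, Q_cold_min).
    streams: (T_supply, T_target, CP); hot streams have T_supply > T_target."""
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:   # hot stream: shift down by dt_min/2
            shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, +1))
        else:         # cold stream: shift up by dt_min/2
            shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, -1))
    temps = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade = [0.0]   # heat cascaded out of each shifted interval
    for hi, lo in zip(temps, temps[1:]):
        net = sum(sign * cp for ts, tt, cp, sign in shifted
                  if max(ts, tt) >= hi and min(ts, tt) <= lo)
        cascade.append(cascade[-1] + net * (hi - lo))
    q_hot_min = max(0.0, -min(cascade))   # make every cascade flow non-negative
    q_cold_min = cascade[-1] + q_hot_min
    return q_hot_min, q_cold_min

# Classic four-stream example (temperatures in C, CP in MW/K), dt_min = 10 K
streams = [(250, 40, 0.15), (200, 80, 0.25), (20, 180, 0.20), (140, 230, 0.30)]
q_hot, q_cold = problem_table(streams, dt_min=10.0)
```

The two targets bound the external heating and cooling duties of any feasible exchanger network at the chosen ΔTmin; reported duty savings are measured against such targets.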

  20. Heat integration and analysis of decarbonised IGCC sites

    Energy Technology Data Exchange (ETDEWEB)

    Ng, K.S.; Lopez, Y.; Campbell, G.M.; Sadhukhan, J. [University of Manchester, Manchester (United Kingdom). School of Chemical Engineering & Analytical Science

    2010-02-15

Integrated gasification combined cycle (IGCC) power generation systems have become of interest due to their high combined heat and power (CHP) generation efficiency and their flexibility to include carbon capture and storage (CCS) in order to reduce CO2 emissions. However, IGCC's biggest challenge is its high cost of energy production. In this study, decarbonised coal IGCC sites integrated with CCS have been investigated for heat integration and economic value analyses. It is envisaged that the high energy production cost of an IGCC site can be offset by maximising site-wide heat recovery and thereby improving the cost of electricity (COE) of CHP generation. Strategies for designing high-efficiency CHP networks have been proposed based on thermodynamic heuristics and pinch theory. Additionally, a comprehensive methodology to determine the COE from a process site has been developed. In this work, we have established thermodynamic and economic comparisons between IGCC sites with and without CCS and a trade-off between the degree of decarbonisation and the COE from the heat-integrated IGCC sites. The results show that the COE from the heat-integrated decarbonised IGCC sites is significantly lower compared with IGCC sites without heat integration, making the application of CCS in IGCC sites economically competitive.

  1. Analysis of an integrated carbon cycle for storage of renewables

    Science.gov (United States)

    Streibel, Martin; Nakaten, Natalie; Kempka, Thomas; Kühn, Michael

    2013-04-01

In order to mitigate the consequences of climate change, the energy concept of the Government of Germany foresees an 80% reduction of CO2 emissions by 2050 compared with 1990 levels. Different routes are followed to achieve this goal. The most advanced is the expansion of renewable energy sources to replace fossil-fuel-driven electricity generation. The increasing share of renewable sources in power production introduces the problem of high fluctuation in the energy generated by wind turbines and photovoltaics. Moreover, production is driven not by demand but by the availability of wind and sun. In this context, the "Power to Gas" concept has been developed. The main idea is to store excess renewable energy in the form of hydrogen produced by electrolysis. If, in a second step, the H2 reacts with CO2 to form CH4, the existing natural gas infrastructure can be used. In times when renewable energy production falls below the actual electricity demand, the CH4 is combusted to produce electricity. Emissions can be further reduced if the CO2 is captured in the power plant and buffered in a dynamic geological storage (CCS), then produced back when excess energy is available to synthesise CH4. Storing the CH4 locally also reduces the energy needed for transport. Hence an integrated, almost closed carbon cycle is implemented. In the present study this extended "Power to Gas" concept is elaborated at regional scale for the State of Brandenburg and the control area of 50 hertz. The focus of the analysis is the energetic balance of the concept with an integrated geological CH4 and CO2 storage. The energy conversion efficiency of the "Power to Gas" chain has been calculated using available data from the literature. According to our calculations, approximately 33% of the wind energy used can be regained by combusting the synthesised CH4 in a combined cycle plant. In order to fuel a peaking power plant with a power of 120 MW for 2,500 hours a year
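The ~33% figure above is essentially a product of component efficiencies along the storage chain. A back-of-envelope sketch with assumed textbook efficiencies, not the study's actual parameters:

```python
# Illustrative round-trip efficiency of the "Power to Gas" storage chain.
# The component efficiencies below are assumed round numbers, not the
# study's calculated values.
eta_electrolysis = 0.70   # power -> H2
eta_methanation  = 0.80   # H2 + CO2 -> CH4 (Sabatier reaction)
eta_ccgt         = 0.58   # CH4 -> power in a combined-cycle plant

eta_round_trip = eta_electrolysis * eta_methanation * eta_ccgt
print(f"round-trip efficiency: {eta_round_trip:.1%}")   # ~32.5%, near the ~33% reported

# Annual fuel demand of a 120 MW peaking plant running 2,500 h/yr
electricity_out_gwh = 120e-3 * 2500           # 300 GWh(e) delivered
ch4_fuel_gwh = electricity_out_gwh / eta_ccgt # CH4 energy to be stored
print(f"CH4 fuel energy needed: {ch4_fuel_gwh:.0f} GWh(th)")
```

The chain multiplication makes clear why round-trip efficiency is so sensitive to the weakest conversion step.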

  2. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    Science.gov (United States)

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
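BIGDAWG itself is an R package; purely as an illustration of the 2×2 statistics the abstract lists (chi-squared test, odds ratio, confidence interval), a minimal Python sketch for one binned allele might look like:

```python
import math

def case_control_2x2(a, b, c, d):
    """2x2 allele test: a, b = case carriers/non-carriers;
    c, d = control carriers/non-carriers.
    Returns Pearson chi-squared, odds ratio and Woolf 95% CI."""
    n = a + b + c + d
    # Pearson chi-squared without continuity correction
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf's log method
    ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
          math.exp(math.log(odds_ratio) + 1.96 * se_log_or))
    return chi2, odds_ratio, ci

# hypothetical counts for one allele after low-frequency binning
chi2, odds_ratio, ci = case_control_2x2(30, 70, 15, 85)
```

The p-value then follows from the chi-squared distribution with one degree of freedom; binning rare alleles first, as BIGDAWG does, keeps the test's expected-count assumptions reasonable.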

  3. Integration of Formal Job Hazard Analysis and ALARA Work Practice

    International Nuclear Information System (INIS)

    NELSEN, D.P.

    2002-01-01

ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non-nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) integration of radiological, chemical and ergonomic issues under a single program; (2) continuous improvement in routine communications among work planning/scheduling, job execution and management; and (3) rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  4. Analysis on the integration of ERP and e-commerce

    Science.gov (United States)

    Wang, Yongqing; Shi, Yuliana

    2017-08-01

With the continuous development of China's modern economic construction, various information technologies are emerging. The new economic development characterized by e-commerce has accelerated the globalization of the economy. In the face of increasingly fierce market competition, the construction of ERP and e-commerce systems is a necessary way for enterprises to enhance their core competitiveness. At present, most internal ERP systems and external e-commerce systems exist in a relatively independent state. However, with increasingly fierce market competition, a single mode of operation has been unable to meet the requirements of enterprise development. Accordingly, the effective integration of ERP and e-commerce in the new era has become one of the most important topics for enterprise development. This paper first analyzes the relationship between ERP and e-commerce, then analyzes the necessity and feasibility of their integration, and finally discusses integration strategies and technologies.

  5. A New Riemann Type Hydrodynamical Hierarchy and its Integrability Analysis

    International Nuclear Information System (INIS)

    Golenia, Jolanta Jolanta; Bogolubov, Nikolai N. Jr.; Popowicz, Ziemowit; Pavlov, Maxim V.; Prykarpatsky, Anatoliy K.

    2009-12-01

Short-wave perturbations in a relaxing medium, governed by a special reduction of the Ostrovsky evolution equation later also derived by Whitham, are studied using the gradient-holonomic integrability algorithm. The bi-Hamiltonicity and complete integrability of the corresponding dynamical system are established, and an infinite hierarchy of mutually commuting conservation laws of dispersive type is found. A well-defined regularization of the model is constructed and its Lax-type integrability is discussed. A generalized hydrodynamical Riemann-type system is considered; infinite hierarchies of conservation laws, related compatible co-symplectic structures and Lax-type representations for the special cases N = 2, 3 and N = 4 are constructed. (author)

  6. Introduction to stochastic analysis integrals and differential equations

    CERN Document Server

    Mackevicius, Vigirdas

    2013-01-01

    This is an introduction to stochastic integration and stochastic differential equations written in an understandable way for a wide audience, from students of mathematics to practitioners in biology, chemistry, physics, and finances. The presentation is based on the naïve stochastic integration, rather than on abstract theories of measure and stochastic processes. The proofs are rather simple for practitioners and, at the same time, rather rigorous for mathematicians. Detailed application examples in natural sciences and finance are presented. Much attention is paid to simulation diffusion pro

  7. Analysis of Goat Farming on Integrated Farming System in Banyumas

    Directory of Open Access Journals (Sweden)

    NN Hidayat

    2007-05-01

The objectives of this research were: (1) to find out the income generated from goat farming and its contribution to farmer income in several farming combinations; (2) to find out the economic efficiency of goat farming combined with paddy and fish production; (3) to determine the factors affecting the level of production and income in different farming systems, partially and in aggregate; and (4) to determine the combination of farming that generated the maximum income. A household farmer survey method was used. The farming models analysed were partial and aggregate averages, and Cobb-Douglas functions were chosen to estimate the functional relationships. The results were: (1) goat farming makes a significant contribution to the integrated farming system; (2) integrated farming (goat and paddy; goat and fish; and goat, fish and paddy) in Banyumas district was economically efficient; (3) partially, the factor affecting the level of production in goat farming was the number of goats owned (P<0.01); the factors affecting paddy production were urea application and amount of land owned (P<0.01), TSP application (P<0.05) and manpower (P<0.10); and the factors affecting fish farming were feed, breed and amount of land owned (P<0.01); (4) in aggregate, the factors affecting integrated farming I were urea application and amount of land owned (P<0.01); those for integrated farming II were feed and amount of land owned (P<0.01) and number of goats owned (P<0.10); whereas those for integrated farming III were paddy land area and breed (P<0.01), as well as number of goats owned (P<0.10); (5) integrated farming III (goat, paddy and fish) gave the highest profit, Rp 6.219.283,81, with relatively high efficiency. Therefore, goat farming could be an alternative to be developed in integrated farming, combined with other farming activities such as paddy and fish farming.
(Animal Production 9(2): 105-110 (2007)) Key Words: Goat, income, economic efficiency, survey, contribution
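The Cobb-Douglas form used in the record above becomes linear in logs, ln Q = ln A + Σ bᵢ ln xᵢ, so input elasticities can be estimated by ordinary least squares. A self-contained sketch on synthetic data (the farm inputs and coefficients below are invented, not the study's):

```python
import math

def fit_cobb_douglas(inputs, outputs):
    """Fit ln Q = ln A + sum b_i ln x_i by ordinary least squares.
    inputs: list of (x1, x2, ...) tuples; outputs: list of Q values."""
    X = [[1.0] + [math.log(x) for x in row] for row in inputs]
    y = [math.log(q) for q in outputs]
    k = len(X[0])
    # normal equations (X'X) beta = X'y
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # solve by Gaussian elimination with partial pivoting
    M = [row[:] + [b] for row, b in zip(xtx, xty)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            for c in range(col, k + 1):
                M[r][c] -= f * M[col][c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (M[r][k] - sum(M[r][c] * beta[c] for c in range(r + 1, k))) / M[r][r]
    return math.exp(beta[0]), beta[1:]   # scale A, elasticities b_i

# synthetic data generated from Q = 2.0 * x1^0.6 * x2^0.3 (hypothetical inputs)
farms = [(1, 1), (2, 1), (1, 2), (2, 2), (3, 2), (4, 3)]
output = [2.0 * x1 ** 0.6 * x2 ** 0.3 for x1, x2 in farms]
A, elasticities = fit_cobb_douglas(farms, output)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly, which is a convenient sanity check before running on real survey data.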

  8. Signal Integrity Analysis of High-Speed Interconnects

    CERN Document Server

    Oltean Karlsson, A

    2007-01-01

LHC detectors and future experiments will produce very large amounts of data that will be transferred at multi-gigabit speeds. At such data rates, signal-integrity effects become important and traditional rules of thumb are no longer enough for the design and layout of the traces. Simulations of signal-integrity effects at board level provide a way to study and validate several scenarios before arriving at a set of optimized design rules prior to building the actual printed circuit board (PCB). This article describes some of the tools available at CERN. Two case studies are used to highlight the capabilities of these programs.

  9. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    Science.gov (United States)

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide the concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components: independence, a sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected to the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation: Understanding of the perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis to ensure rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself to

  10. Analysis of the efficiency-integration nexus of Japanese stock market

    Science.gov (United States)

    Rizvi, Syed Aun R.; Arshad, Shaista

    2017-03-01

This paper attempts a novel approach in analysing the Japanese economy through a dual-dimension analysis of its stock market, examining efficiency and market integration. Taking a period of 24 years, this study employs MFDFA and MGARCH to understand how the efficiency and integration of the stock market fared during different business cycle phases of the Japanese economy. The results showed improving efficiency over the time period. For market integration, our findings conform to recent literature on business cycles and stock market integration: every succeeding recession creates a break in integration levels, resulting in a decrease.

  11. Integrated environmental policy: A review of economic analysis.

    Science.gov (United States)

    Wiesmeth, Hans; Häckl, Dennis

    2017-04-01

Holistic environmental policies, which emerged from a mere combination of technical activities in waste management some 40 years ago, constitute the most advanced level of environmental policies. These approaches to environmental policy, among them the policies in integrated waste management, attempt to guide economic agents to an environment-friendly behaviour. Nevertheless, current holistic policies in waste management, including policies on one-way drinks containers and waste electrical and electronic equipment, and implementations of extended producer responsibility with further applications to waste electrical and electronic equipment, reveal more or less severe deficiencies, despite some positive examples. This article relates these policy failures, which are not necessarily the result of insufficient compliance with the regulations, to missing constitutive elements of what is going to be called an 'integrated environmental policy'. This article therefore investigates, mostly from a practical point of view, the constitutive elements that are necessary for a holistic policy to serve as a well-functioning allocation mechanism. As these constitutive elements result from a careful 'integration' of the environmental commodities into the economic allocation problems, we refer to these policies as 'integrated environmental policies'. The article also discusses and illustrates the main steps of designing such a policy, for waste electrical and electronic equipment and a (possible) ban of glyphosate in agriculture. As these policies depend on economic and political stability and sufficiently developed environmental awareness, the article mostly addresses waste management policies in highly industrialised countries.

  12. ALVIN, Diffusion and Integral Data Comparison and Sensitivity Analysis

    International Nuclear Information System (INIS)

    Harris, D.R.; Reupke, W.A.; Wilson, W.B.

    1982-01-01

1 - Description of problem or function: ALVIN analyzes the consistency of a set of differential and integral nuclear data, adjusts the differential nuclear data to improve agreement with integral observations, and identifies inconsistent data. ALVIN also computes the required sensitivities and related quantities such as sensitivity profiles. 2 - Method of solution: Linear perturbation theory is used for the sensitivity calculations. Data consistency and adjustment computations use least-squares techniques. 3 - Restrictions on the complexity of the problem: The DAFT2 consistency and adjustment subroutine treats fully or partially correlated differential and integral parameters, but only as many as the order of the largest matrix that can be inverted. The DAFT3 consistency and adjustment subroutine treats arbitrarily large differential data sets, but only if they are uncorrelated with the integral data. Due to the current dimensions of some arrays, maxima of 75 spatial mesh points, 41 groups, and 6th-order Legendre polynomials are allowed. This can be changed by increasing the dimensions of the LCM arrays and the arrays in the labeled COMMON block S1 and blank COMMON
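The adjustment described above can be sketched in its simplest form: a generalized-least-squares update of prior differential data toward one integral observation via first-order sensitivities from linear perturbation theory. ALVIN's DAFT2/DAFT3 routines additionally handle correlated data; the diagonal-covariance form and the numbers here are hypothetical:

```python
def adjust(p, var_p, s, m, var_m):
    """One GLS adjustment step: pull prior differential data p (diagonal
    covariance var_p) toward an integral measurement m (variance var_m)
    through sensitivities s = dI/dp from linear perturbation theory."""
    pred = sum(si * pi for si, pi in zip(s, p))              # prior integral value
    var_pred = sum(si * si * vi for si, vi in zip(s, var_p)) # its variance
    resid = m - pred
    denom = var_pred + var_m
    # gain Cp S^T (S Cp S^T + Cm)^-1, scalar here with one integral datum
    p_adj = [pi + vi * si * resid / denom for pi, vi, si in zip(p, var_p, s)]
    chi2 = resid * resid / denom       # consistency indicator (1 dof)
    return p_adj, chi2

# hypothetical two-parameter example
p_adj, chi2 = adjust(p=[1.0, 2.0], var_p=[0.04, 0.09],
                     s=[0.5, 1.0], m=2.8, var_m=0.01)
```

The chi-squared of the residual is the consistency measure: values well above one flag the kind of inconsistent data the code is designed to identify.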

  13. Analysis of nutrient flows in integrated intensive aquaculture systems

    NARCIS (Netherlands)

    Schneider, O.; Sereti, V.; Eding, E.H.; Verreth, J.A.J.

    2005-01-01

This paper analyses the nutrient conversions taking place in integrated intensive aquaculture systems, in which fish are cultured alongside other organisms that convert otherwise discharged nutrients into valuable products. These conversions are analyzed based on nitrogen and

  14. Analysis of Basic Transmission Networks for Integrated Ship Control Systems

    DEFF Research Database (Denmark)

    Hansen, T.N.; Granum-Jensen, M.

    1993-01-01

Description of a computer network for Integrated Ship Control (ISC) systems which is to be developed as part of an EC project. Today, equipment of different makes is not able to communicate, because each supplier of ISC systems most often has its own proprietary network.

  15. Plasma Etching for Failure Analysis of Integrated Circuit Packages

    NARCIS (Netherlands)

    Tang, J.; Schelen, J.B.J.; Beenakker, C.I.M.

    2011-01-01

    Plastic integrated circuit packages with copper wire bonds are decapsulated by a Microwave Induced Plasma system. Improvements on microwave coupling of the system are achieved by frequency tuning and antenna modification. Plasmas with a mixture of O2 and CF4 showed a high etching rate around 2

  16. A Bernsteinian Analysis of the Integration of Natural Resource ...

    African Journals Online (AJOL)

    25, 2008. © 2008 Environmental Education Association of Southern Africa ..... framework outlined in Table 1, the Glossary had a low level of NRM integration and was allocated a .... Most of the questions required one word answers, or a single ...

  17. Child Psychotherapy, Child Analysis, and Medication: A Flexible, Integrative Approach.

    Science.gov (United States)

    Whitman, Laura

    2015-01-01

    For children with moderate to severe emotional or behavioral problems, the current approach in child psychiatry is to make an assessment for the use of both psychotherapy and medication. This paper describes integration of antidepressants and stimulants with psychoanalytically oriented techniques.

  18. Social and ecological analysis of commercial integrated crop livestock systems

    NARCIS (Netherlands)

    Garrett, R.D.; Niles, M.T.; Gil, J.D.B.; Gaudin, A.; Chaplin-Kramer, R.; Assmann, A.; Assmann, T.S.; Brewer, K.; Faccio Carvalho, de P.C.; Cortner, O.; Dynes, R.; Garbach, K.; Kebreab, E.; Mueller, N.; Peterson, C.; Reis, J.C.; Snow, V.; Valentim, J.

    2017-01-01

    Crops and livestock play a synergistic role in global food production and farmer livelihoods. Increasingly, however, crops and livestock are produced in isolation, particularly in farms operating at the commercial scale. It has been suggested that re-integrating crop and livestock systems at the

  19. A Bernsteinian Analysis of the Integration of Natural Resource ...

    African Journals Online (AJOL)

    Knowledge integration is one of the key principles that underpin curriculum reform in post-apartheid South Africa. One form of teacher support that has been adopted in South Africa is to provide schools throughout the country with samples of pedagogic texts such as curriculum documents and examination exemplars to act ...

  20. Colonialism and National Integration: An Analysis of British Policies ...

    African Journals Online (AJOL)

    The problem of national integration in Nigeria has continued to occupy centre stage in public discourse and intellectual circles especially now that religious and ethnic rancour appears to be on the increase. Much of the blame has been laid on the multi-religious and ethnic character of the population that was forcibly ...

  1. A socioeconomic analysis of biocontrol in integrated pest management

    NARCIS (Netherlands)

    Benjamin, Emmanuel O.; Wesseler, Justus H.H.

    2016-01-01

    European regulations on the sustainable use of pesticides aim to promote integrated pest management (IPM) strategy and the use of biological control agents. However, uncertainty over benefits and costs, irreversibility effects as well as flexibility in adoption of this technology needs to be

  2. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation and a passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and in the verification efforts for the computer codes applied. The paper briefly reviews the experimental investigation of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. An assessment of the applicability of RELAP5/mod3 for accident analysis in integral reactors is presented.

  3. Research on Integrated Analysis Method for Equipment and Tactics Based on Intervention Strategy Discussion

    Institute of Scientific and Technical Information of China (English)

    陈超; 张迎新; 毛赤龙

    2012-01-01

As the complexity of information warfare increases, its intervention strategy needs to be designed in an integrated environment. However, current research tends to break the internal relation between equipment and tactics, making it difficult to meet the requirements of their integrated analysis. In this paper, the research status quo of the integrated analysis of equipment and tactics is discussed first, some shortcomings of the current methods are then summarized, and an evolution mechanism of the integrated analysis of equipment and tactics is finally given. Based on these, a framework of integrated analysis is proposed. The method's effectiveness is validated by an example.

  4. An Integrated Gait and Balance Analysis System to Define Human Locomotor Control

    Science.gov (United States)

    2016-04-29

An Integrated Gait and Balance Analysis System to Define Human Locomotor Control (contract W911NF-14-R-0009). Walking is a complicated task that requires the motor coordination across ... test hypotheses they developed about how people walk.

  5. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  6. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    The GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently, an integrated analysis framework for GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  7. A Short Integrated Presentation of Valuation, Profitability and Growth Analysis

    DEFF Research Database (Denmark)

    Pettersson, Kim; Sørensen, Ole

    2016-01-01

    We demonstrate how the valuation models used in finance theory and the profitability and growth analysis taught in financial statement analysis are related. Traditional textbooks on finance and financial statement analysis are often very comprehensive, comprising a vast number of chapters. However......, the learning cost associated with this seems to be that many students are unable to understand either the interrelations between the chapters in a financial statement analysis textbook, or the origins of financial information (i.e., financial statements) in applied finance. Thus, the underlying motivation...... of this teaching note is to highlight the purpose of profitability and growth analysis in financial statement analysis by incorporating the point of value relevance in applied finance. We hope this reduced presentation of valuation and profitability and growth analysis will help students to understand...

  8. Development of safety analysis technology for integral reactor; evaluation on safety concerns of integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Chul; Kim, Woong Sik; Lee, J. H. [Korea Institute of Nuclear Safety, Taejeon (Korea)

    2002-03-01

    The Nuclear Desalination Plant (NDP) is being developed to produce electricity and fresh water, and is expected to be located near a population zone. In the aspect of safety, it is required to protect the public and the environment from possible releases of fission products and to protect the fresh water from radioactive contamination. Thus, in this study, the safety characteristics of the integral reactor, which adopts passive and inherent safety features significantly different from existing nuclear power plants, were investigated. Also, safety requirements applicable to the NDP were analyzed based on the regulatory requirements for current light water reactor and advanced reactor designs, and user requirements for small- and medium-size reactors. Based on these analyses, some safety concerns to be considered in the design stage have been identified and discussed. They include the use of proven technology for new safety features, systematic event classification and selection, strengthening of the containment function, and the safety impacts on desalination-related systems. The study presents the general safety requirements applicable to licensing of an integral reactor and suggests additional regulatory requirements, which need to be developed, based on the direction toward resolution of the safety concerns. The efforts to identify and technically resolve the safety concerns in the design stage will provide early confidence in SMART safety and a technical basis for designers and reviewers to evaluate the safety in the future. Suggestions on the development of additional regulatory requirements will help the regulator take actions for licensing of an integral reactor. 66 refs., 5 figs., 24 tabs. (Author)

  9. Analysis of integrated plant upgrading/life extension programs

    International Nuclear Information System (INIS)

    McCutchan, D.A.; Massie, H.W. Jr.; McFetridge, R.H.

    1988-01-01

    A present-worth generating cost model has been developed and used to evaluate the economic value of integrated plant upgrading/life extension projects in nuclear power plants. This paper shows that integrated plant upgrading programs can be developed in which a mix of near-term availability, power rating, and heat rate improvements can be obtained in combination with life extension. All significant benefits and costs are evaluated from the viewpoint of the utility, as measured in discounted revenue requirement differentials between alternative plans which are equivalent in system generating capacity. The near-term upgrading benefits are shown to enhance the benefit picture substantially. In some cases the net benefit is positive even if the actual life extension proves to be less than expected.

  10. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  11. Business Intelligence Systems Accounting Integration in Romania. a Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Daniela Postolache (Males)

    2010-12-01

    Full Text Available Business Intelligence (BI) systems have penetrated the Romanian market, providing real decision support by integrating and synthesizing a large variety of information available in real time, anywhere in the world, including through mobile terminals. This study examines the BI solutions promoted in Romania through Internet sites written in Romanian, in terms of how the integration of accounting information is done. Our paper highlights the economic and financial indicators most used, and the tools most often selected, by BI system developers to assist decisions. The paper also points out the analyzed sites' lack of transparency regarding the configuration details of the economic instruments, which we consider likely to delay Romanian managers in becoming familiar with BI solutions and which represents a weakness in the promotion of these products.

  12. Analysis of the metallic containment integrity of Angra I

    International Nuclear Information System (INIS)

    Costa, J.R.

    1981-01-01

    The main goal of this work is to evaluate the long-term pressure and temperature behavior inside the metallic containment of a PWR building subjected to a postulated loss-of-coolant accident. The computer program used was CONDRU 4. Calculations were made for the Angra I plant assuming the occurrence of the worst accident for containment integrity. The results obtained from CONDRU were compared with those from CONTEMPT-LT and COCO, which are codes similar to CONDRU. (Author) [pt

  13. Wellbore integrity analysis of a natural CO2 producer

    KAUST Repository

    Crow, Walter

    2010-03-01

    Long-term integrity of existing wells in a CO2-rich environment is essential for ensuring that geological sequestration of CO2 will be an effective technology for mitigating greenhouse gas-induced climate change. The potential for wellbore leakage depends in part on the quality of the original construction as well as geochemical and geomechanical stresses that occur over its life-cycle. Field data are essential for assessing the integrated effect of these factors and their impact on wellbore integrity, defined as the maintenance of isolation between subsurface intervals. In this report, we investigate a 30-year-old well from a natural CO2 production reservoir using a suite of downhole and laboratory tests to characterize isolation performance. These tests included mineralogical and hydrological characterization of 10 core samples of casing/cement/formation, wireline surveys to evaluate well conditions, fluid samples and an in situ permeability test. We find evidence for CO2 migration in the occurrence of carbonated cement and calculate that the effective permeability of an 11′-region of the wellbore barrier system was between 0.5 and 1 milliDarcy. Despite these observations, we find that the amount of fluid migration along the wellbore was probably small because of several factors: the amount of carbonation decreased with distance from the reservoir, cement permeability was low (0.3-30 microDarcy), the cement-casing and cement-formation interfaces were tight, the casing was not corroded, fluid samples lacked CO2, and the pressure gradient between reservoir and caprock was maintained. We conclude that the barrier system has ultimately performed well over the last 3 decades. These results will be used as part of a broader effort to develop a long-term predictive simulation tool to assess wellbore integrity performance in CO2 storage sites. © 2009 Elsevier Ltd. All rights reserved.
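To put the reported milliDarcy-scale effective permeability in perspective, a back-of-envelope Darcy-law estimate suggests very small volumetric leakage rates. The viscosity, pressure gradient and flow area below are assumed illustrative values, not measurements from the paper:

```python
# Back-of-envelope Darcy flux through a 1 mD wellbore barrier.
# All parameters except the permeability range are assumptions.
MD = 9.869e-16          # 1 millidarcy in m^2
k = 1.0 * MD            # effective permeability (upper end of reported range)
mu = 5e-4               # fluid viscosity, Pa.s (assumed brine at depth)
dP_dz = 1e4             # excess pressure gradient along the well, Pa/m (assumed)
area = 0.05             # annular flow area, m^2 (assumed)

darcy_flux = k / mu * dP_dz             # superficial velocity, m/s
vol_rate = darcy_flux * area            # volumetric rate, m^3/s
per_year = vol_rate * 3600 * 24 * 365   # m^3 per year
```

Under these assumptions the leakage works out to a few hundredths of a cubic metre per year, consistent with the paper's conclusion that fluid migration along the wellbore was probably small.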

  14. Path integral analysis of Jarzynski's equality: Analytical results

    Science.gov (United States)

    Minh, David D. L.; Adib, Artur B.

    2009-02-01

    We apply path integrals to study nonequilibrium work theorems in the context of Brownian dynamics, deriving in particular the equations of motion governing the most typical and most dominant trajectories. For the analytically soluble cases of a moving harmonic potential and a harmonic oscillator with a time-dependent natural frequency, we find such trajectories, evaluate the work-weighted propagators, and validate Jarzynski’s equality.
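For orientation, Jarzynski's equality relates the exponential average of the nonequilibrium work W to the equilibrium free-energy difference ΔF; with β the inverse temperature, the standard statement is:

```latex
% Jarzynski's equality: the average is over nonequilibrium trajectories.
% By Jensen's inequality it implies the second-law-like bound on <W>.
\[
  \bigl\langle e^{-\beta W} \bigr\rangle = e^{-\beta \Delta F},
  \qquad\text{hence}\qquad
  \langle W \rangle \ge \Delta F .
\]
```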

  15. J-integral estimation analysis for circumferential throughwall cracked pipes

    International Nuclear Information System (INIS)

    Zahoor, A.

    1988-01-01

    A J-integral estimation solution is derived for pipes containing a circumferential throughwall crack. Bending moment and axial tension loadings are considered. These solutions are useful for calculating J from a single load-displacement record obtained as part of pipe fracture testing, and are applicable for a wide range of flaw length to pipe circumference ratios. Results for J at initiation of crack growth generated using the solution developed in this paper agree well with J results from finite element analyses. (orig.)
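For context, estimation schemes of this type typically split J into elastic and plastic parts, with the plastic part obtained from the area under the load-displacement record via an η-factor. The generic form below is the standard shape of such schemes, not the paper's specific pipe solution:

```latex
% K = stress-intensity factor, E' = effective elastic modulus,
% A_pl = plastic area under the load-displacement record,
% B = thickness, b = remaining ligament, eta = geometry factor.
\[
  J = J_{el} + J_{pl}, \qquad
  J_{el} = \frac{K^2}{E'}, \qquad
  J_{pl} = \frac{\eta\, A_{pl}}{B\, b}.
\]
```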

  16. J-integral estimation analysis for circumferential throughwall cracked pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zahoor, A.

    A J-integral estimation solution is derived for pipes containing a circumferential throughwall crack. Bending moment and axial tension loadings are considered. These solutions are useful for calculating J from a single load-displacement record obtained as part of pipe fracture testing, and are applicable for a wide range of flaw length to pipe circumference ratios. Results for J at initiation of crack growth generated using the solution developed in this paper agree well with J results from finite element analyses.

  17. Integrability in the theory of Schroedinger operator and harmonic analysis

    International Nuclear Information System (INIS)

    Chalykh, O.A.; Veselov, A.P.

    1993-01-01

    The algebraic integrability for the Schroedinger equation in R^n and the role of the quantum Calogero-Sutherland problem and root systems in this context are discussed. For special values of the parameters in the potential, the explicit formula for the eigenfunction of the corresponding Sutherland operator is found. As an application, the explicit formula for the zonal spherical functions on the symmetric spaces SU*(2n)/Sp(n) (type A II in Cartan notation) is presented. (orig.)

  18. Comparative Studies of Traditional (Non-Energy Integration) and Energy Integration of Catalytic Reforming Unit using Pinch Analysis

    Directory of Open Access Journals (Sweden)

    M. Alta

    2012-12-01

    Full Text Available Energy integration of the Catalytic Reforming Unit (CRU) of the Kaduna Refinery and Petrochemicals Company, Kaduna, Nigeria, was carried out using pinch technology. The pinch analysis was performed in Maple. An optimum minimum approach temperature of 20 °C was used to determine the energy targets. The pinch point temperature was found to be 278 °C. The utility targets at this minimum approach temperature were found to be 72,711,839.47 kJ/hr and 87,105,834.43 kJ/hr for the hot and cold utilities, respectively. Pinch analysis as an energy integration technique was found to save more energy and utility cost than the traditional (non-integrated) approach. Key words: Pinch point, CRU, Energy Target, Maple
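Utility targets of this kind come from a heat-cascade (problem table) calculation, which can be sketched as follows. The stream data here are illustrative placeholders, not the CRU streams from the paper:

```python
# Minimal problem-table (heat cascade) sketch for pinch analysis.
# Stream data are illustrative, not the paper's CRU streams.
DT_MIN = 20.0  # minimum approach temperature, degC

# (kind, supply T degC, target T degC, heat-capacity flowrate CP in kW/K)
streams = [
    ("hot", 250.0, 40.0, 15.0),
    ("hot", 200.0, 80.0, 25.0),
    ("cold", 20.0, 180.0, 20.0),
    ("cold", 140.0, 230.0, 30.0),
]

def shifted(kind, T):
    # Hot streams shifted down, cold streams up, by DT_MIN/2.
    return T - DT_MIN / 2 if kind == "hot" else T + DT_MIN / 2

def pinch_targets(streams):
    temps = sorted({shifted(k, T) for k, Ts, Tt, cp in streams for T in (Ts, Tt)},
                   reverse=True)
    deficits = []
    for Thi, Tlo in zip(temps, temps[1:]):
        dH = 0.0
        for kind, Ts, Tt, cp in streams:
            lo, hi = sorted((shifted(kind, Ts), shifted(kind, Tt)))
            overlap = max(0.0, min(hi, Thi) - max(lo, Tlo))
            dH += cp * overlap if kind == "hot" else -cp * overlap
        deficits.append(dH)  # net heat surplus in this interval
    # Cascade the surplus downward; hot utility offsets the worst deficit.
    cascade, running = [], 0.0
    for dH in deficits:
        running += dH
        cascade.append(running)
    q_hot = max(0.0, -min(cascade))      # minimum hot utility, kW
    q_cold = q_hot + cascade[-1]         # minimum cold utility, kW
    pinch_T = temps[cascade.index(min(cascade)) + 1]  # shifted pinch temp
    return q_hot, q_cold, pinch_T
```

For these made-up streams the cascade gives hot/cold utility targets of 1150 kW and 1400 kW and a shifted pinch of 150 °C (hot-side 160 °C, cold-side 140 °C at ΔTmin = 20 °C).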

  19. Thermal photovoltaic solar integrated system analysis using neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Ashhab, S. [Hashemite Univ., Zarqa (Jordan). Dept. of Mechanical Engineering

    2007-07-01

    The energy demand in Jordan is primarily met by petroleum products. As such, the development of renewable energy systems is quite attractive. In particular, solar energy is a promising renewable energy source in Jordan and has been used for food canning, paper production, air-conditioning and sterilization. Artificial neural networks (ANNs) have received significant attention due to their capabilities in forecasting, modelling of complex nonlinear systems and control. ANNs have been used for forecasting solar energy. This paper presented a study that examined a thermal photovoltaic solar integrated system that was built in Jordan. Historical input-output system data that was collected experimentally was used to train an ANN that predicted the collector, PV module, pump and total efficiencies. The model predicted the efficiencies well and can therefore be utilized to find the operating conditions of the system that will produce the maximum system efficiencies. The paper provided a description of the photovoltaic solar system including equations for PV module efficiency; pump efficiency; and total efficiency. The paper also presented data relevant to the system performance and neural networks. The results of a neural net model were also presented based on the thermal PV solar integrated system data that was collected. It was concluded that the neural net model of the thermal photovoltaic solar integrated system set the background for achieving the best system performance. 10 refs., 6 figs.
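As a rough sketch of the kind of ANN model described, the following trains a small one-hidden-layer network on synthetic input-output data. The architecture, the (irradiance, temperature) inputs and the efficiency relation are assumptions for illustration, not the study's measured Jordanian system data:

```python
import numpy as np

# One-hidden-layer network trained by full-batch gradient descent on
# synthetic (irradiance, ambient temperature) -> efficiency data.
rng = np.random.default_rng(0)
X = rng.uniform([200.0, 10.0], [1000.0, 40.0], size=(200, 2))
# Synthetic "true" efficiency: rises with irradiance, falls with temperature.
y = 0.12 + 5e-5 * X[:, 0] - 1e-3 * X[:, 1] + rng.normal(0, 0.002, 200)

mu, sigma = X.mean(0), X.std(0)
Xn = (X - mu) / sigma  # normalized inputs for stable training

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(4000):
    H = np.tanh(Xn @ W1 + b1)           # hidden layer
    pred = (H @ W2 + b2).ravel()        # linear output
    err = pred - y
    # Backpropagation of the mean-squared-error loss.
    gW2 = H.T @ err[:, None] / len(y); gb2 = err.mean(keepdims=True)
    dH = err[:, None] @ W2.T * (1 - H**2)
    gW1 = Xn.T @ dH / len(y); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = np.sqrt(np.mean(err**2))  # fit quality on the training data
```

Once trained on historical data, such a model can be swept over candidate operating conditions to look for those maximizing predicted efficiency, which is the use the paper describes.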

  20. Performance analysis of IMS based LTE and WIMAX integration architectures

    Directory of Open Access Journals (Sweden)

    A. Bagubali

    2016-12-01

    Full Text Available In the current networking field, much research is under way on the integration of different wireless technologies, with the aim of providing uninterrupted connectivity to the user anywhere, with the high data rates demanded. The number of objects connected by a wireless interface, such as smart devices, industrial machines and smart homes, is increasing dramatically due to the evolution of cloud computing and Internet-of-Things technology. This paper begins with the challenges involved in such integrations and then explains the role of different couplings and different architectures. It also presents further improvements to the LTE and WiMAX integration architectures to provide seamless vertical handover, flexible quality of service for voice, video and multimedia services over IP networks, and mobility management with the help of IMS networks. Various parameters, such as handover delay, signalling cost and packet loss, are evaluated, and the performance of the interworking architecture is analysed from the simulation results. Finally, it concludes that the cross-layer scenario is better than the non-cross-layer scenario.

  1. Epidaurus: aggregation and integration analysis of prostate cancer epigenome.

    Science.gov (United States)

    Wang, Liguo; Huang, Haojie; Dougherty, Gregory; Zhao, Yu; Hossain, Asif; Kocher, Jean-Pierre A

    2015-01-01

    Integrative analyses of epigenetic data promise a deeper understanding of the epigenome. Epidaurus is a bioinformatics tool used to effectively reveal inter-dataset relevance and differences through data aggregation, integration and visualization. In this study, we demonstrated the utility of Epidaurus in validating hypotheses and generating novel biological insights. In particular, we described the use of Epidaurus to (i) integrate epigenetic data from prostate cancer cell lines to validate the activation function of EZH2 in castration-resistant prostate cancer and to (ii) study the mechanism of androgen receptor (AR) binding deregulation induced by the knockdown of FOXA1. We found that EZH2's noncanonical activation function was reaffirmed by its association with active histone markers and the lack of association with repressive markers. More importantly, we revealed that the binding of AR was selectively reprogramed to promoter regions, leading to the up-regulation of hundreds of cancer-associated genes including EGFR. The prebuilt epigenetic dataset from commonly used cell lines (LNCaP, VCaP, LNCaP-Abl, MCF7, GM12878, K562, HeLa-S3, A549, HePG2) makes Epidaurus a useful online resource for epigenetic research. As standalone software, Epidaurus is specifically designed to process user customized datasets with both efficiency and convenience. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. SUSTAINABILITY OF SUSTAINABLE PALM OIL: A MARKET INTEGRATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    Diana Chalil

    2016-07-01

    Full Text Available Crude Palm Oil (CPO) is the most consumed vegetable oil in the world. The increase in CPO production raises concern about its environmental impact, even outside the producing countries. In response, the EU has made a requirement to import only certified CPO (CSPO). India and China, the two biggest importers in the world, are less restrictive on environmental issues, and their demand is influenced more by CPO price levels. These countries are the main export markets for Indonesia and Malaysia, the two biggest CPO exporters in the world. This research uses monthly price data from the Netherlands, Germany, Italy, the EU28, India, China, Indonesia and Malaysia. Market integration is tested with a cointegration test, a vector error correction model and seemingly unrelated regression. The results show that these markets are integrated, but the European countries are unlikely to lead the price movement. Therefore, the concern about sustainable certification from the European countries still spreads only slowly to the other main importers, resulting in low absorption of CSPO. Keywords: market integration; sustainable palm oil; seemingly unrelated regression; vector error correction model
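A minimal sketch of the cointegration step is the Engle-Granger two-step procedure on simulated prices. The series names, coefficients and noise levels below are invented for illustration (the paper's own analysis additionally uses a VECM and SUR):

```python
import numpy as np

# Engle-Granger two-step cointegration sketch on simulated price series
# sharing one stochastic trend (a stand-in for, e.g., two CPO markets).
rng = np.random.default_rng(42)
n = 300
trend = np.cumsum(rng.normal(0, 1.0, n))              # common stochastic trend
p_exporter = 10 + trend + rng.normal(0, 0.5, n)       # hypothetical exporter price
p_importer = 5 + 0.9 * trend + rng.normal(0, 0.5, n)  # hypothetical importer price

# Step 1: OLS of one price on the other; the residuals estimate the
# deviation from the long-run equilibrium relation.
X = np.column_stack([np.ones(n), p_exporter])
beta, *_ = np.linalg.lstsq(X, p_importer, rcond=None)
resid = p_importer - X @ beta

# Step 2: Dickey-Fuller regression on the residuals,
# delta_e_t = rho * e_{t-1} + u_t. A strongly negative t-statistic on rho
# indicates stationary residuals, i.e. cointegrated (integrated) markets.
de = np.diff(resid)
lag = resid[:-1]
rho = (lag @ de) / (lag @ lag)
u = de - rho * lag
se = np.sqrt((u @ u) / (len(de) - 1) / (lag @ lag))
t_stat = rho / se
```

In practice the t-statistic is compared against Engle-Granger critical values (more negative than the ordinary Dickey-Fuller ones because beta is estimated).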

  3. Integrated Electrochemical Analysis System with Microfluidic and Sensing Functions

    Directory of Open Access Journals (Sweden)

    Hiroaki Suzuki

    2008-02-01

    Full Text Available An integrated device that carries out the timely transport of solutions and conducts electroanalysis was constructed. The transport of solutions was based on capillary action in overall hydrophilic flow channels and controlled by valves that operate on the basis of electrowetting. Electrochemical sensors for glucose, lactate, glutamic oxaloacetic transaminase (GOT), glutamic pyruvic transaminase (GPT), pH, ammonia, urea, and creatinine were integrated. An air-gap structure was used for the ammonia, urea, and creatinine sensors to realize a rapid response. To enhance the transport of ammonia that existed or was produced by the enzymatic reactions, the pH of the solution was elevated by mixing it with a NaOH solution using a valve based on electrowetting. The sensors for GOT and GPT used a freeze-dried substrate matrix to realize rapid mixing. The sample solution was transported to the required sensing sites at the desired times. The integrated sensors showed distinct responses when the sample solution reached the respective sensing sites. Linear relationships were observed between the output signals and the concentration, or the logarithm of the concentration, of the analytes. An interferent, L-ascorbic acid, could be eliminated electrochemically in the sample injection port.

  4. Locating new uranium occurrence by integrated weighted analysis in Kaladgi basin, Karnataka

    International Nuclear Information System (INIS)

    Sridhar, M.; Chaturvedi, A.K.; Rai, A.K.

    2014-01-01

    This study aims at identifying uranium potential zones by integrated analysis of thematic layers interpreted and derived from airborne radiometric and magnetic data and satellite data, along with available ground geochemical data, in the western part of the Kaladgi basin. Integrated weighted analysis of spatial datasets, which included airborne radiometric data (eU, eTh and % K conc.), a litho-structural map, hydrogeochemical U conc., and geomorphological data pertaining to the study area, was attempted. The weightage analysis was done in a GIS environment, where the different spatial datasets were brought onto a single platform and analyzed by integration
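Weighted overlay of reclassified thematic layers is conceptually simple: each layer is scored on a common scale, the scores are combined with expert weights, and high-scoring cells are flagged for follow-up. A toy sketch (layer names, scores and weights invented for illustration, not the paper's actual weighting scheme):

```python
import numpy as np

# Toy weighted-overlay sketch: combine reclassified thematic layers into a
# single prospectivity score on a small grid. All values are illustrative.
rng = np.random.default_rng(1)
shape = (4, 4)
layers = {
    "eU_conc":    rng.integers(1, 6, shape),  # reclassified scores 1..5
    "structures": rng.integers(1, 6, shape),
    "hydro_U":    rng.integers(1, 6, shape),
}
weights = {"eU_conc": 0.5, "structures": 0.3, "hydro_U": 0.2}  # sum to 1

# Weighted sum per grid cell; result stays on the 1..5 scale.
score = sum(w * layers[name] for name, w in weights.items())
favourable = score >= 4.0  # assumed threshold for follow-up zones
```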

  5. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    Science.gov (United States)

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. This proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to the existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve sensitivity of signal detection and accuracy of sparse association map estimation. We illustrate smFARM by two integrative genomics analysis examples, a breast cancer dataset and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12 whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  6. 'Integration'

    DEFF Research Database (Denmark)

    Olwig, Karen Fog

    2011-01-01

    , while the countries have adopted disparate policies and ideologies, differences in the actual treatment and attitudes towards immigrants and refugees in everyday life are less clear, due to parallel integration programmes based on strong similarities in the welfare systems and in cultural notions...... of equality in the three societies. Finally, it shows that family relations play a central role in immigrants’ and refugees’ establishment of a new life in the receiving societies, even though the welfare society takes on many of the social and economic functions of the family....

  7. Comparison of Different Technologies for Integrated Solar Combined Cycles: Analysis of Concentrating Technology and Solar Integration

    Directory of Open Access Journals (Sweden)

    Antonio Rovira

    2018-04-01

    Full Text Available This paper compares the annual performance of Integrated Solar Combined Cycles (ISCC) using different solar concentration technologies: parabolic trough collectors (PTC), linear Fresnel reflectors (LFR) and a central tower receiver (CT). Each solar technology (i.e. PTC, LFR and CT) is proposed to integrate solar energy into the combined cycle in two different ways. The first one is based on the use of solar energy to evaporate water of the steam cycle by means of direct steam generation (DSG), increasing the steam production of the high pressure level of the steam generator. The other one is based on the use of solar energy to preheat the pressurized air at the exit of the gas turbine compressor before it is introduced in the combustion chamber, reducing the fuel consumption. Results show that ISCC with DSG increases the yearly production while solar air heating reduces it due to the incremental pressure drop. However, air heating allows significantly higher solar-to-electricity efficiencies and lower heat rates. Regarding the solar technologies, PTC provides the best thermal results.

  8. Multi-color fluorescent DNA analysis in an integrated optofluidic lab on a chip

    OpenAIRE

    Dongre, C.

    2010-01-01

    Abstract: Sorting and sizing of DNA molecules within the human genome project has enabled the genetic mapping of various illnesses. Furthermore, by employing tiny lab-on-a-chip devices, integrated DNA sequencing and genetic diagnostics have become feasible. We present the combination of capillary electrophoresis with laser-induced fluorescence for optofluidic integration toward an on-chip bio-analysis tool. Integrated optical fluorescence excitation allows for a high spatial resolution (12 μm) ...

  9. Multi-color fluorescent DNA analysis in an integrated optofluidic lab-on-a-chip

    OpenAIRE

    Dongre, C.; van Weerd, J.; van Weeghel, R.; Martinez-Vazquez, R.; Osellame, R.; Cerullo, G.; Besselink, G.A.J.; van den Vlekkert, H.H.; Hoekstra, Hugo; Pollnau, Markus

    2010-01-01

    Sorting and sizing of DNA molecules within the human genome project has enabled the genetic mapping of various illnesses. By employing tiny lab-on-a-chip devices for such DNA analysis, integrated DNA sequencing and genetic diagnostics have become feasible. However, such diagnostic chips typically lack integrated sensing capability. We address this issue by combining microfluidic capillary electrophoresis with laser-induced fluorescence detection resulting in optofluidic integration towards an...

  10. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  11. An Integrated Approach to Thermal Analysis of Pharmaceutical Solids

    Science.gov (United States)

    Riley, Shelley R. Rabel

    2015-01-01

    A three-tiered experiment for undergraduate Instrumental Analysis students is presented in which students characterize the solid-state thermal behavior of an active pharmaceutical ingredient (acetaminophen) and an excipient (α-lactose hydrate) using differential scanning calorimetry, thermogravimetric analysis, and thermal microscopy. Students are…

  12. Integrating forest inventory and analysis data into a LIDAR-based carbon monitoring system

    Science.gov (United States)

    Kristofer D. Johnson; Richard Birdsey; Andrew O Finley; Anu Swantaran; Ralph Dubayah; Craig Wayson; Rachel. Riemann

    2014-01-01

    Forest Inventory and Analysis (FIA) data may be a valuable component of a LIDAR-based carbon monitoring system, but integration of the two observation systems is not without challenges. To explore integration methods, two wall-to-wall LIDAR-derived biomass maps were compared to FIA data at both the plot and county levels in Anne Arundel and Howard Counties in Maryland...

  13. Integration of thermodynamic insights and MINLP optimisation for the synthesis, design and analysis of process flowsheets

    DEFF Research Database (Denmark)

    Hostrup, Martin; Gani, Rafiqul; Kravanja, Zdravko

    1999-01-01

    This paper presents an integrated approach to the solution of process synthesis, design and analysis problems. Integration is achieved by combining two different techniques, synthesis based on thermodynamic insights and structural optimization together with a simulation engine and a properties pr...

  14. Thermodynamic analysis and optimization of IT-SOFC-based integrated coal gasification fuel cell power plants

    NARCIS (Netherlands)

    Romano, M.C.; Campanari, S.; Spallina, V.; Lozza, G.

    2011-01-01

    This work discusses the thermodynamic analysis of integrated gasification fuel cell plants, where a simple cycle gas turbine works in a hybrid cycle with a pressurized intermediate temperature–solid oxide fuel cell (SOFC), integrated with a coal gasification and syngas cleanup island and a bottoming

  15. Cross-Border Trade: An Analysis of Trade and Market Integration ...

    African Journals Online (AJOL)

    An assessment of cross-border trade and market integration reveals that inhabitants of the border areas have become economically, socially and politically integrated in spite of the conflict over the Bakassi Peninsula. Based on empirical analysis, bilateral agreements between Nigeria and Cameroon have made negligible ...

  16. Analyzing Developing Country Market Integration using Incomplete Price Data and Cluster Analysis

    NARCIS (Netherlands)

    Ansah, I.G.; Gardebroek, Koos; Ihle, R.; Jaletac, M.

    2015-01-01

    Recent global food price developments have spurred renewed interest in analyzing integration of local markets to global markets. A popular approach to quantify market integration is cointegration analysis. However, local market price data often has missing values, outliers, or short and incomplete

  17. Probabilistic treatment of a PWR containment integrity analysis

    International Nuclear Information System (INIS)

    Mark, R.H.

    1978-01-01

The design analyses for the LOCA (Loss of Coolant Accident) mass and energy release transient and the containment peak pressure transient contain many conservatisms in the parameters and analytical models. The best-estimate analysis presented in this report shows the large effect these conservatisms have on the design of the containment. Furthermore, the probability analysis presented in this report shows that the probability of the parameters and models being the conservative ones used in the design analysis is extremely small, and that the probability of exceeding the containment design pressure is even smaller. The results of this paper show that a considerable reduction in containment volume could be made while still retaining a large margin of safety. (author)

  18. Driving Pattern Analysis for Electric Vehicle (EV) Grid Integration Study

    DEFF Research Database (Denmark)

    Wu, Qiuwei; Nielsen, Arne Hejde; Østergaard, Jacob

    2010-01-01

    In order to facilitate the integration of electric vehicles (EVs) into the Danish power system, the driving data in Denmark were analyzed to extract the information of driving distances and driving time periods which were used to represent the driving requirements and the EV unavailability...... from the driving time periods to show how many cars are available for charging and discharging in each time period. The obtained EV availability data are in one hour time periods and one quarter time periods for different study purposes. The EV availability data of one hour time period are to be used...

  19. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytic network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure showing the relationships between objective and criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.

  20. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...... control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced...

  1. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

Full Text Available Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the statistical model most widely accepted in industry; however, it was derived from a micrometer technology, which does not guarantee reliability in nanometric manufacturing processes. This work considers three of the most relevant statistical models in the industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.
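Pelgrom's matching model referenced in this abstract predicts that mismatch between nominally identical devices shrinks with the square root of gate area. A minimal sketch of that scaling law (the matching coefficient and device sizes below are illustrative assumptions, not figures from the paper):

```python
import math

def pelgrom_sigma(a_coeff_mv_um, w_um, l_um):
    """Pelgrom's law: sigma(dP) = A_P / sqrt(W * L).

    a_coeff_mv_um: process matching coefficient A_P in mV*um
    w_um, l_um: device width and length in um
    Returns the standard deviation of the parameter mismatch in mV.
    """
    return a_coeff_mv_um / math.sqrt(w_um * l_um)

# Illustrative: A_VT = 3.0 mV*um for a 1 um x 1 um transistor pair
sigma = pelgrom_sigma(3.0, 1.0, 1.0)
# Quadrupling the gate area halves the mismatch sigma
sigma_big = pelgrom_sigma(3.0, 2.0, 2.0)
```

The design trade-off the model exposes is exactly this area/matching exchange: tighter analog matching costs silicon area.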

  2. NEW CORPORATE REPORTING TRENDS. ANALYSIS ON THE EVOLUTION OF INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Dragu Ioana

    2013-07-01

Full Text Available The objective of this paper is to present the new corporate reporting trends of the 21st century. Integrated reporting has been launched through a common initiative of the International Integrated Reporting Committee and global accounting organizations. However, the history of integrated reports starts before the initiative of the IIRC, going back to when large corporations began to disclose sustainability and corporate social responsibility information. Further on, we claim that the initial sustainability and CSR reports, which were issued separately alongside the financial annual report, represent the predecessors of the current integrated reports. The paper consists of a literature review analysis of the evolution of integrated reporting, from the first stage of international non-financial initiatives up to the current state of a single integrated annual report. In order to understand the background of integrated reporting, we analyze the most relevant research papers on corporate reporting, focusing on the international organizations' perspective on non-financial reporting in general and integrated reporting in particular. Based on the literature overview, we extracted the essential information for setting the framework of the integrated reporting evolution. The findings suggest that we can delineate three main stages in the evolution of integrated reports, namely: the non-financial reporting initiatives, the sustainability era, and the revolution of integrated reporting. We illustrate these results by presenting each relevant point in the history of integrated reporting on a time-scale axis, developed with the purpose of defining the road to integrated reporting at theoretical, empirical, and practical levels. We consider the current investigation relevant for future studies concerning integrated reports, as this is a new area of research still in its infancy. The originality of the research derives from the novelty of

  3. Integrating R and Hadoop for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2014-06-01

Full Text Available Analyzing and working with big data can be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools successfully and widely used for storage and processing of big data sets on clusters of commodity hardware is Hadoop. The Hadoop framework contains libraries, a distributed file system (HDFS), a resource-management platform, and implements a version of the MapReduce programming model for large-scale data processing. In this paper we investigate the possibilities of integrating Hadoop with R, which is a popular software environment for statistical computing and data visualization. We present three ways of integrating them: R with Streaming, Rhipe and RHadoop, and we emphasize the advantages and disadvantages of each solution.
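The Streaming route mentioned in this abstract works by piping input records to any executable that reads stdin and writes tab-separated key/value pairs; the paper does this with R scripts, but the contract itself is language-agnostic. A hypothetical word-count mapper illustrating that contract (not code from the paper):

```python
def wordcount_mapper(lines):
    """Emit one 'word<TAB>1' record per token.

    This is the key/value contract Hadoop Streaming expects on stdout;
    in a real job Hadoop pipes an input split to the script's stdin and
    a companion reducer sums the counts per key.
    """
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

# Demo on an in-memory "input split"
pairs = list(wordcount_mapper(["to be or not to be"]))
```

Rhipe and RHadoop wrap this same mechanism behind R functions, which is largely where their convenience (and their constraints) comes from.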

  4. Western Wind and Solar Integration Study: Hydropower Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Acker, T.; Pete, C.

    2012-03-01

The U.S. Department of Energy's (DOE) study of 20% Wind Energy by 2030 was conducted to consider the benefits, challenges, and costs associated with sourcing 20% of U.S. energy consumption from wind power by 2030. This study found that, with proactive measures, there are no insurmountable barriers to meeting the 20% goal. Following this study, DOE and the National Renewable Energy Laboratory (NREL) conducted two more studies: the Eastern Wind Integration and Transmission Study (EWITS) covering the eastern portion of the U.S., and the Western Wind and Solar Integration Study (WWSIS) covering the western portion of the United States. The WWSIS was conducted by NREL and research partner General Electric (GE) in order to provide insight into the costs, technical or physical barriers, and operational impacts caused by the variability and uncertainty of wind, photovoltaic, and concentrated solar power when employed to serve up to 35% of the load energy in the WestConnect region (Arizona, Colorado, Nevada, New Mexico, and Wyoming). WestConnect is composed of several utility companies working collaboratively to assess stakeholder and market needs and to develop cost-effective improvements to the western wholesale electricity market. Participants include the Arizona Public Service, El Paso Electric Company, NV Energy, Public Service of New Mexico, Salt River Project, Tri-State Generation and Transmission Cooperative, Tucson Electric Power, Xcel Energy and the Western Area Power Administration.

  5. Development of Safety Analysis Technology for Integral Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, S. K. [Korea Atomic Energy Research Institute, Taejeon (Korea); Seul, K. W.; Kim, W. S.; Kim, W. K.; Yun, Y. G.; Ahn, H. J.; Lee, J. S.; Sin, A. D. [Korea Institute of Nuclear Safety, Taejeon (Korea)

    2000-03-01

The Nuclear Desalination Plant (NDP) is being developed to produce electricity and fresh water, and is expected to be located near populated zones. In terms of safety, it is required to protect the public and the environment from possible releases of fission products and to prevent contamination of the fresh water by radioactivity. Thus, in the present study, the safety characteristics of the integral reactor, which adopts passive and inherent safety features significantly different from those of existing nuclear power plants, were investigated based on the designs of foreign and domestic integral reactors. Also, safety requirements applicable to the NDP were analyzed based on the regulatory requirements for current and advanced reactor designs, and user requirements for small- and medium-size reactors. Based on these analyses, some safety concerns to be considered in the design stage have been identified. These include the use of proven technology for new safety systems, the systematic classification and selection of design basis accidents, and the safety assurance of desalination-related systems. These efforts to identify and resolve the safety concerns in the design stage will provide early confidence in SMART safety to designers, and a technical basis for reviewers to evaluate the safety in the future. 8 refs., 20 figs., 4 tabs. (Author)

  6. Integrated vehicle-based safety systems (IVBSS) : light vehicle platform field operational test data analysis plan.

    Science.gov (United States)

    2009-12-22

    This document presents the University of Michigan Transportation Research Institutes plan to : perform analysis of data collected from the light vehicle platform field operational test of the : Integrated Vehicle-Based Safety Systems (IVBSS) progr...

  7. Integrated vehicle-based safety systems (IVBSS) : heavy truck platform field operational test data analysis plan.

    Science.gov (United States)

    2009-11-23

    This document presents the University of Michigan Transportation Research Institutes plan to perform : analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle- : Based Safety Systems (IVBSS) progra...

  8. Transient Thermal Analysis of 3-D Integrated Circuits Packages by the DGTD Method

    KAUST Repository

    Li, Ping; Dong, Yilin; Tang, Min; Mao, Junfa; Jiang, Li Jun; Bagci, Hakan

    2017-01-01

    Since accurate thermal analysis plays a critical role in the thermal design and management of the 3-D system-level integration, in this paper, a discontinuous Galerkin time-domain (DGTD) algorithm is proposed to achieve this purpose

  9. FUZZY DECISION ANALYSIS FOR INTEGRATED ENVIRONMENTAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION

    Science.gov (United States)

A fuzzy decision analysis method for integrating ecological indicators is developed. It is a combination of a fuzzy ranking method and the Analytic Hierarchy Process (AHP). The method is capable of ranking ecosystems in terms of environmental conditions and suggesting cumula...
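The AHP component named in this abstract derives priority weights for the indicators from a pairwise comparison matrix. A common approximation is the row geometric-mean method; the comparison values below are illustrative assumptions, not data from the assessment:

```python
import math

def ahp_priorities(matrix):
    """Approximate the AHP priority vector of a reciprocal pairwise
    comparison matrix via the row geometric-mean method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Illustrative 3x3 reciprocal matrix: criterion 1 strongly preferred
m = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_priorities(m)  # normalized weights, summing to 1
```

In the paper's method, weights like these would then be combined with fuzzy indicator rankings to score each ecosystem.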

  10. Integrated corridor management initiative : demonstration phase evaluation - Dallas technical capability analysis test plan.

    Science.gov (United States)

    This report presents the test plan for conducting the Technical Capability Analysis for the United States : Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor : Management (ICM) Initiative Demonstration. The ...

  11. Integrated corridor management initiative : demonstration phase evaluation, San Diego technical capability analysis test plan.

    Science.gov (United States)

    2012-08-01

    This report presents the test plan for conducting the Technical Capability Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  12. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Science.gov (United States)

    2010-01-01

    ...; (iv) Potential accident sequences caused by process deviations or other events internal to the... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process... this safety program; namely, process safety information, integrated safety analysis, and management...

  13. Integration of targeted health interventions into health systems: a conceptual framework for analysis.

    Science.gov (United States)

    Atun, Rifat; de Jongh, Thyra; Secci, Federica; Ohiri, Kelechi; Adeyi, Olusoji

    2010-03-01

The benefits of integrating programmes that emphasize specific interventions into health systems to improve health outcomes have been widely debated. This debate has been driven by narrow binary considerations of integrated (horizontal) versus non-integrated (vertical) programmes, and characterized by polarization of views, with protagonists for and against integration arguing the relative merits of each approach. The presence of both integrated and non-integrated programmes in many countries suggests benefits to each approach. While the terms 'vertical' and 'integrated' are widely used, they each describe a range of phenomena. In practice the dichotomy between vertical and horizontal is not rigid, and the extent of verticality or integration varies between programmes. However, systematic analysis of the relative merits of integration in various contexts and for different interventions is complicated, as there is no commonly accepted definition of 'integration': a term loosely used to describe a variety of organizational arrangements for a range of programmes in different settings. We present an analytical framework which enables deconstruction of the term integration into multiple facets, each corresponding to a critical health system function. Our conceptual framework builds on theoretical propositions and empirical research in innovation studies, in particular the adoption and diffusion of innovations within health systems, and builds on our own earlier empirical research. It brings together the critical elements that affect adoption, diffusion and assimilation of a health intervention, and in doing so enables systematic and holistic exploration of the extent to which different interventions are integrated in varied settings and the reasons for the variation. The conceptual framework and the analytical approach we propose are intended to facilitate analysis in evaluative and formative studies of, and policies on, integration, for use in systematically comparing and

  14. Integrated analysis of rock mass deformation within shaft protective pillar

    Directory of Open Access Journals (Sweden)

    Ewa Warchala

    2016-01-01

    Full Text Available The paper presents an analysis of the rock mass deformation resulting from mining in the vicinity of the shaft protection pillar. A methodology of deformation prediction is based on a deterministic method using Finite Element Method (FEM. The FEM solution is based on the knowledge of the geomechanical properties of the various geological formations, tectonic faults, types of mining systems, and the complexity of the behaviour of the rock mass. The analysis gave the stress and displacement fields in the rock mass. Results of the analysis will allow for design of an optimal mining system. The analysis is illustrated by an example of the shaft R-VIII Rudna Mine KGHM Polish Copper SA.

  15. Integrating structure, conduct and performance into value chain analysis

    NARCIS (Netherlands)

    Santana De Figueiredo Junior, H.; Meuwissen, M.P.M.; Oude Lansink, A.G.J.M.

    2014-01-01

    Value chain analysis has been adopted by several research and funding institutions for analysing local development opportunities. Development practitioners, however, are still looking for more solid grounds for value chain strategy development, especially since the expected outcomes of

  16. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
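The thermogram-integration step described in this abstract ultimately reduces to integrating each baseline-corrected injection peak to a heat value. A simplified numeric sketch of that core step (NITPIC's automated peak-shape and baseline analysis is far more sophisticated; the units and constant baseline here are assumptions for illustration):

```python
def injection_heat(times_s, power_ucal_per_s, baseline_ucal_per_s):
    """Integrate one ITC injection peak with the trapezoidal rule,
    after subtracting an estimated (here constant) baseline.

    Returns the injection heat in ucal for a thermogram segment given
    as sampled times (s) and differential power (ucal/s).
    """
    heat = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        y0 = power_ucal_per_s[i - 1] - baseline_ucal_per_s
        y1 = power_ucal_per_s[i] - baseline_ucal_per_s
        heat += 0.5 * (y0 + y1) * dt
    return heat

# Demo: constant 2 ucal/s of excess power for 2 s yields 4 ucal
q = injection_heat([0.0, 1.0, 2.0], [3.0, 3.0, 3.0], 1.0)
```

The per-injection heats produced this way form the isotherm that the global analysis in SEDPHAT then fits.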

  17. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

Spectral analysis of seismic events represents a method for predicting great earthquakes. The seismic signal is not a sinusoidal signal; therefore, it is necessary to find a method for the best approximation of the real signal by a sinusoidal signal. The 'Quanterra' broadband station allows data access in numerical and/or graphical form. With the numerical form we can easily make a computer program (MSOFFICE-EXCEL) for spectral analysis. (authors)
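Approximating a real signal by sinusoids, as the abstract describes, amounts to evaluating its Fourier components and picking the dominant one. A small stdlib-only sketch of that idea (a naive DFT for illustration, not the authors' Excel implementation):

```python
import cmath
import math

def dft_amplitudes(samples, dt):
    """Naive DFT: return (frequency_Hz, amplitude) pairs up to Nyquist
    for a real signal sampled at interval dt seconds."""
    n = len(samples)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(samples[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
        # DC and Nyquist bins are not doubled
        amp = abs(s) / n if k in (0, n // 2) else 2 * abs(s) / n
        spec.append((k / (n * dt), amp))
    return spec

def dominant_frequency(samples, dt):
    """Frequency (Hz) of the largest non-DC spectral amplitude."""
    return max(dft_amplitudes(samples, dt)[1:], key=lambda fa: fa[1])[0]

# Demo: a pure 2 Hz sine sampled at 16 Hz for 1 s
samples = [math.sin(2 * math.pi * 2.0 * j / 16) for j in range(16)]
peak_hz = dominant_frequency(samples, 1 / 16)
```

For real seismograms an FFT would replace this O(n^2) loop, but the best-fit-sinusoid interpretation is the same.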

  18. PHIDIAS: a pathogen-host interaction data integration and analysis system

    OpenAIRE

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated ...

  19. Delight2 Daylighting Analysis in Energy Plus: Integration and Preliminary User Results

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, William L.; Hitchcock, Robert J.

    2005-04-26

    DElight is a simulation engine for daylight and electric lighting system analysis in buildings. DElight calculates interior illuminance levels from daylight, and the subsequent contribution required from electric lighting to meet a desired interior illuminance. DElight has been specifically designed to integrate with building thermal simulation tools. This paper updates the DElight capability set, the status of integration into the simulation tool EnergyPlus, and describes a sample analysis of a simple model from the user perspective.

  20. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

INTRODUCTION Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy framework (ISF) combines all major concepts. PURPOSE A new integrated strategy fr...

  1. Computational analysis of battery optimized reactor integral system

    International Nuclear Information System (INIS)

    Hwang, J. S.; Son, H. M.; Jeong, W. S.; Kim, T. W.; Suh, K. Y.

    2007-01-01

Battery Optimized Reactor Integral System (BORIS) is being developed as a multi-purpose fast spectrum reactor cooled by lead (Pb). BORIS is an integral optimized reactor with an ultra-long-life core. BORIS aims to satisfy various energy demands while maintaining inherent safety with the primary coolant Pb and improving economics. BORIS is being designed to generate 23 MWth with 10 MWe for at least twenty consecutive years without refueling and to meet the Generation IV Nuclear Energy System goals of sustainability, safety, reliability, and economics. BORIS is conceptualized to be used as the main power and heat source for remote areas and barren lands, and is also considered for deployment for desalination purposes. BORIS, based on modular components to be viable for rapid construction and easy maintenance, adopts an integrated heat exchanger system operated by natural circulation of Pb without pumps to realize a small-sized reactor. The BORIS primary system is designed through an optimization study. Thermal hydraulic characteristics during a reactor steady state, with heat source and sink provided by the core and heat exchanger, respectively, have been analyzed by utilizing a computational fluid dynamics code and hand calculations based on first principles. This paper analyzes a transient condition of the BORIS primary system. The Pb coolant was selected for its lower chemical activity with air or water than sodium (Na) and its good thermal characteristics. Reactor transient conditions such as core blockage, heat exchanger failure, and loss of heat sink were selected for this study. Blockage in the core or its inlet structure causes localized flow starvation in one or several fuel assemblies. Coolant loop blockages cause a more or less uniform flow reduction across the core, which may trigger a coolant temperature transient. General conservation equations were applied to model the primary system transients. Numerical approaches were adopted to discretize the governing
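The first-principles hand calculations mentioned in this abstract can be illustrated with a lumped-parameter energy balance on the primary coolant: during a loss of heat sink, core power keeps flowing in and the coolant temperature ramps up. All numbers below are illustrative assumptions, not BORIS design data:

```python
def coolant_transient(t_init_k, core_power_w, hx_power_w, mass_kg,
                      cp_j_per_kg_k, dt_s, steps):
    """Lumped energy balance m*cp*dT/dt = Q_core - Q_hx, advanced with
    explicit Euler steps; returns the coolant temperature history (K)."""
    temps = [t_init_k]
    for _ in range(steps):
        d_temp = (core_power_w - hx_power_w) * dt_s / (mass_kg * cp_j_per_kg_k)
        temps.append(temps[-1] + d_temp)
    return temps

# Demo loss of heat sink: Q_hx = 0, so temperature rises linearly
# (illustrative magnitudes: 1 kW core power, 10 kg of coolant)
temps = coolant_transient(600.0, 1000.0, 0.0, 10.0, 100.0, 1.0, 5)
```

A real analysis would discretize the full conservation equations over the loop geometry; this single-node balance only shows the direction and scale of the response.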

  2. Formal Analysis of Key Integrity in PKCS#11

    Science.gov (United States)

    Falcone, Andrea; Focardi, Riccardo

PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted-key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  3. Structural integrity analysis of the 224U elevator mothballing

    Energy Technology Data Exchange (ETDEWEB)

    Boehnke, W.M.

    1994-11-18

As part of the preparation of Building 224U for turnover to Decontamination and Decommissioning, it is necessary to place the elevator in a mothballed condition so that it can be reactivated for use after 10 to 25 years. This mothballing is going to be accomplished by landing the counterweight on wooden timbers and suspending the elevator cab with wire rope or chain slings. This will take the load off the cables and make the elevator relatively easy to reactivate. The objective of this Supporting Document is to verify the structural integrity of all of the load-bearing components involved in mothballing the 224U Building elevator. Building 224U is part of the UO{sub 3} Plant where uranyl nitrates from the PUREX Plant were converted to UO{sub 3} powder.

  4. Integration of Formal Job Hazard Analysis and ALARA Work Practice

    CERN Document Server

    Nelsen, D P

    2002-01-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygien...

  5. Process integrated modelling for steelmaking Life Cycle Inventory analysis

    International Nuclear Information System (INIS)

    Iosif, Ana-Maria; Hanrot, Francois; Ablitzer, Denis

    2008-01-01

    During recent years, strict environmental regulations have been implemented by governments for the steelmaking industry in order to reduce their environmental impact. In the frame of the ULCOS project, we have developed a new methodological framework which combines the process integrated modelling approach with Life Cycle Assessment (LCA) method in order to carry out the Life Cycle Inventory of steelmaking. In the current paper, this new concept has been applied to the sinter plant which is the most polluting steelmaking process. It has been shown that this approach is a powerful tool to make the collection of data easier, to save time and to provide reliable information concerning the environmental diagnostic of the steelmaking processes

  6. Integrative Analysis of the Physical Transport Network into Australia.

    Directory of Open Access Journals (Sweden)

    Robert C Cope

Full Text Available Effective biosecurity is necessary to protect nations and their citizens from a variety of threats, including emerging infectious diseases, agricultural or environmental pests and pathogens, and illegal wildlife trade. The physical pathways by which these threats are transported internationally, predominantly shipping and air traffic, have undergone significant growth and changes in spatial distributions in recent decades. An understanding of the specific pathways and donor-traffic hotspots created by this integrated physical transport network is vital for the development of effective biosecurity strategies into the future. In this study, we analysed the physical transport network into Australia over the period 1999-2012. Seaborne and air traffic were weighted to calculate a "weighted cumulative impact" score for each source region worldwide, each year. High-risk source regions, and those source regions that underwent substantial changes in risk over the study period, were determined. An overall risk ranking was calculated by integrating across all possible weighting combinations. The source regions having greatest overall physical connectedness with Australia were Singapore, which is a global transport hub, and the North Island of New Zealand, a close regional trading partner of Australia. Both those regions with large amounts of traffic across multiple vectors (e.g., Hong Kong) and those with high levels of traffic of only one type (e.g., Bali, Indonesia, with respect to passenger flights) were represented among high-risk source regions. These data provide a baseline model for the transport of individuals and commodities against which the effectiveness of biosecurity controls may be assessed, and are a valuable tool in the development of future biosecurity policy.
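Integrating a ranking across all possible weighting combinations, as this abstract describes, can be sketched by scoring each source region over a grid of seaborne/air weights and averaging. The scoring and data below are hypothetical stand-ins, not the study's actual weighted-cumulative-impact formula:

```python
def overall_ranking(sea, air, weight_steps=11):
    """Rank source regions by the score w*sea + (1-w)*air, averaged
    over a grid of weights w in [0, 1].

    sea/air: dicts mapping region -> normalised traffic volume.
    Returns regions sorted from highest to lowest mean score.
    """
    mean_scores = {}
    for region in sea:
        scores = [
            (i / (weight_steps - 1)) * sea[region]
            + (1 - i / (weight_steps - 1)) * air[region]
            for i in range(weight_steps)
        ]
        mean_scores[region] = sum(scores) / weight_steps
    return sorted(sea, key=mean_scores.get, reverse=True)

# Hypothetical normalised volumes for three source regions
ranked = overall_ranking({"SG": 0.9, "NZ": 0.8, "HK": 0.5},
                         {"SG": 0.9, "NZ": 0.7, "HK": 0.6})
```

Averaging over the weight grid makes the final ordering robust to any single choice of how sea and air traffic are traded off, which is the point of integrating over weighting combinations.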

  7. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both the accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem in order to: 1) demonstrate that it eliminates the presently costly process of evaluating time integrators for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology to examination of the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)
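The interaction between step size and stability that this abstract highlights is easiest to see on the linear test equation y' = -lam*y: an explicit integrator diverges once the step exceeds its stability limit, while a stiffly stable implicit method decays for any step. This is a textbook illustration, not the paper's evaluation technique:

```python
def explicit_euler(lam, y0, h, steps):
    """Forward Euler on y' = -lam*y: y_{n+1} = (1 - h*lam) * y_n.
    Stable only while h < 2/lam."""
    y = y0
    for _ in range(steps):
        y = y * (1 - h * lam)
    return y

def implicit_euler(lam, y0, h, steps):
    """Backward Euler on y' = -lam*y: y_{n+1} = y_n / (1 + h*lam).
    A-stable: the iterate decays for any h > 0."""
    y = y0
    for _ in range(steps):
        y = y / (1 + h * lam)
    return y

# Stiff case lam = 100 with h = 0.1 (h*lam = 10, far past the
# explicit stability limit of 2): explicit blows up, implicit decays.
y_explicit = explicit_euler(100.0, 1.0, 0.1, 5)
y_implicit = implicit_euler(100.0, 1.0, 0.1, 5)
```

For nonlinear structures the stability limit varies with the local tangent stiffness, which is exactly why the paper argues for local stability analysis rather than fixed-step rules of thumb.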

  8. FACILITATING INTEGRATED SPATIO-TEMPORAL VISUALIZATION AND ANALYSIS OF HETEROGENEOUS ARCHAEOLOGICAL AND PALAEOENVIRONMENTAL RESEARCH DATA

    Directory of Open Access Journals (Sweden)

    C. Willmes

    2012-07-01

Full Text Available In the context of the Collaborative Research Centre 806 "Our way to Europe" (CRC806, a research database is developed for integrating data from the disciplines of archaeology, the geosciences and the cultural sciences to facilitate integrated access to heterogeneous data sources. A practice-oriented data integration concept and its implementation are presented in this contribution. The data integration approach is based on the application of Semantic Web technology and is applied to the domains of archaeological and palaeoenvironmental data. The aim is to provide integrated spatio-temporal access to an existing wealth of data to facilitate research on the basis of the integrated data. For the web portal of the CRC806 research database (CRC806-Database, a number of interfaces and applications have been evaluated, developed and implemented for exposing the data to interactive analysis and visualizations.

  9. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-30

The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes, i.e., for assessing the potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges correspond to variation about the mean test values of between 26% and 42%. This variation is attributed to differences in operator, method, and environment, as well as to the use of different instruments of varying age. The results appear to be a good representation of the broader safety testing community, given the range of methods, instruments, and environments included in the IDCA Proficiency Test.
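
The spread statistic quoted above (range as a percentage of the mean) can be sketched in a few lines. The sample values below are invented drop-height results for illustration, not IDCA data.

```python
# Range of inter-laboratory results expressed as a percentage of their mean,
# the kind of "variation about the mean" figure quoted in the abstract.
def relative_range_pct(values):
    mean = sum(values) / len(values)
    return 100.0 * (max(values) - min(values)) / mean

# Hypothetical impact-sensitivity results (drop heights, cm) from four labs:
impact_heights_cm = [17.0, 20.0, 24.0, 22.0]
print(round(relative_range_pct(impact_heights_cm), 1))  # 33.7
```

For these invented numbers the relative range is 33.7%, i.e., within the 26% to 42% band reported for the proficiency test.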

  10. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that arranges, easily and reliably, the large volumes of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  11. Micro Total Analysis Systems: Microfluidic aspects, integration concept and applications

    NARCIS (Netherlands)

    van den Berg, Albert; Lammerink, Theodorus S.J.

    1997-01-01

    In this contribution three aspects of miniaturized total analysis systems (µTAS) are described and discussed in detail. First, an overview of microfabricated components for fluid handling is given. A description of the importance of sampling- and fluid-handling techniques is followed by details of

  12. Thermodynamic analysis of a novel integrated solar combined cycle

    International Nuclear Information System (INIS)

    Li, Yuanyuan; Yang, Yongping

    2014-01-01

Highlights: • A novel ISCC scheme with two-stage DSG fields has been proposed and analyzed. • HRSG and steam turbine working parameters have been optimized to match the solar integration. • The new scheme exhibits higher solar shares in the power output and solar-to-electricity efficiency. • Thermodynamic performances of the new and reference systems have been investigated and compared. - Abstract: Integrated solar combined cycle (ISCC) systems have become increasingly popular due to their high fuel and solar energy utilization efficiencies. Conventional ISCC systems with direct steam generation (DSG) have only one-stage solar input. A novel ISCC with DSG system is proposed and analyzed in this paper. The new system consists of two-stage solar input, which significantly increases the solar share in the total power output. Moreover, how and where solar energy is input into the ISCC system affects the solar and overall system efficiencies, which is also analyzed in the paper. It has been found that using solar heat to supply the latent heat of vaporization of the feedwater is superior to using it for sensible heating purposes (e.g., superheating steam). The study shows that: (1) producing both high- and low-pressure saturated steam in the DSG trough collector can be an efficient way to improve process and system performance; (2) for a given live steam pressure, the optimum secondary and reheat steam conditions can be matched to reach the highest system thermal efficiency and net solar-to-electricity efficiency; (3) the net solar-to-electricity efficiency can reach up to 30% in the novel two-stage ISCC system, higher than in the one-stage ISCC power plant; (4) compared with the conventional combined cycle gas turbine (CCGT) power system, a lower stack temperature can be achieved, owing to the elimination of the approach-temperature-difference constraint, resulting in a better thermal match in the heat recovery steam generator.
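
The net solar-to-electricity efficiency figure cited above is commonly defined as the extra electric power the ISCC produces, relative to the same combined cycle without solar input, divided by the incident solar heat. A minimal sketch with illustrative (not the paper's) numbers:

```python
# Net solar-to-electricity efficiency: incremental electric output attributable
# to the solar field divided by the solar heat input. Values are illustrative.
def net_solar_to_electricity(p_iscc_mw, p_cc_mw, q_solar_mw):
    return (p_iscc_mw - p_cc_mw) / q_solar_mw

eff = net_solar_to_electricity(p_iscc_mw=430.0, p_cc_mw=400.0, q_solar_mw=100.0)
print(f"{eff:.0%}")  # 30%, the level reported for the two-stage scheme
```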

  13. Comprehensive, Integrative Genomic Analysis of Diffuse Lower-Grade Gliomas.

    Science.gov (United States)

    Brat, Daniel J; Verhaak, Roel G W; Aldape, Kenneth D; Yung, W K Alfred; Salama, Sofie R; Cooper, Lee A D; Rheinbay, Esther; Miller, C Ryan; Vitucci, Mark; Morozova, Olena; Robertson, A Gordon; Noushmehr, Houtan; Laird, Peter W; Cherniack, Andrew D; Akbani, Rehan; Huse, Jason T; Ciriello, Giovanni; Poisson, Laila M; Barnholtz-Sloan, Jill S; Berger, Mitchel S; Brennan, Cameron; Colen, Rivka R; Colman, Howard; Flanders, Adam E; Giannini, Caterina; Grifford, Mia; Iavarone, Antonio; Jain, Rajan; Joseph, Isaac; Kim, Jaegil; Kasaian, Katayoon; Mikkelsen, Tom; Murray, Bradley A; O'Neill, Brian Patrick; Pachter, Lior; Parsons, Donald W; Sougnez, Carrie; Sulman, Erik P; Vandenberg, Scott R; Van Meir, Erwin G; von Deimling, Andreas; Zhang, Hailei; Crain, Daniel; Lau, Kevin; Mallery, David; Morris, Scott; Paulauskis, Joseph; Penny, Robert; Shelton, Troy; Sherman, Mark; Yena, Peggy; Black, Aaron; Bowen, Jay; Dicostanzo, Katie; Gastier-Foster, Julie; Leraas, Kristen M; Lichtenberg, Tara M; Pierson, Christopher R; Ramirez, Nilsa C; Taylor, Cynthia; Weaver, Stephanie; Wise, Lisa; Zmuda, Erik; Davidsen, Tanja; Demchok, John A; Eley, Greg; Ferguson, Martin L; Hutter, Carolyn M; Mills Shaw, Kenna R; Ozenberger, Bradley A; Sheth, Margi; Sofia, Heidi J; Tarnuzzer, Roy; Wang, Zhining; Yang, Liming; Zenklusen, Jean Claude; Ayala, Brenda; Baboud, Julien; Chudamani, Sudha; Jensen, Mark A; Liu, Jia; Pihl, Todd; Raman, Rohini; Wan, Yunhu; Wu, Ye; Ally, Adrian; Auman, J Todd; Balasundaram, Miruna; Balu, Saianand; Baylin, Stephen B; Beroukhim, Rameen; Bootwalla, Moiz S; Bowlby, Reanne; Bristow, Christopher A; Brooks, Denise; Butterfield, Yaron; Carlsen, Rebecca; Carter, Scott; Chin, Lynda; Chu, Andy; Chuah, Eric; Cibulskis, Kristian; Clarke, Amanda; Coetzee, Simon G; Dhalla, Noreen; Fennell, Tim; Fisher, Sheila; Gabriel, Stacey; Getz, Gad; Gibbs, Richard; Guin, Ranabir; Hadjipanayis, Angela; Hayes, D Neil; Hinoue, Toshinori; Hoadley, Katherine; Holt, Robert A; Hoyle, Alan P; Jefferys, 
Stuart R; Jones, Steven; Jones, Corbin D; Kucherlapati, Raju; Lai, Phillip H; Lander, Eric; Lee, Semin; Lichtenstein, Lee; Ma, Yussanne; Maglinte, Dennis T; Mahadeshwar, Harshad S; Marra, Marco A; Mayo, Michael; Meng, Shaowu; Meyerson, Matthew L; Mieczkowski, Piotr A; Moore, Richard A; Mose, Lisle E; Mungall, Andrew J; Pantazi, Angeliki; Parfenov, Michael; Park, Peter J; Parker, Joel S; Perou, Charles M; Protopopov, Alexei; Ren, Xiaojia; Roach, Jeffrey; Sabedot, Thaís S; Schein, Jacqueline; Schumacher, Steven E; Seidman, Jonathan G; Seth, Sahil; Shen, Hui; Simons, Janae V; Sipahimalani, Payal; Soloway, Matthew G; Song, Xingzhi; Sun, Huandong; Tabak, Barbara; Tam, Angela; Tan, Donghui; Tang, Jiabin; Thiessen, Nina; Triche, Timothy; Van Den Berg, David J; Veluvolu, Umadevi; Waring, Scot; Weisenberger, Daniel J; Wilkerson, Matthew D; Wong, Tina; Wu, Junyuan; Xi, Liu; Xu, Andrew W; Yang, Lixing; Zack, Travis I; Zhang, Jianhua; Aksoy, B Arman; Arachchi, Harindra; Benz, Chris; Bernard, Brady; Carlin, Daniel; Cho, Juok; DiCara, Daniel; Frazer, Scott; Fuller, Gregory N; Gao, JianJiong; Gehlenborg, Nils; Haussler, David; Heiman, David I; Iype, Lisa; Jacobsen, Anders; Ju, Zhenlin; Katzman, Sol; Kim, Hoon; Knijnenburg, Theo; Kreisberg, Richard Bailey; Lawrence, Michael S; Lee, William; Leinonen, Kalle; Lin, Pei; Ling, Shiyun; Liu, Wenbin; Liu, Yingchun; Liu, Yuexin; Lu, Yiling; Mills, Gordon; Ng, Sam; Noble, Michael S; Paull, Evan; Rao, Arvind; Reynolds, Sheila; Saksena, Gordon; Sanborn, Zack; Sander, Chris; Schultz, Nikolaus; Senbabaoglu, Yasin; Shen, Ronglai; Shmulevich, Ilya; Sinha, Rileen; Stuart, Josh; Sumer, S Onur; Sun, Yichao; Tasman, Natalie; Taylor, Barry S; Voet, Doug; Weinhold, Nils; Weinstein, John N; Yang, Da; Yoshihara, Kosuke; Zheng, Siyuan; Zhang, Wei; Zou, Lihua; Abel, Ty; Sadeghi, Sara; Cohen, Mark L; Eschbacher, Jenny; Hattab, Eyas M; Raghunathan, Aditya; Schniederjan, Matthew J; Aziz, Dina; Barnett, Gene; Barrett, Wendi; Bigner, Darell D; Boice, Lori; 
Brewer, Cathy; Calatozzolo, Chiara; Campos, Benito; Carlotti, Carlos Gilberto; Chan, Timothy A; Cuppini, Lucia; Curley, Erin; Cuzzubbo, Stefania; Devine, Karen; DiMeco, Francesco; Duell, Rebecca; Elder, J Bradley; Fehrenbach, Ashley; Finocchiaro, Gaetano; Friedman, William; Fulop, Jordonna; Gardner, Johanna; Hermes, Beth; Herold-Mende, Christel; Jungk, Christine; Kendler, Ady; Lehman, Norman L; Lipp, Eric; Liu, Ouida; Mandt, Randy; McGraw, Mary; Mclendon, Roger; McPherson, Christopher; Neder, Luciano; Nguyen, Phuong; Noss, Ardene; Nunziata, Raffaele; Ostrom, Quinn T; Palmer, Cheryl; Perin, Alessandro; Pollo, Bianca; Potapov, Alexander; Potapova, Olga; Rathmell, W Kimryn; Rotin, Daniil; Scarpace, Lisa; Schilero, Cathy; Senecal, Kelly; Shimmel, Kristen; Shurkhay, Vsevolod; Sifri, Suzanne; Singh, Rosy; Sloan, Andrew E; Smolenski, Kathy; Staugaitis, Susan M; Steele, Ruth; Thorne, Leigh; Tirapelli, Daniela P C; Unterberg, Andreas; Vallurupalli, Mahitha; Wang, Yun; Warnick, Ronald; Williams, Felicia; Wolinsky, Yingli; Bell, Sue; Rosenberg, Mara; Stewart, Chip; Huang, Franklin; Grimsby, Jonna L; Radenbaugh, Amie J; Zhang, Jianan

    2015-06-25

    Diffuse low-grade and intermediate-grade gliomas (which together make up the lower-grade gliomas, World Health Organization grades II and III) have highly variable clinical behavior that is not adequately predicted on the basis of histologic class. Some are indolent; others quickly progress to glioblastoma. The uncertainty is compounded by interobserver variability in histologic diagnosis. Mutations in IDH, TP53, and ATRX and codeletion of chromosome arms 1p and 19q (1p/19q codeletion) have been implicated as clinically relevant markers of lower-grade gliomas. We performed genomewide analyses of 293 lower-grade gliomas from adults, incorporating exome sequence, DNA copy number, DNA methylation, messenger RNA expression, microRNA expression, and targeted protein expression. These data were integrated and tested for correlation with clinical outcomes. Unsupervised clustering of mutations and data from RNA, DNA-copy-number, and DNA-methylation platforms uncovered concordant classification of three robust, nonoverlapping, prognostically significant subtypes of lower-grade glioma that were captured more accurately by IDH, 1p/19q, and TP53 status than by histologic class. Patients who had lower-grade gliomas with an IDH mutation and 1p/19q codeletion had the most favorable clinical outcomes. Their gliomas harbored mutations in CIC, FUBP1, NOTCH1, and the TERT promoter. Nearly all lower-grade gliomas with IDH mutations and no 1p/19q codeletion had mutations in TP53 (94%) and ATRX inactivation (86%). The large majority of lower-grade gliomas without an IDH mutation had genomic aberrations and clinical behavior strikingly similar to those found in primary glioblastoma. The integration of genomewide data from multiple platforms delineated three molecular classes of lower-grade gliomas that were more concordant with IDH, 1p/19q, and TP53 status than with histologic class. Lower-grade gliomas with an IDH mutation either had 1p/19q codeletion or carried a TP53 mutation. Most

  14. Probabilistic Steady-State Operation and Interaction Analysis of Integrated Electricity, Gas and Heating Systems

    Directory of Open Access Journals (Sweden)

    Lun Yang

    2018-04-01

Full Text Available The existing studies on probabilistic steady-state analysis of integrated energy systems (IES) are limited to integrated electricity and gas networks or integrated electricity and heating networks. This paper proposes a probabilistic steady-state analysis of integrated electricity, gas and heating networks (EGH-IES). Four typical operation modes of an EGH-IES are presented first. The probabilistic energy flow problem of the EGH-IES, considering its operation modes and correlated uncertainties in wind/solar power and electricity/gas/heat loads, is then formulated and solved by the Monte Carlo method based on Latin hypercube sampling and the Nataf transformation. Numerical simulations are conducted on a sample EGH-IES working in the “electricity/gas following heat” mode to verify the probabilistic analysis proposed in this paper and to study the effects of uncertainties and correlations on the operation of the EGH-IES, especially uncertainty transmission among the subnetworks.
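
The sampling step named above (Latin hypercube sampling combined with a Nataf-style correlation transform) can be sketched as follows. Stratified uniform samples are mapped through the standard normal inverse CDF, correlated with a Cholesky factor of the target correlation matrix (the core of the Nataf transformation when the marginals are normal), then scaled to physical units. The load statistics and correlation value are illustrative, not the paper's data.

```python
import numpy as np
from statistics import NormalDist

def lhs_correlated_normal(n, mean, std, corr, rng):
    d = len(mean)
    # Latin hypercube in [0,1]^d: one stratified sample per stratum and dimension
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    u = (strata + rng.random((n, d))) / n
    z = np.vectorize(NormalDist().inv_cdf)(u)   # uniform -> standard normal
    z = z @ np.linalg.cholesky(corr).T          # impose target correlation (Nataf step)
    return mean + std * z

rng = np.random.default_rng(0)
mean = np.array([50.0, 30.0])     # e.g. electric load and heat load (MW), invented
std = np.array([5.0, 3.0])
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
samples = lhs_correlated_normal(2000, mean, std, corr, rng)
print(round(float(np.corrcoef(samples.T)[0, 1]), 1))   # close to the target 0.6
```

Each Monte Carlo energy-flow run would then evaluate the deterministic EGH-IES model at one row of `samples`; for non-normal marginals the Nataf transformation additionally maps the correlated normals through the marginal inverse CDFs.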

  15. An integrated 3D design, modeling and analysis resource for SSC detector systems

    International Nuclear Information System (INIS)

    DiGiacomo, N.J.; Adams, T.; Anderson, M.K.; Davis, M.; Easom, B.; Gliozzi, J.; Hale, W.M.; Hupp, J.; Killian, K.; Krohn, M.; Leitch, R.; Lajczok, M.; Mason, L.; Mitchell, J.; Pohlen, J.; Wright, T.

    1989-01-01

    Integrated computer aided engineering and design (CAE/CAD) is having a significant impact on the way design, modeling and analysis is performed, from system concept exploration and definition through final design and integration. Experience with integrated CAE/CAD in high technology projects of scale and scope similar to SSC detectors leads them to propose an integrated computer-based design, modeling and analysis resource aimed specifically at SSC detector system development. The resource architecture emphasizes value-added contact with data and efficient design, modeling and analysis of components, sub-systems or systems with fidelity appropriate to the task. They begin with a general examination of the design, modeling and analysis cycle in high technology projects, emphasizing the transition from the classical islands of automation to the integrated CAE/CAD-based approach. They follow this with a discussion of lessons learned from various attempts to design and implement integrated CAE/CAD systems in scientific and engineering organizations. They then consider the requirements for design, modeling and analysis during SSC detector development, and describe an appropriate resource architecture. They close with a report on the status of the resource and present some results that are indicative of its performance. 10 refs., 7 figs

  16. Employee commitment and motivation: a conceptual analysis and integrative model.

    Science.gov (United States)

Meyer, John P; Becker, Thomas E; Vandenberghe, Christian

    2004-12-01

Theorists and researchers interested in employee commitment and motivation have not made optimal use of each other's work. Commitment researchers seldom address the motivational processes through which commitment affects behavior, and motivation researchers have not recognized important distinctions in the forms, foci, and bases of commitment. To encourage greater cross-fertilization, the authors present an integrative framework in which commitment is presented as one of several energizing forces for motivated behavior. E. A. Locke's (1997) model of the work motivation process and J. P. Meyer and L. Herscovitch's (2001) model of workplace commitments serve as the foundation for the development of this new framework. To facilitate the merger, a new concept, goal regulation, is derived from self-determination theory (E. L. Deci & R. M. Ryan, 1985) and regulatory focus theory (E. T. Higgins, 1997). By including goal regulation, it is acknowledged that motivated behavior can be accompanied by different mindsets that have particularly important implications for the explanation and prediction of discretionary work behavior. (2004 APA, all rights reserved)

  17. Integrator Performance Analysis In Solving Stiff Differential Equation System

    International Nuclear Information System (INIS)

    B, Alhadi; Basaruddin, T.

    2001-01-01

In this paper we discuss a four-stage index-2 singly diagonally implicit Runge-Kutta (SDIRK) method, which is used to solve stiff ordinary differential equations (SODE). Stiff problems require a method whose step size is not restricted by stability. We require the SDIRK method to be A-stable, i.e., to have no stability restriction when solving y' = λy with Re λ < 0 and h > 0; by choosing a suitable stability function we can determine an appropriate constant (γ) to formulate the SDIRK integrator for solving SODE. We select the second internal stage as an embedded method to provide a low-order estimate for the error predictor. The step-size selection strategy is adopted from that proposed by Hall (1996: 6), and the algorithm developed in this paper is implemented in MATLAB 5.3 running under Windows 95. Performance, in terms of local truncation error, accuracy, and efficiency, was evaluated from statistics on the total number of steps, the number of function calls, the average number of Newton iterations, and the elapsed time. As a result, our numerical experiments show that SDIRK is unconditionally stable. Using Hall's step-size strategy, the method can be implemented efficiently, provided that suitable parameters are used.
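
A minimal member of the SDIRK family discussed above can be sketched on the stiff test problem y' = λy. The scheme below is the well-known two-stage, L-stable SDIRK method with γ = 1 − 1/√2 (Alexander's SDIRK2), used here only to illustrate the family's behavior; it is not the authors' four-stage index-2 scheme.

```python
import math

GAMMA = 1.0 - 1.0 / math.sqrt(2.0)   # diagonal coefficient of SDIRK2

def sdirk2_step(lam, y, h):
    # For f(y) = lam*y each implicit stage equation is linear and can be
    # solved directly; a general f would need Newton iteration per stage.
    k1 = lam * y / (1.0 - h * GAMMA * lam)
    k2 = lam * (y + h * (1.0 - GAMMA) * k1) / (1.0 - h * GAMMA * lam)
    return y + h * ((1.0 - GAMMA) * k1 + GAMMA * k2)

# Very stiff decay: the step size is far larger than 1/|lambda|
y, lam, h = 1.0, -1.0e4, 0.1
for _ in range(10):
    y = sdirk2_step(lam, y, h)
print(abs(y) < 1.0)   # True: stability does not restrict the step size
```

An explicit method at this step size would blow up immediately (|1 + hλ| = 999 here), which is exactly the restriction A-stable SDIRK schemes remove.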

  18. Nonlinear Analysis and Intelligent Control of Integrated Vehicle Dynamics

    Directory of Open Access Journals (Sweden)

    C. Huang

    2014-01-01

    Full Text Available With increasing and more stringent requirements for advanced vehicle integration, including vehicle dynamics and control, traditional control and optimization strategies may not qualify for many applications. This is because, among other factors, they do not consider the nonlinear characteristics of practical systems. Moreover, the vehicle wheel model has some inadequacies regarding the sideslip angle, road adhesion coefficient, vertical load, and velocity. In this paper, an adaptive neural wheel network is introduced, and the interaction between the lateral and vertical dynamics of the vehicle is analyzed. By means of nonlinear analyses such as the use of a bifurcation diagram and the Lyapunov exponent, the vehicle is shown to exhibit complicated motions with increasing forward speed. Furthermore, electric power steering (EPS and active suspension system (ASS, which are based on intelligent control, are used to reduce the nonlinear effect, and a negotiation algorithm is designed to manage the interdependences and conflicts among handling stability, driving smoothness, and safety. Further, a rapid control prototype was built using the hardware-in-the-loop simulation platform dSPACE and used to conduct a real vehicle test. The results of the test were consistent with those of the simulation, thereby validating the proposed control.

  19. Social Network Analysis and Nutritional Behavior: An Integrated Modeling Approach.

    Science.gov (United States)

    Senior, Alistair M; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J

    2016-01-01

Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent research combining state-space models of nutritional geometry with agent-based models (ABMs) shows how nutrient-targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First, we show how nutritionally explicit ABMs that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly, the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interactions in many species, yet a theoretical framework for exploring these effects has been lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.
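
The pipeline described above (an agent-based simulation whose pairwise interactions are accumulated into a social network, which is then summarised by a network metric) can be sketched in miniature. The contest rule, nutritional "strengths", and the out-degree-based hierarchy score below are all invented for illustration; they stand in for the nutritionally explicit ABM and the richer network metrics of the paper.

```python
import itertools, random

def simulate_contests(strengths, n_rounds, rng):
    """Accumulate pairwise contest outcomes into a directed win matrix."""
    agents = list(strengths)
    wins = {a: {b: 0 for b in agents} for a in agents}
    for _ in range(n_rounds):
        for a, b in itertools.combinations(agents, 2):
            # hypothetical rule: the better-nourished agent wins more often
            p = strengths[a] / (strengths[a] + strengths[b])
            winner, loser = (a, b) if rng.random() < p else (b, a)
            wins[winner][loser] += 1
    return wins

def dominance_scores(wins):
    """Simple hierarchy metric: each agent's share of all directed wins."""
    total = sum(sum(row.values()) for row in wins.values())
    return {a: sum(row.values()) / total for a, row in wins.items()}

rng = random.Random(1)
wins = simulate_contests({"A": 3.0, "B": 2.0, "C": 1.0}, n_rounds=200, rng=rng)
scores = dominance_scores(wins)
print(max(scores, key=scores.get))   # the best-nourished agent tops the hierarchy
```

In the paper's framework, the same win matrix would feed standard social-network metrics (e.g., centrality measures) rather than this toy score, allowing simulated networks to be compared directly with observed dominance networks.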

  20. Integral Transport Analysis Results for Ions Flowing Through Neutral Gas

    Science.gov (United States)

    Emmert, Gilbert; Santarius, John

    2017-10-01

Results of a computational model for the flow of energetic ions and neutrals through a background neutral gas will be presented. The method models reactions as creating a new source of ions or neutrals whenever the energy or charge state of the resulting particle changes. For a given source boundary condition, the creation and annihilation of the various species is formulated as a 1-D Volterra integral equation that can be solved quickly by finite differences. The present work focuses on multiple-pass, 1-D ion flow through neutral gas and a nearly transparent, concentric anode and cathode pair in spherical, cylindrical, or linear geometry. This has been implemented as a computer code for atomic (3He, 3He+, 3He++) and molecular (D, D2, D-, D+, D2+, D3+) ion and neutral species, and applied to modeling inertial-electrostatic confinement (IEC) devices. The code yields detailed energy spectra of the various ions and energetic neutral species. Calculations for several University of Wisconsin IEC and ion implantation devices will be presented. Research supported by US Dept. of Homeland Security Grant 2015-DN-077-ARI095, Dept. of Energy Grant DE-FG02-04ER54745, and the Grainger Foundation.
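
The numerical kernel named above, a 1-D Volterra integral equation solved by marching on a finite-difference grid, can be illustrated generically. The sketch below solves a Volterra equation of the second kind, u(x) = g(x) + ∫₀ˣ K(x,t) u(t) dt, with the trapezoidal rule; the kernel and source are a textbook test case with known solution u(x) = eˣ, not the ion/neutral transport model itself.

```python
import math

def solve_volterra(g, K, x_max, n):
    """March a second-kind Volterra equation forward on a uniform grid."""
    h = x_max / n
    x = [i * h for i in range(n + 1)]
    u = [g(x[0])]
    for i in range(1, n + 1):
        # trapezoidal quadrature of the history integral; the unknown u[i]
        # enters with weight h/2 and is solved for algebraically
        s = 0.5 * h * K(x[i], x[0]) * u[0]
        s += h * sum(K(x[i], x[j]) * u[j] for j in range(1, i))
        u.append((g(x[i]) + s) / (1.0 - 0.5 * h * K(x[i], x[i])))
    return x, u

# Test problem: u(x) = 1 + \int_0^x u(t) dt, whose solution is u(x) = exp(x)
x, u = solve_volterra(lambda t: 1.0, lambda xi, t: 1.0, x_max=1.0, n=100)
print(abs(u[-1] - math.e) < 1e-3)   # True: matches exp(1) to quadrature accuracy
```

Because the integral only involves earlier grid points, each step is explicit once u[i] is isolated, which is why such equations "can be solved quickly" by a single forward sweep.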

  1. Factors controlling nanoparticle pharmacokinetics: an integrated analysis and perspective.

    Science.gov (United States)

    Moghimi, S M; Hunter, A C; Andresen, T L

    2012-01-01

    Intravenously injected nanoparticulate drug carriers provide a wide range of unique opportunities for site-specific targeting of therapeutic agents to many areas within the vasculature and beyond. Pharmacokinetics and biodistribution of these carriers are controlled by a complex array of interrelated core and interfacial physicochemical and biological factors. Pertinent to realizing therapeutic goals, definitive maps that establish the interdependency of nanoparticle size, shape, and surface characteristics in relation to interfacial forces, biodistribution, controlled drug release, excretion, and adverse effects must be outlined. These concepts are critically evaluated and an integrated perspective is provided on the basis of the recent application of nanoscience approaches to nanocarrier design and engineering. The future of this exciting field is bright; some regulatory-approved products are already on the market and many are in late-phase clinical trials. With concomitant advances in extensive computational knowledge of the genomics and epigenomics of interindividual variations in drug responses, the boundaries toward development of personalized nanomedicines can be pushed further.

  2. Integrating PAW, a graphical analysis interface to Sybase

    International Nuclear Information System (INIS)

    Fry, A.; Chow, I.

    1993-04-01

    The program PAW (Physics Analysis Workstation) enjoys tremendous popularity within the high energy physics community. It is implemented on a large number of platforms and is available to the high energy physics community free of charge from the CERN computing division. PAW combines extensive graphical display capability (HPLOT/HIGZ), with histogramming (HBOOK4), file and data handling (ZEBRA), vector arithmetic manipulation (SIGMA), user defined functions (COMIS), powerful function minimization (MINUIT), and a command interpreter (KUIP). To facilitate the possibility of using relational databases in physics analysis, we have added an SQL interface to PAW. This interface allows users to create PAW N-tuples from Sybase tables and vice versa. We discuss the implementations below

  3. A flammability and combustion model for integrated accident analysis

    International Nuclear Information System (INIS)

    Plys, M.G.; Astleford, R.D.; Epstein, M.

    1988-01-01

    A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs

  4. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    Science.gov (United States)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

Digital diagnostic pathology has become one of the most valuable and convenient technological advancements of recent years. It allows us to acquire, store, and analyze pathological information from images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for the histological analysis of monolayer lung cancer cells are introduced, and different monolayer cancer lines are studied. The outcome of this study indicates that the application of fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cancer cell lines may prove extremely useful in discriminating between healthy and lung cancer cells, as well as in differentiating among different lung cancer cell lines.

  5. COST BENEFIT ANALYSIS OF A DG INTEGRATED SYSTEM: CASE STUDY

    Directory of Open Access Journals (Sweden)

    Ch. V. S. S. SAILAJA

    2017-09-01

Full Text Available Distributed generation (DG) is capable of meeting the load of consumers partially or completely. Depending on the type of DG involved, it can be operated in interconnected mode or islanded mode. The numerous alternative DG technologies available, and the large initial investments they require, necessitate a detailed cost-benefit analysis before DG technologies are implemented. In this work an attempt has been made to study the costs involved in implementing DG technologies. A practical system having two kinds of distributed generation, i.e., a diesel generator and a solar photovoltaic system for backup purposes, is considered. A detailed cost analysis of the two DG technologies is carried out.
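
One standard form of the cost-benefit comparison described above is a net-present-value (NPV) calculation per DG option: capital cost against the discounted stream of annual benefits (displaced grid energy) minus annual fuel and O&M costs. All figures below are invented for illustration, not the case study's data.

```python
# NPV of a DG option over its lifetime; positive NPV means the option pays
# for itself at the chosen discount rate. All inputs are hypothetical.
def npv(capital, annual_cost, annual_benefit, years, discount_rate):
    value = -capital
    for t in range(1, years + 1):
        value += (annual_benefit - annual_cost) / (1.0 + discount_rate) ** t
    return value

diesel = npv(capital=50_000, annual_cost=20_000, annual_benefit=28_000,
             years=10, discount_rate=0.08)
solar = npv(capital=120_000, annual_cost=2_000, annual_benefit=25_000,
            years=10, discount_rate=0.08)
print(solar > diesel)   # here solar's low running cost outweighs its higher capital cost
```

The comparison's outcome is sensitive to the discount rate and fuel cost assumptions, which is why a detailed analysis enumerates these inputs per technology rather than comparing capital costs alone.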

  6. Chronic wasting disease risk analysis workshop: An integrative approach

    Science.gov (United States)

    Gillette, Shana; Dein, Joshua; Salman, Mo; Richards, Bryan; Duarte, Paulo

    2004-01-01

    Risk analysis tools have been successfully used to determine the potential hazard associated with disease introductions and have facilitated management decisions designed to limit the potential for disease introduction. Chronic Wasting Disease (CWD) poses significant challenges for resource managers due to an incomplete understanding of disease etiology and epidemiology and the complexity of management and political jurisdictions. Tools designed specifically to assess the risk of CWD introduction would be of great value to policy makers in areas where CWD has not been detected.

  7. Integrative Lifecourse and Genetic Analysis of Military Working Dogs

    Science.gov (United States)

    2015-12-01


  8. HVAC fault tree analysis for WIPP integrated risk assessment

    International Nuclear Information System (INIS)

    Kirby, P.; Iacovino, J.

    1990-01-01

    In order to evaluate the public health risk from operation of the Waste Isolation Pilot Plant (WIPP) due to potential radioactive releases, a probabilistic risk assessment of waste handling operations was conducted. One major aspect of this risk assessment involved fault tree analysis of the plant heating, ventilation, and air conditioning (HVAC) systems, which comprise the final barrier between waste handling operations and the environment. 1 refs., 1 tab

  9. INTEGRATION OF SYSTEM COMPONENTS AND UNCERTAINTY ANALYSIS - HANFORD EXAMPLES

    International Nuclear Information System (INIS)

    Wood, M.I.

    2009-01-01

• Deterministic 'One Off' analyses as basis for evaluating sensitivity and uncertainty relative to the reference case. • Spatial coverage identical to the reference case. • Two types of analysis assumptions: minimax parameter values around reference case conditions, and 'What If' cases that change the reference case condition and associated parameter values. • No conclusions about the likelihood of the estimated result, other than a qualitative expectation that the actual outcome should tend toward the reference case estimate.

  10. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes: for example, SETS and FTAP to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. Problems arise when these codes are run separately and their inputs and outputs are not linked, which can introduce errors when preparing input for each code. The MODULE code was developed to carry out the above calculations simultaneously, without passing inputs and outputs between codes. MODULE can also prepare input for SETS in the case of a fault tree too large for MODULE to handle itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples were selected and the results and computation times compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater systems (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run, with little increase in computing time over the other codes, and that it can be used on personal computers
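The three calculations such a code chains together can be sketched for a toy set of minimal cut sets. The event names, probabilities, and uniform uncertainty band below are invented, and the top-event estimate uses the rare-event approximation (sum of cut set probabilities) rather than full inclusion-exclusion:

```python
import random

# Sketch of single-run cut set quantification: point estimate,
# Fussell-Vesely importance, and a Monte Carlo uncertainty pass.
# Minimal cut sets and event probabilities are invented.

cut_sets = [("pump_a", "pump_b"), ("valve_c",), ("pump_a", "valve_d")]
prob = {"pump_a": 1e-2, "pump_b": 2e-2, "valve_c": 1e-4, "valve_d": 5e-3}

def cut_set_prob(cs, p):
    out = 1.0
    for e in cs:
        out *= p[e]
    return out

def top_estimate(p):
    # Rare-event approximation: sum of minimal cut set probabilities.
    return sum(cut_set_prob(cs, p) for cs in cut_sets)

top = top_estimate(prob)

# Fussell-Vesely importance: share of the top-event probability coming
# from cut sets that contain the event.
fv = {e: sum(cut_set_prob(cs, prob) for cs in cut_sets if e in cs) / top
      for e in prob}

# Uncertainty: perturb event probabilities within a +/-50% uniform band
# (an invented distribution) and propagate through the same estimate.
random.seed(0)
samples = sorted(
    top_estimate({e: p * random.uniform(0.5, 1.5) for e, p in prob.items()})
    for _ in range(2000)
)
median = samples[len(samples) // 2]
```

Running all three steps on one in-memory representation is exactly what removes the hand-carried input/output files the abstract describes.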

  11. Integrated torrefaction vs. external torrefaction – A thermodynamic analysis for the case of a thermochemical biorefinery

    International Nuclear Information System (INIS)

    Clausen, Lasse R.

    2014-01-01

Integrated and external torrefaction is analyzed and compared via thermodynamic modeling. In this paper, integrated torrefaction is defined as torrefaction integrated with entrained flow gasification. External torrefaction is defined as the decentralized production of torrefied wood pellets and centralized conversion of the pellets by entrained flow gasification. First, the syngas production of the two methods was compared. Second, the two methods were compared by considering complete biorefineries with either integrated torrefaction or external torrefaction. The first part of the analysis showed that the biomass to syngas efficiency can be increased from 63% to 86% (LHV-dry) when switching from external torrefaction to integrated torrefaction. The second part of the analysis showed that the total energy efficiency (biomass to methanol + net electricity) could be increased from 53% to 63% when switching from external torrefaction to integrated torrefaction. The costs of this increase in energy efficiency are as follows: 1) more difficult transport, storage and handling of the biomass feedstock (wood chips vs. torrefied wood pellets); 2) reduced plant size; 3) no net electricity production; and 4) a more complex plant design. - Highlights: • Integrated torrefaction is compared with external torrefaction. • Biomass to syngas energy efficiencies of 63–86% are achieved. • Two thermochemical biorefineries are designed and analysed by thermodynamic modeling. • Biomass to fuel + electricity energy efficiencies of 53–63% are achieved. • The pros and cons of integrated torrefaction are described
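The headline figures imply a simple chain-efficiency bookkeeping. Only the 63%/86% (biomass to syngas) and 53%/63% (biomass to fuel plus net electricity) numbers come from the abstract; the split into stages below is purely illustrative:

```python
# Chain-efficiency bookkeeping behind such comparisons (LHV basis).
# Only the 63%/86% and 53%/63% figures come from the abstract; the
# decomposition into stages is illustrative.

biomass_in = 100.0  # MJ of dry biomass input (arbitrary basis)

def chain(*stage_efficiencies):
    """Overall efficiency of a serial conversion chain."""
    eta = 1.0
    for e in stage_efficiencies:
        eta *= e
    return eta

eta_syngas = {"external": 0.63, "integrated": 0.86}  # biomass -> syngas
eta_total = {"external": 0.53, "integrated": 0.63}   # biomass -> fuel + power

for route in ("external", "integrated"):
    # Syngas -> product efficiency implied by the two headline numbers.
    downstream = eta_total[route] / eta_syngas[route]
    product_out = biomass_in * chain(eta_syngas[route], downstream)
```

Laying the numbers out this way makes explicit that the reported gain sits in the biomass-to-syngas step.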

  12. Integrated syphilis/HIV screening in China: a qualitative analysis

    Directory of Open Access Journals (Sweden)

    Yin Yue-Pin

    2010-03-01

Abstract. Background: The last decade has seen enormous advances in HIV treatment and care, but implementing scaled-up HIV testing, prevention, and treatment in low-income areas still presents a formidable public health challenge. South China faces expanding syphilis and sexually transmitted HIV epidemics, but the health systems characteristics important for scaling up syphilis and HIV testing have not been defined. Methods: A purposive sample was selected in a South China city to ensure inclusion of public, private, and public-private hybrid STI clinics. Eight key informant interviews were conducted with STI clinic managers, followed by eight focus group discussions with physicians. Data collection relied on a semi-structured format that included questions in each of the following domains: 1) clinical facilities; 2) laboratory capacity, with a focus on syphilis/HIV diagnosis; 3) clinic personnel; 4) physical space, with a focus on locations to disclose confidential results; 5) financial support. Results: Public STI clinics had free syphilis testing/treatment and laboratory facilities to perform essential syphilis and HIV tests. However, despite serving a large number of STI patients, private STI clinics lacked nontreponemal syphilis testing and HIV testing, and had fewer connections to the public health infrastructure. Formally trained assistant physicians were 2.5 times as common as physicians at STI clinics. Only one of the 8 sites had onsite voluntary counseling and testing (VCT) services available. Conclusion: These STI case studies reveal the potential for expanding integrated syphilis/HIV services at public STI clinics in China. More health services research is needed to guide scale-up of syphilis/HIV testing in China.

  13. Integrative mapping analysis of chicken microchromosome 16 organization

    Directory of Open Access Journals (Sweden)

    Bed'hom Bertrand

    2010-11-01

Abstract. Background: The chicken karyotype is composed of 39 chromosome pairs, of which 9 still remain totally absent from the current genome sequence assembly, despite international efforts towards complete coverage. Some others are only very partially sequenced, amongst which microchromosome 16 (GGA16) is particularly under-represented, with only 433 kb assembled of a full estimated size of 9 to 11 Mb. Besides the obvious need of full genome coverage with genetic markers for QTL (Quantitative Trait Loci) mapping and major gene identification studies, there is a major interest in the detailed study of this chromosome because it carries the two genetically independent MHC complexes B and Y. In addition, GGA16 carries the ribosomal RNA (rRNA) gene cluster, also known as the NOR (nucleolus organizer region). The purpose of the present study is to construct and present high resolution integrated maps of GGA16 to refine its organization and improve its coverage with genetic markers. Results: We developed 79 STS (Sequence Tagged Site) markers to build a physical RH (radiation hybrid) map and 34 genetic markers to extend the genetic map of GGA16. We screened a BAC (Bacterial Artificial Chromosome) library with markers for the MHC-B, MHC-Y and rRNA complexes. Selected clones were used to perform high resolution FISH (Fluorescent In Situ Hybridization) mapping on giant meiotic lampbrush chromosomes, allowing meiotic mapping in addition to confirmation of the order of the three clusters along the chromosome. A region with high recombination rates and containing PO41 repeated elements separates the two MHC complexes. Conclusions: The three complementary mapping strategies used greatly refine our knowledge of chicken microchromosome 16 organisation. The characterisation of the recombination hotspots separating the two MHC complexes demonstrates the presence of PO41 repetitive sequences both in tandem and inverted orientation. However, this region still needs to …

  14. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree capabilities used for event tree rules, recovery rules, and end state partitioning

  15. Integrated computer codes for nuclear power plant severe accident analysis

    International Nuclear Information System (INIS)

    Jordanov, I.; Khristov, Y.

    1995-01-01

    This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs

  16. Integrated computer codes for nuclear power plant severe accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jordanov, I; Khristov, Y [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika

    1996-12-31

    This overview contains a description of the Modular Accident Analysis Program (MAAP), ICARE computer code and Source Term Code Package (STCP). STCP is used to model TMLB sample problems for Zion Unit 1 and WWER-440/V-213 reactors. Comparison is made of STCP implementation on VAX and IBM systems. In order to improve accuracy, a double precision version of MARCH-3 component of STCP is created and the overall thermal hydraulics is modelled. Results of modelling the containment pressure, debris temperature, hydrogen mass are presented. 5 refs., 10 figs., 2 tabs.

  17. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems, or to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  18. Group sparse canonical correlation analysis for genomic data integration.

    Science.gov (United States)

    Lin, Dongdong; Zhang, Jigang; Li, Jingyao; Calhoun, Vince D; Deng, Hong-Wen; Wang, Yu-Ping

    2013-08-12

The emergence of high-throughput genomic datasets from different sources and platforms (e.g., gene expression, single nucleotide polymorphisms (SNP), and copy number variation (CNV)) has greatly enhanced our understanding of the interplay of these genomic factors as well as their influence on complex diseases. It is challenging to explore the relationships between these different types of genomic data sets. In this paper, we focus on a multivariate statistical method, canonical correlation analysis (CCA), for this problem. The conventional CCA method does not work effectively if the number of data samples is significantly less than the number of biomarkers, which is a typical case for genomic data (e.g., SNPs). Sparse CCA (sCCA) methods were introduced to overcome this difficulty, mostly using penalizations with the l-1 norm (CCA-l1) or the combination of the l-1 and l-2 norms (CCA-elastic net). However, they overlook the structural or group effects within genomic data, which often exist and are important (e.g., SNPs spanning a gene interact and work together as a group). We propose a new group sparse CCA method (CCA-sparse group), along with an effective numerical algorithm, to study the mutual relationship between two different types of genomic data (i.e., SNP and gene expression). We then extend the model to a more general formulation that can include the existing sCCA models. We apply the model to feature/variable selection from two data sets and compare our group sparse CCA method with existing sCCA methods on both simulation and two real datasets (human gliomas data and NCI60 data). We use a graphical representation of the samples with a pair of canonical variates to demonstrate the discriminating characteristic of the selected features. Pathway analysis is further performed for the biological interpretation of those features. The CCA-sparse group method incorporates group effects of features into the correlation analysis while performing individual feature selection.
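The l-1 penalized baseline the paper compares against (CCA-l1) can be sketched as alternating soft-thresholded power iterations on the cross-covariance matrix, in the style of penalized matrix decomposition; the paper's CCA-sparse group method would replace the element-wise soft-threshold with a group-wise one. All data and penalty levels below are synthetic:

```python
import numpy as np

# Sketch of l1-penalized sparse CCA (the CCA-l1 baseline): alternating
# soft-thresholded power iterations on the cross-covariance matrix.
# Data and penalty levels are synthetic, chosen for illustration.

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_cca(X, Y, lam_u, lam_v, n_iter=50):
    """Return one pair of sparse canonical weight vectors (u, v)."""
    C = X.T @ Y  # cross-covariance (columns assumed standardized)
    v = np.ones(Y.shape[1]) / np.sqrt(Y.shape[1])
    u = np.zeros(X.shape[1])
    for _ in range(n_iter):
        u = soft_threshold(C @ v, lam_u)
        if np.linalg.norm(u) > 0:
            u = u / np.linalg.norm(u)
        v = soft_threshold(C.T @ u, lam_v)
        if np.linalg.norm(v) > 0:
            v = v / np.linalg.norm(v)
    return u, v

rng = np.random.default_rng(0)
n = 200
latent = rng.normal(size=n)
X = rng.normal(size=(n, 10))
Y = rng.normal(size=(n, 8))
X[:, 0] += 2 * latent  # only feature 0 of each view shares the signal
Y[:, 0] += 2 * latent
X = (X - X.mean(0)) / X.std(0)
Y = (Y - Y.mean(0)) / Y.std(0)
u, v = sparse_cca(X, Y, lam_u=10.0, lam_v=10.0)
```

With the penalties active, the recovered weight vectors concentrate on the single linked feature in each view, which is the selection behavior the sCCA family is designed for.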

  19. Integrated Data Collection Analysis (IDCA) Program — Quarterly Review Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (IHD-NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (IHD-NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (IHD-NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL/RXQF), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Inc., Tyndall AFB, FL (United States); Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-12-05

On November 9 and 10, 2011 the IDCA held its annual quarterly meeting. The meeting started the afternoon of the first day with a tour of the NSWC IHD explosives safety testing and analysis facilities. The meeting on the second day addressed the formal sponsor review and further technical issues for the IDCA. Examination of the IHD equipment during the tour led to a long discussion on liquid test methods. The discussion resulted in revision of the liquid test methods for the impact test and selection of a new liquid test standard. In addition, modifications to the friction, spark, and thermal test methods were discussed.

  20. Integrated communication, navigation, and identification avionics: Impact analysis. Executive summary

    Science.gov (United States)

    Veatch, M. H.; McManus, J. C.

    1985-10-01

This paper summarizes the approach and findings of research into reliability, supportability, and survivability prediction techniques for fault-tolerant avionics systems. Since no technique existed to analyze the fault tolerance of reconfigurable systems, a new method was developed and implemented in the Mission Reliability Model (MIREM). The supportability analysis was completed using the Simulation of Operational Availability/Readiness (SOAR) model. Both the Computation of Vulnerable Area and Repair Time (COVART) model and FASTGEN, a survivability model, proved valuable for the survivability research. Sample results are presented, and several recommendations are given for each of the three areas investigated under this study: reliability, supportability, and survivability.

  1. Topology design and performance analysis of an integrated communication network

    Science.gov (United States)

    Li, V. O. K.; Lam, Y. F.; Hou, T. C.; Yuen, J. H.

    1985-01-01

A research study on the topology design and performance analysis for the Space Station Information System (SSIS) network is conducted. It begins with a survey of existing research efforts in network topology design. A new approach for topology design is then presented. It uses an efficient algorithm to generate candidate network designs (consisting of subsets of the set of all network components) in increasing order of their total costs, and checks each design to see if it forms an acceptable network. This technique gives the true cost-optimal network, and is particularly useful when the network has many constraints and not too many components. The algorithm for generating subsets is described in detail, and various aspects of the overall design procedure are discussed. Two more efficient versions of this algorithm (applicable in specific situations) are also given. Next, two important aspects of network performance analysis are discussed: network reliability and message delays. A new model is introduced to study the reliability of a network with dependent failures. For message delays, a collection of formulas from existing research results is given to compute or estimate the delays of messages in a communication network without making the independence assumption. The design algorithm, coded in PASCAL, is included as an appendix.
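The "candidate designs in increasing total cost" idea can be sketched with a standard heap-based enumeration: with component costs sorted ascending, a heap of (cost, subset) states yields every subset exactly once in nondecreasing cost order, so the first subset that passes the feasibility check is cost-optimal. The component costs and the feasibility rule below are invented stand-ins for the paper's network-acceptability test:

```python
import heapq

# Enumerate all subsets of components in nondecreasing total cost.
# From a subset whose last (sorted) index is j, two successors are
# generated: append index j+1, or replace j with j+1. Each subset has a
# unique predecessor under this rule, so none is produced twice.

def subsets_by_cost(costs):
    """Yield (total_cost, tuple_of_original_indices), cheapest first."""
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    heap = [(0.0, ())]  # (cost, subset of positions into `order`)
    while heap:
        total, subset = heapq.heappop(heap)
        yield total, tuple(order[i] for i in subset)
        last = subset[-1] if subset else -1
        nxt = last + 1
        if nxt < len(costs):
            # extend: add the next-cheapest unused component
            heapq.heappush(heap, (total + costs[order[nxt]], subset + (nxt,)))
            if subset:
                # substitute: swap the last component for the next one
                heapq.heappush(
                    heap,
                    (total - costs[order[last]] + costs[order[nxt]],
                     subset[:-1] + (nxt,)),
                )

costs = [4.0, 1.0, 3.0, 2.0]
required = {0, 3}  # invented rule: acceptable iff links 0 and 3 present

best = next((c, s) for c, s in subsets_by_cost(costs) if required <= set(s))
```

Because the stream is ordered by cost, the search stops at the first acceptable design, which is what makes the approach attractive when the acceptability check is expensive.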

  2. Integrating multicriteria evaluation and stakeholders analysis for assessing hydropower projects

    International Nuclear Information System (INIS)

    Rosso, M.; Bottero, M.; Pomarico, S.; La Ferlita, S.; Comino, E.

    2014-01-01

The use of hydroelectric potential and the protection of the river ecosystem are two contrasting aspects that arise in the management of the same resource, generating conflicts between different stakeholders. The purpose of the paper is to develop a multi-level decision-making tool, able to support energy planning, with specific reference to the construction of hydropower plants in mountain areas. Starting from a real-world problem concerning the basin of the Sesia Valley (Italy), an evaluation framework based on the combined use of Multicriteria Evaluation and Stakeholders Analysis is proposed in the study. The results of the work show that the methodology is able to support participatory decisions through a traceable and transparent multi-stakeholder assessment process, to highlight the important elements of the decision problem, and to support the definition of future design guidelines. - Highlights: • The paper concerns a multi-level decision-making tool able to support energy planning. • The evaluation framework is based on the use of AHP and Stakeholders Analysis. • Hydropower projects in the Sesia Valley (Italy) are evaluated and ranked in the study. • Environmental, economic, technical and sociopolitical criteria have been considered. • 42 stakeholder groups have been included in the evaluation

  3. Integrating economic analysis and the science of climate instability

    International Nuclear Information System (INIS)

    Hall, Darwin C.; Behl, Richard J.

    2006-01-01

    Scientific understanding of climate change and climate instability has undergone a revolution in the past decade with the discovery of numerous past climate transitions so rapid, and so unlike the expectation of smooth climate changes, that they would have previously been unbelievable to the scientific community. Models commonly used by economists to assess the wisdom of adapting to human-induced climate change, rather than averting it, lack the ability to incorporate this new scientific knowledge. Here, we identify and explain the nature of recent scientific advances, and describe the key ways in which failure to reflect new knowledge in economic analysis skews the results of that analysis. This includes the understanding that economic optimization models reliant on convexity are inherently unable to determine an 'optimal' policy solution. It is incumbent on economists to understand and to incorporate the new science in their models, and on climatologists and other scientists to understand the basis of economic models so that they can assist in this essential effort. (author)

  4. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    Science.gov (United States)

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  5. Case-study application of venture analysis: the integrated energy utility. Volume 3. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Fein, E; Gordon, T J; King, R; Kropp, F G; Shuchman, H L; Stover, J; Hausz, W; Meyer, C

    1978-11-01

    The appendices for a case-study application of venture analysis for an integrated energy utility for commercialization are presented. The following are included and discussed: utility interviews; net social benefits - quantitative calculations; the financial analysis model; market penetration decision model; international district heating systems; political and regulatory environment; institutional impacts.

  6. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
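A common way to run such a global sensitivity analysis is the variance-based (Sobol) approach with the Saltelli pick-and-freeze estimator. The sketch below uses an invented toy response rather than DRAINMOD-FOREST itself, but the estimator is the standard one:

```python
import numpy as np

# Variance-based (Sobol) first-order sensitivity indices via the Saltelli
# pick-and-freeze estimator, on a toy stand-in model. The model, input
# ranges, and coefficients are invented for illustration.

def model(x):
    # Illustrative response: strongly driven by x1, weakly by x3.
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2]

rng = np.random.default_rng(42)
n, d = 50_000, 3
A = rng.uniform(size=(n, d))   # two independent sample matrices
B = rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]        # resample only input i from B
    # First-order index: fraction of output variance explained by input i.
    S.append(float(np.mean(fB * (model(ABi) - fA)) / var))
```

For this additive model the indices should recover the analytic shares of variance (roughly 0.94, 0.06, and near zero), which is the kind of ranking used to identify key parameters.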

  7. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  8. Analysis of unprotected overcooling events in the Integral Fast Reactor

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1989-01-01

Simple analytic models are developed for predicting the response of a metal-fueled, liquid-metal-cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control mass approach. The second category contains those events which will eventually equilibrate, such as a loss of feedwater heaters. A steady-flow control volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds for the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  9. California-Wyoming Grid Integration Study: Phase 1 -- Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Corbus, D.; Hurlbut, D.; Schwabe, P.; Ibanez, E.; Milligan, M.; Brinkman, G.; Paduru, A.; Diakov, V.; Hand, M.

    2014-03-01

    This study presents a comparative analysis of two different renewable energy options for the California energy market between 2017 and 2020: 12,000 GWh per year from new California in-state renewable energy resources; and 12,000 GWh per year from Wyoming wind delivered to the California marketplace. Either option would add to the California resources already existing or under construction, theoretically providing the last measure of power needed to meet (or to slightly exceed) the state's 33% renewable portfolio standard. Both options have discretely measurable differences in transmission costs, capital costs (due to the enabling of different generation portfolios), capacity values, and production costs. The purpose of this study is to compare and contrast the two different options to provide additional insight for future planning.

  10. Eco taxes and double dividend. A co-integration analysis

    International Nuclear Information System (INIS)

    Ghignoni, E.

    1999-01-01

The paper takes up a critical discussion of the eco-tax double dividend theory. It builds a theoretical and empirical analysis of the short- and long-run effects of changes in the real prices of energy inputs on the use of polluting sources and labour, as well as of the short- and long-run effects of changes in the real labour cost on the use of labour and energy. For that purpose, Johansen's procedure is used to obtain a structural multivariate model of the dynamic relationships among the variables enclosed in the data set of a production function with labour and energy. This model is submitted to a dynamic simulation, and the results suggest that the introduction of an eco tax coupled with a decrease in the real labour cost would have uncertain and not lasting effects on the reduction of pollution coming from the use of energy sources, and very limited effects on employment

  11. Geospatial analysis based on GIS integrated with LADAR.

    Science.gov (United States)

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
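The line-of-sight layer in such a path-planning module reduces, in its simplest form, to sampling a straight ray between two observer points above a height grid and checking it clears the terrain everywhere. A minimal sketch; the grid, eye height, and sampling density are invented:

```python
# Line-of-sight over a height grid, of the kind used in path-planning
# layers on LADAR-derived elevation data. Grid values, eye height, and
# step count are invented for illustration.

def line_of_sight(height, a, b, eye=2.0, steps=64):
    """True if a straight ray from above cell `a` to above cell `b`
    clears the terrain at every sampled point (nearest-cell sampling,
    no bilinear interpolation, for brevity)."""
    (r0, c0), (r1, c1) = a, b
    z0 = height[r0][c0] + eye
    z1 = height[r1][c1] + eye
    for k in range(1, steps):
        t = k / steps
        r = round(r0 + t * (r1 - r0))
        c = round(c0 + t * (c1 - c0))
        if z0 + t * (z1 - z0) <= height[r][c]:
            return False  # terrain blocks the ray
    return True

terrain = [
    [0, 0, 0, 0, 0],
    [0, 0, 9, 0, 0],  # a wall-like obstacle in the middle row
    [0, 0, 0, 0, 0],
]
```

A production version would march the exact cell sequence (e.g. a Bresenham or DDA traversal) instead of fixed-step sampling, but the visibility test itself is the same comparison.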

  12. SVIP-N 1.0: An integrated visualization platform for neutronics analysis

    International Nuclear Information System (INIS)

    Luo Yuetong; Long Pengcheng; Wu Guoyong; Zeng Qin; Hu Liqin; Zou Jun

    2010-01-01

Post-processing is an important part of neutronics analysis, and SVIP-N 1.0 (scientific visualization integrated platform for neutronics analysis) is designed to ease the post-processing of neutronics analysis through visualization technologies. The main capabilities of SVIP-N 1.0 include: (1) the ability to manage neutronics analysis results; (2) the ability to preprocess neutronics analysis results; (3) the ability to visualize neutronics analysis result data in different ways. The paper describes the system architecture and main features of SVIP-N, some advanced visualization techniques used in SVIP-N 1.0, and some preliminary applications, such as ITER.

  13. EUROPEAN INTEGRATION: A MULTILEVEL PROCESS THAT REQUIRES A MULTILEVEL STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia HRITCU

    2015-11-01

Abstract. Involving market regulation and a system of multi-level governance with several supranational, national and subnational levels of decision making, European integration is a multilevel phenomenon. The individual characteristics of citizens, as well as the environment where the integration process takes place, are important. To understand European integration and its consequences it is important to develop and test multi-level theories that consider individual-level characteristics, as well as the overall context where individuals act and express their characteristics. A central argument of this paper is that support for European integration is influenced by factors operating at different levels. We review and present theories and related research on the use of multilevel analysis in the European area. This paper draws insights on various aspects and consequences of European integration to take stock of what we know about how and why to use multilevel modeling.
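A first diagnostic in any such multilevel analysis is the intraclass correlation (ICC): the share of variance in individual-level support that sits between countries rather than between individuals, which tells you whether a multilevel model is needed at all. The sketch below estimates it from simulated data with invented variance components via the one-way ANOVA estimator:

```python
import random

# Intraclass correlation (ICC) from a one-way random-effects layout:
# individuals nested in countries. All data are simulated; the true
# between-country and within-country variances (1.0 and 4.0) are invented.

random.seed(1)
n_countries, n_people = 30, 200
data = []  # (country index, support score)
for j in range(n_countries):
    country_effect = random.gauss(0, 1.0)   # context-level variation
    for _ in range(n_people):
        data.append((j, country_effect + random.gauss(0, 2.0)))

grand = sum(y for _, y in data) / len(data)
groups = {}
for j, y in data:
    groups.setdefault(j, []).append(y)

# One-way ANOVA mean squares, then the usual variance-component estimates.
ms_between = sum(
    len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups.values()
) / (n_countries - 1)
ms_within = sum(
    (y - sum(g) / len(g)) ** 2 for g in groups.values() for y in g
) / (len(data) - n_countries)
var_between = (ms_between - ms_within) / n_people
icc = var_between / (var_between + ms_within)
```

Here the true ICC is 1/(1+4) = 0.2; a non-trivial estimate like this is what justifies moving from pooled regression to a random-intercept model.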

  14. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    DEFF Research Database (Denmark)

    Lythcke-Jørgensen, Christoffer Ernst; Haglind, Fredrik; Clausen, Lasse Røngaard

    2014-01-01

Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy intensive nature. The objective of this study is to investigate potential irreversibilities from such integration, and what impact it has on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible … district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the expected operation pattern of such polygeneration system is taken …

  15. Integrated torrefaction vs. external torrefaction - A thermodynamic analysis for the case of a thermochemical biorefinery

    DEFF Research Database (Denmark)

    Clausen, Lasse Røngaard

    2014-01-01

Integrated and external torrefaction is analyzed and compared via thermodynamic modeling. In this paper, integrated torrefaction is defined as torrefaction integrated with entrained flow gasification. External torrefaction is defined as the decentralized production of torrefied wood pellets and centralized conversion of the pellets by entrained flow gasification. First, the syngas production of the two methods was compared. Second, the two methods were compared by considering complete biorefineries with either integrated torrefaction or external torrefaction. The first part of the analysis showed that the biomass to syngas efficiency can be increased from 63% to 86% (LHV-dry) when switching from external torrefaction to integrated torrefaction. The costs of this increase in energy efficiency are as follows: 1) more difficult transport, storage and handling of the biomass feedstock (wood chips vs. torrefied wood pellets); 2) reduced plant size; 3) no net electricity production; and 4) a more complex plant design.

  16. Climate-dependent evolution of Antarctic ectotherms: An integrative analysis

    Science.gov (United States)

    Pörtner, Hans O.

    2006-04-01

    The paper explores the climate-dependent evolution of marine Antarctic fauna and tries to identify key mechanisms involved as well as the driving forces that have caused the physiological and life history characteristics observed today. In an integrative approach it uses the recent concept of oxygen- and capacity-limited thermal tolerance to identify potential links between molecular, cellular, whole-organism, and ecological characteristics of marine animal life in the Antarctic. As a generalized pattern, minimization of baseline energy costs, for the sake of maximized growth in the cold, appears as one over-arching principle shaping the evolution and functioning of Antarctic marine ectotherms. This conclusion is supported by recent comparisons with (sub-)Arctic ectotherms, where elevated levels of energy turnover result at unstable, including cold, temperatures and are related to wide windows of thermal tolerance and associated metabolic features. At the biochemical level, metabolic regulation at low temperatures is generally supported by the cold compensation of enzyme kinetic parameters like substrate affinities and turnover numbers, through minute structural modifications of the enzyme molecule. These involve a shift in protein folding, sometimes supported by the replacement of individual amino acids. The hypothesis is developed that efficient metabolic regulation at low rates in Antarctic marine stenotherms occurs through high mitochondrial densities at low capacities and possibly enhanced levels of Arrhenius activation energies or activation enthalpies. This contrasts with the more costly patterns of metabolic regulation at elevated rates in cold-adapted eurytherms. Energy savings in Antarctic ectotherms, largely exemplified in fish, typically involve low-cost, diffusive oxygen distribution due to a high density of lipid membranes, loss of haemoglobin, myoglobin and the heat shock response, reduced anaerobic capacity, and large myocytes with low ion exchange activities …

  17. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    International Nuclear Information System (INIS)

    Lythcke-Jørgensen, Christoffer; Haglind, Fredrik; Clausen, Lasse R.

    2014-01-01

    Highlights: • We model a system where lignocellulosic ethanol production is integrated with a combined heat and power (CHP) plant. • We conduct an exergy analysis for the ethanol production in six different system operation points. • Integrated operation, district heating (DH) production and low CHP loads all increase the exergy efficiency. • Separate operation has the largest negative impact on the exergy efficiency. • Operation is found to have a significant impact on the exergy efficiency of the ethanol production. - Abstract: Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy intensive nature. The objective of this study is to investigate potential irreversibilities from such integration, and what impact it has on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible, and a gas boiler is used as back-up when integration is not possible. The system was evaluated in six operation points that vary three operation parameters: load in the CHP unit, integrated versus separate operation, and inclusion of district heating production in the ethanol facility. The calculated standard exergy efficiency of the ethanol facility varied from 0.564 to 0.855; the highest was obtained for integrated operation at reduced CHP load and full district heating production in the ethanol facility, and the lowest for separate operation with zero district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the expected operation pattern of such polygeneration systems is taken …
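    The standard exergy efficiency reported in this record is, in essence, a ratio of product exergy to input exergy. A minimal sketch of that calculation follows; the function name and all stream values are invented for illustration and are not taken from the paper.

```python
# Hypothetical illustration: standard exergy efficiency of an ethanol facility,
# computed as total exergy of products divided by total exergy of inputs.

def standard_exergy_efficiency(product_exergies_mw, input_exergies_mw):
    """Return sum(products) / sum(inputs); values are exergy flows in MW."""
    products = sum(product_exergies_mw)
    inputs = sum(input_exergies_mw)
    if inputs <= 0:
        raise ValueError("total input exergy must be positive")
    return products / inputs

# Invented streams: ethanol, solid by-product, district heating
products = [120.0, 45.0, 6.0]
# Invented inputs: biomass feedstock, extraction steam, electricity
inputs = [180.0, 15.0, 5.0]

eta = standard_exergy_efficiency(products, inputs)
print(round(eta, 3))  # 0.855 for these made-up numbers
```

    District heating shows up as a product stream here, which is one simple way to see why its inclusion raises the efficiency in the study's integrated operation points.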

  18. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative …

  19. Integral Hellmann--Feynman analysis of nonisoelectronic processes and the determination of local ionization potentials

    International Nuclear Information System (INIS)

    Simons, G.

    1975-01-01

    The integral Hellmann--Feynman theorem is extended to apply to nonisoelectronic processes. A local ionization potential formula is proposed, and test calculations on three different approximate helium wavefunctions are reported which suggest that it may be numerically superior to the standard difference of expectation values. Arguments for the physical utility of the new concept are presented, and an integral Hellmann--Feynman analysis of transition energies is begun.

  20. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  1. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module that utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists mostly of LP generation and multicycle analysis. (author)
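    GALLOP's hybrid genetic algorithm is not described in this record, but the general shape of a permutation-coded GA for loading-pattern search can be sketched as follows. Everything below, including the objective, the operators, and the parameters, is an invented toy and not the GALLOP code.

```python
import random

# Toy sketch of a permutation-coded genetic algorithm for loading-pattern
# search. The "core" is a 1-D row of positions; the objective is a stand-in
# for power peaking that rewards placing high-burnup (low-reactivity)
# assemblies near the core centre. Not the actual GALLOP algorithm.

def peaking_proxy(pattern):
    centre = (len(pattern) - 1) / 2.0
    # fresh assemblies (low burnup index) placed centrally score worse
    return sum((1.0 / (1 + burnup)) / (1 + abs(i - centre))
               for i, burnup in enumerate(pattern))

def crossover(p1, p2):
    # order crossover (OX): keep a slice of p1, fill the rest in p2's order
    a, b = sorted(random.sample(range(len(p1)), 2))
    hole = set(p1[a:b])
    filler = [g for g in p2 if g not in hole]
    return filler[:a] + p1[a:b] + filler[a:]

def search(n_assemblies=12, pop_size=20, generations=40, seed=1):
    random.seed(seed)
    pop = [random.sample(range(n_assemblies), n_assemblies)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=peaking_proxy)           # minimise the peaking proxy
        elite = pop[:pop_size // 2]           # elitism: best half survives
        children = [crossover(*random.sample(elite, 2))
                    for _ in range(pop_size - len(elite))]
        for child in children:                # occasional swap mutation
            if random.random() < 0.3:
                i, j = random.sample(range(n_assemblies), 2)
                child[i], child[j] = child[j], child[i]
        pop = elite + children
    return min(pop, key=peaking_proxy)

best = search()
print(peaking_proxy(best))
```

    Because the best half of each generation is carried over unchanged, the best pattern found never worsens across generations, which is the property that makes such a search usable for scoping studies.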

  2. Integrated analysis of mismatch repair system in malignant astrocytomas.

    Directory of Open Access Journals (Sweden)

    Irene Rodríguez-Hernández

    Full Text Available Malignant astrocytomas are the most aggressive primary brain tumors with a poor prognosis despite optimal treatment. Dysfunction of mismatch repair (MMR system accelerates the accumulation of mutations throughout the genome causing uncontrolled cell growth. The aim of this study was to characterize the MMR system defects that could be involved in malignant astrocytoma pathogenesis. We analyzed protein expression and promoter methylation of MLH1, MSH2 and MSH6 as well as microsatellite instability (MSI and MMR gene mutations in a set of 96 low- and high-grade astrocytomas. Forty-one astrocytomas failed to express at least one MMR protein. Loss of MSH2 expression was more frequent in low-grade astrocytomas. Loss of MLH1 expression was associated with MLH1 promoter hypermethylation and MLH1-93G>A promoter polymorphism. However, MSI was not related with MMR protein expression and only 5% of tumors were MSI-High. Furthermore, the incidence of tumors carrying germline mutations in MMR genes was low and only one glioblastoma was associated with Lynch syndrome. Interestingly, survival analysis identified that tumors lacking MSH6 expression presented longer overall survival in high-grade astrocytoma patients treated only with radiotherapy while MSH6 expression did not modify the prognosis of those patients treated with both radiotherapy and chemotherapy. Our findings suggest that MMR system alterations are a frequent event in malignant astrocytomas and might help to define a subgroup of patients with different outcome.

  3. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  4. Nordic Walking Performance Analysis with an Integrated Monitoring System

    Directory of Open Access Journals (Sweden)

    Francesco Mocera

    2018-05-01

    Full Text Available There is a growing interest in Nordic walking from both the fitness and medical points of view due to its possible therapeutic applications. The proper execution of the technique is an essential requirement to maximize the benefits of this practice. This is the reason why a monitoring system for outdoor Nordic walking activity was developed. Using data obtained from synchronized sensors, it is possible to have a complete overview of the user's movements. The system described in this paper is able to measure the pole angle during the pushing phase, the arm cycle frequency and synchronization, and the pushing force applied to the ground. Furthermore, data from a GPS module give an image of the environment where the activity session takes place, in terms of the distance, slope, and ground typology. A heart rate sensor is used to monitor the effort of the user through his/her Beats Per Minute (BPM). In this work, the developed monitoring system is presented, explaining how to use the gathered data to obtain the main feedback parameters for Nordic walking performance analysis. The comparison between left and right arm measurements allowed the system to be validated as a tool for technique evaluation. Finally, a procedure to estimate the peak pushing force from acceleration measurements is proposed.
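    The paper's actual estimation procedure is not reproduced in this record. The crudest possible sketch of the idea treats the pole plus forearm as one lumped mass so that the peak force is that mass times the peak acceleration (Newton's second law); all names and numbers below are invented.

```python
# Hypothetical sketch: peak pole force from accelerometer samples, treating
# pole + forearm as a single lumped mass so that F ~ m * a_peak. This is NOT
# the authors' estimation procedure, only an illustration of the principle.

def peak_push_force(accel_samples_ms2, lumped_mass_kg):
    """Peak pushing force in newtons from acceleration samples in m/s^2."""
    return lumped_mass_kg * max(accel_samples_ms2)

# Invented data: accelerations along the pole axis during one pushing phase
samples = [1.2, 4.8, 7.5, 6.1, 2.0]
print(peak_push_force(samples, 2.5))  # 2.5 kg * 7.5 m/s^2 = 18.75 N
```

    A real procedure would additionally need sensor-frame alignment and gravity compensation, which is presumably part of what the paper proposes.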

  5. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    Science.gov (United States)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  6. Integration of electrochemistry in micro-total analysis systems for biochemical assays: recent developments.

    Science.gov (United States)

    Xu, Xiaoli; Zhang, Song; Chen, Hui; Kong, Jilie

    2009-11-15

    Micro-total analysis systems (microTAS) integrate different analytical operations like sample preparation, separation and detection into a single microfabricated device. With the outstanding advantages of low cost, satisfactory analytical efficiency and flexibility in design, highly integrated and miniaturized devices from the concept of microTAS have gained widespread applications, especially in biochemical assays. Electrochemistry is shown to be quite compatible with microanalytical systems for biochemical assays, because of its attractive merits such as simplicity, rapidity, high sensitivity, reduced power consumption, and sample/reagent economy. This review presents recent developments in the integration of electrochemistry in microdevices for biochemical assays. Ingenious microelectrode design and fabrication methods, and versatility of electrochemical techniques are involved. Practical applications of such integrated microsystem in biochemical assays are focused on in situ analysis, point-of-care testing and portable devices. Electrochemical techniques are apparently suited to microsystems, since easy microfabrication of electrochemical elements and a high degree of integration with multi-analytical functions can be achieved at low cost. Such integrated microsystems will play an increasingly important role for analysis of small volume biochemical samples. Work is in progress toward new microdevice design and applications.

  7. Integrative analysis of RUNX1 downstream pathways and target genes

    Directory of Open Access Journals (Sweden)

    Liu Marjorie

    2008-07-01

    Full Text Available Background: The RUNX1 transcription factor gene is frequently mutated in sporadic myeloid and lymphoid leukemia through translocation, point mutation or amplification. It is also responsible for a familial platelet disorder with predisposition to acute myeloid leukemia (FPD-AML). The disruption of the largely unknown biological pathways controlled by RUNX1 is likely to be responsible for the development of leukemia. We have used multiple microarray platforms and bioinformatic techniques to help identify these biological pathways to aid in the understanding of why RUNX1 mutations lead to leukemia. Results: Here we report genes regulated either directly or indirectly by RUNX1 based on the study of gene expression profiles generated from three different human and mouse platforms: global gene expression profiling of 1) cell lines with RUNX1 mutations from FPD-AML patients, 2) over-expression of RUNX1 and CBFβ, and 3) Runx1 knockout mouse embryos, using either cDNA or Affymetrix microarrays. We observe that our datasets (lists of differentially expressed genes) significantly correlate with published microarray data from sporadic AML patients with mutations in either RUNX1 or its cofactor, CBFβ. A number of biological processes were identified among the differentially expressed genes, and functional assays suggest that heterozygous RUNX1 point mutations in patients with FPD-AML impair cell proliferation, microtubule dynamics and possibly genetic stability. In addition, analysis of the regulatory regions of the differentially expressed genes has for the first time systematically identified numerous potential novel RUNX1 target genes. Conclusion: This work is the first large-scale study attempting to identify the genetic networks regulated by RUNX1, a master regulator in the development of the hematopoietic system and leukemia. The biological pathways and target genes controlled by RUNX1 will have considerable importance in disease …

  8. Developing an integrated analysis approach to exoplanetary spectroscopy

    Science.gov (United States)

    Waldmann, Ingo

    2015-07-01

    Analysing the atmospheres of Earth and super-Earth type planets for possible biomarkers will push us to the limits of current and future instrumentation. As the field matures, we must also upgrade our data analysis and interpretation techniques from their "ad-hoc" beginnings to a solid statistical foundation. This is particularly important for the optimal exploitation of future instruments, such as JWST and the E-ELT. At the limits of low signal-to-noise, we are prone to two sources of bias: 1) prior selection in the data reduction; 2) prior constraints on the spectral retrieval. A unified set of tools addressing both points is required. To de-trend low S/N, correlated data, we demonstrated blind-source-separation (BSS) machine learning techniques to be a significant step forward, both in photometry and in spectroscopy. BSS finds applications in fields as diverse as medical imaging and cosmology. Applied to exoplanets, it allows us to resolve de-trending biases and demonstrate consistency between data sets that were previously found to be highly discrepant and subject to much debate. For the interpretation of the data, we developed a novel atmospheric retrieval suite, Tau-REx. Tau-REx implements unbiased prior selection via custom-built pattern recognition software. A full subsequent mapping of the likelihood space (using cluster computing) allows us, for the first time, to fully study degeneracies and biases in emission and transmission spectroscopy. The development of a coherent end-to-end infrastructure is paramount to the characterisation of ever smaller and fainter foreign worlds. In this conference, I will discuss what we have learned from current observations and the need for unified statistical frameworks in the era of JWST and the E-ELT.

  9. The ICVSIE: A General Purpose Integral Equation Method for Bio-Electromagnetic Analysis.

    Science.gov (United States)

    Gomez, Luis J; Yucel, Abdulkadir C; Michielssen, Eric

    2018-03-01

    An internally combined volume surface integral equation (ICVSIE) for analyzing electromagnetic (EM) interactions with biological tissue and wide ranging diagnostic, therapeutic, and research applications, is proposed. The ICVSIE is a system of integral equations in terms of volume and surface equivalent currents in biological tissue subject to fields produced by externally or internally positioned devices. The system is created by using equivalence principles and solved numerically; the resulting current values are used to evaluate scattered and total electric fields, specific absorption rates, and related quantities. The validity, applicability, and efficiency of the ICVSIE are demonstrated by EM analysis of transcranial magnetic stimulation, magnetic resonance imaging, and neuromuscular electrical stimulation. Unlike previous integral equations, the ICVSIE is stable regardless of the electric permittivities of the tissue or frequency of operation, providing an application-agnostic computational framework for EM-biomedical analysis. Use of the general purpose and robust ICVSIE permits streamlining the development, deployment, and safety analysis of EM-biomedical technologies.

  10. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides, such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA), were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis using an independent validation dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.
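    The 76% prediction ability is an area under the ROC curve. That validation step can be sketched with the rank-sum (Mann-Whitney) identity for AUC, which needs no plotting library; all data values below are invented, not the paper's.

```python
# Sketch of ROC-based validation of a susceptibility map. AUC is computed as
# the probability that a randomly chosen landslide cell receives a higher
# susceptibility score than a randomly chosen stable cell (ties count 1/2).

def roc_auc(scores, labels):
    """AUC via the rank-sum identity; labels are 1 (event) or 0 (no event)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented susceptibility index (LSI) at validation cells; 1 = observed slide
lsi    = [0.9, 0.8, 0.75, 0.6, 0.4, 0.35, 0.2, 0.1]
events = [1,   1,   0,    1,   0,   1,    0,   0]
print(roc_auc(lsi, events))  # 0.8125, i.e. ~81% prediction ability here
```

    An AUC of 0.5 would mean the map is no better than chance; the study's 0.76 sits in the range usually read as acceptable discrimination.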

  11. Real analysis an introduction to the theory of real functions and integration

    CERN Document Server

    Dshalalow, Jewgeni H

    2000-01-01

    Designed for use in a two-semester course on abstract analysis, REAL ANALYSIS: An Introduction to the Theory of Real Functions and Integration illuminates the principal topics that constitute real analysis. Self-contained, with coverage of topology, measure theory, and integration, it offers a thorough elaboration of major theorems, notions, and constructions needed not only by mathematics students but also by students of statistics and probability, operations research, physics, and engineering. Structured logically and flexibly through the author's many years of teaching experience, the material is presented in three main sections: Part I, chapters 1 through 3, covers the preliminaries of set theory and the fundamentals of metric spaces and topology; this section can also serve as a text for first courses in topology. Part II, chapters 4 through 7, details the basics of measure and integration and stands independently for use in a separate measure theory course. Part III addresses more advanced topics, including …

  12. Integrative analysis for finding genes and networks involved in diabetes and other complex diseases

    DEFF Research Database (Denmark)

    Bergholdt, R.; Størling, Zenia, Marian; Hansen, Kasper Lage

    2007-01-01

    We have developed an integrative analysis method combining genetic interactions, identified using type 1 diabetes genome scan data, and a high-confidence human protein interaction network. Resulting networks were ranked by the significance of the enrichment of proteins from interacting regions. We identified a number of new protein network modules and novel candidate genes/proteins for type 1 diabetes. We propose this type of integrative analysis as a general method for the elucidation of genes and networks involved in diabetes and other complex diseases.
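    The record does not say which statistic ranks the modules; a common choice for "significance of the enrichment" of marked proteins in a module is a hypergeometric tail probability, sketched here with invented numbers.

```python
# Illustrative sketch (all parameters invented): ranking a protein-network
# module by the enrichment of proteins encoded in genetically interacting
# regions, using a hypergeometric tail probability. Not the authors' exact
# ranking scheme.
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric: N proteins total, K marked,
    n drawn into the module, k of them marked."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Invented example: network of 1000 proteins, 50 from interacting regions;
# a candidate module of 10 proteins contains 4 of those 50.
p = enrichment_p(N=1000, K=50, n=10, k=4)
print(p)  # a small p-value would rank this module as significantly enriched
```

    Modules with smaller tail probabilities would be ranked higher; in practice such p-values also need multiple-testing correction across all candidate modules.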

  13. Integrating computer aided radiography and plantar pressure measurements for complex gait analysis

    International Nuclear Information System (INIS)

    Gefen, A.; Megido-Ravid, M.; Itzchak, Y.; Arcan, M.

    1998-01-01

    … Radiographic Fluoroscopy (DRF) and Contact Pressure Display (CPD). The CPD method uses a birefringent integrated optical sandwich for contact stress analysis, e.g. plantar pressure distribution. The DRF method displays and electronically records skeletal motion using X-ray radiation, providing the exact bone and joint positions during gait. Integrating the two techniques, the contribution of each segment to the HFS behavior may be studied by applying image processing and analysis techniques. The combined resulting data may be used not only to detect and diagnose gait pathologies but also as a base for development of advanced numerical models of the foot structure.

  14. PHIDIAS: a pathogen-host interaction data integration and analysis system.

    Science.gov (United States)

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated from peer-reviewed literature. PHIDIAS is publicly available at http://www.phidias.us.

  15. Microfluidic device for continuous single cells analysis via Raman spectroscopy enhanced by integrated plasmonic nanodimers

    DEFF Research Database (Denmark)

    Perozziello, Gerardo; Candeloro, Patrizio; De Grazia, Antonio

    2016-01-01

    In this work a Raman flow cytometer is presented. It consists of a microfluidic device that takes advantage of the basic principles of Raman spectroscopy and flow cytometry. The microfluidic device integrates calibrated microfluidic channels - where the cells can flow one-by-one - allowing single cell Raman analysis. The microfluidic channel integrates plasmonic nanodimers in a fluidic trapping region. In this way it is possible to perform Enhanced Raman Spectroscopy on single cells. These allow a label-free analysis, providing information about the biochemical content of membrane and cytoplasm …

  16. Integrated analysis of core debris interactions and their effects on containment integrity using the CONTAIN computer code

    International Nuclear Information System (INIS)

    Carroll, D.E.; Bergeron, K.D.; Williams, D.C.; Tills, J.L.; Valdez, G.D.

    1987-01-01

    The CONTAIN computer code includes a versatile system of phenomenological models for analyzing the physical, chemical and radiological conditions inside the containment building during severe reactor accidents. Important contributors to these conditions are the interactions which may occur between released corium and cavity concrete. The phenomena associated with interactions between ejected corium debris and the containment atmosphere (Direct Containment Heating or DCH) also pose a potential threat to containment integrity. In this paper, we describe recent enhancements of the CONTAIN code which allow an integrated analysis of these effects in the presence of other mitigating or aggravating physical processes. In particular, the recent inclusion of the CORCON and VANESA models is described and a calculation example presented. With this capability CONTAIN can model core-concrete interactions occurring simultaneously in multiple compartments and can couple the aerosols thereby generated to the mechanistic description of all atmospheric aerosol components. Also discussed are some recent results of modeling the phenomena involved in Direct Containment Heating. (orig.)

  17. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method for integro-differentiation of non-integral order extends the capabilities of determined factorial economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes …
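    As a concrete illustration of a derivative of non-integral order, a Grünwald-Letnikov-type finite-difference approximation can be sketched. This is a standard construction for fractional derivatives, not necessarily the authors' formulation, and the step size and truncation length below are arbitrary choices.

```python
# Sketch of a Grünwald-Letnikov fractional derivative, one common way to give
# "memory" to a model: the value at t depends on a weighted history f(t - k*h).
# Illustrative only; not the paper's specific method.

def gl_fractional_derivative(f, t, alpha, h=1e-3, n_terms=2000):
    """Approximate D^alpha f at t as h**(-alpha) * sum_k c_k * f(t - k*h),
    with binomial weights c_0 = 1, c_k = c_{k-1} * (1 - (alpha + 1) / k)."""
    total, c = 0.0, 1.0
    for k in range(n_terms):
        total += c * f(t - k * h)
        c *= 1.0 - (alpha + 1.0) / (k + 1)
    return total / h ** alpha

# Sanity check: for alpha = 1 the weights collapse to (f(t) - f(t-h)) / h,
# the ordinary first derivative, so f(t) = t gives 1.
print(round(gl_fractional_derivative(lambda t: t, 2.0, 1.0), 6))
```

    For non-integral alpha the weights never vanish, so the whole history of the factor contributes, which is precisely the memory effect the abstract refers to.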

  18. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering rapid progress in the field exist. Here, we review four of the several available tools for metabolomic data analysis and integration with other 'omics' data, selected on the basis of their number of users or provided features (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Nonlinear Coupling Characteristics Analysis of Integrated System of Electromagnetic Brake and Frictional Brake of Car

    Directory of Open Access Journals (Sweden)

    Ren He

    2015-01-01

    Full Text Available Since theoretical guidance is lacking in the design and control of the integrated system of electromagnetic brake and frictional brake, this paper aims to solve this problem and explores the nonlinear coupling characteristics and dynamic characteristics of the integrated system of electromagnetic brake and frictional brake. This paper uses the power bond graph method to establish nonlinear coupling mathematical model of the integrated system of electromagnetic brake and frictional brake and conducts the contrastive analysis on the dynamic characteristics based on this mathematical model. Meanwhile, the accuracy of the nonlinear coupling mathematical model proposed above is verified on the hardware in the loop simulation platform, and nonlinear coupling characteristics of the integrated system are also analyzed through experiments.

  20. Integrated analysis of wind turbines - The impact of power systems on wind turbine design

    DEFF Research Database (Denmark)

    Barahona Garzón, Braulio

    Megawatt-size wind turbines nowadays operate in very complex environmental conditions, and increasingly demanding power system requirements. Pursuing a cost-effective and reliable wind turbine design is a multidisciplinary task. However nowadays, wind turbine design and research areas...... conditions that stem from disturbances in the power system. An integrated simulation environment, wind turbine models, and power system models are developed in order to take an integral perspective that considers the most important aeroelastic, structural, electrical, and control dynamics. Applications...... of the integrated simulation environment are presented. The analysis of an asynchronous machine, and numerical simulations of a fixedspeed wind turbine in the integrated simulation environment, demonstrate the effects on structural loads of including the generator rotor fluxes dynamics in aeroelastic studies. Power...

  1. Integration of hydrothermal carbonization and a CHP plant: Part 2 –operational and economic analysis

    International Nuclear Information System (INIS)

    Saari, Jussi; Sermyagina, Ekaterina; Kaikko, Juha; Vakkilainen, Esa; Sergeev, Vitaly

    2016-01-01

Wood-fired combined heat and power (CHP) plants are a proven technology for producing domestic, carbon-neutral heat and power in Nordic countries. One drawback of CHP plants is the low capacity factors due to varying heat loads. In the current economic environment, uncertainty over energy prices also creates uncertainty over investment profitability. Hydrothermal carbonization (HTC) is a promising thermochemical conversion technology for producing an improved, more versatile wood-based fuel. Integrating HTC with a CHP plant allows simplifying the HTC process and extending the CHP plant operating time. An integrated polygeneration plant producing three energy products is also less sensitive to price changes in any one product. This study compares three integration cases chosen from the previous paper, and the case of separate stand-alone plants. The best economic performance is obtained using pressurized hot water from the CHP plant boiler drum as HTC process water. This has the poorest efficiency, but allows the greatest cost reduction in the HTC process and longest CHP plant operating time. The result demonstrates the suitability of CHP plants for integration with a HTC process, and the importance of economic and operational analysis considering annual load variations in sufficient detail. - Highlights: • Integration of wood hydrothermal carbonization with a small CHP plant studied. • Operation and economics of three concepts and stand-alone plants are compared. • Sensitivity analysis is performed. • Results show technical and thermodynamic analysis inadequate and misleading alone. • Minimizing HTC investment, extending CHP operating time important for profitability.

  2. J-integral evaluation and stability analysis in the unstable ductile fracture

    International Nuclear Information System (INIS)

    Miyoshi, Toshiro; Yoshida, Yuichiro; Shiratori, Masaki.

    1984-01-01

Concerning unstable ductile fracture, which is an important problem for the structural stability of line pipes, nuclear reactor piping and so on, research on the fracture mechanics parameters which control the beginning of stable growth and unstable growth of cracks attracts interest. At present, the proposed parameters include the T-modulus based on the J-integral, the crack tip opening angle (CTOA), the crack opening angle averaged over the crack growth region (COA), the plastic work coefficient and so on. Research on the effectiveness and inter-relation of these parameters is divided into generation phase and application phase, and these studies reported that the T-modulus, CTOA and COA all took almost constant values during crack growth, except for an initial transition period. In order to decide which parameter is most appropriate, a detailed analysis is required. In this study, the analysis of unstable ductile fracture of a central crack test piece and a small tensile test piece was carried out by the finite element method, and the evaluation of the J-integral during crack growth, the J-integral resistance value when COA is assumed constant, the form of the unstable fracture occurring point and the compliance dependence were examined. The method of analysis, the evaluation of the J-integral, the J-integral resistance value, the unstable fracture occurring point and the stability diagram are described. (Kako, I.)
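The tearing-modulus stability criterion discussed above can be sketched numerically. The following is a minimal illustration, not the paper's finite element analysis: it fits a power law to a hypothetical J-R curve and evaluates the standard material tearing modulus T_mat = (E/σ0²)·dJ/da; crack growth is predicted to turn unstable once the applied tearing modulus exceeds T_mat. All material data below are assumed for illustration.

```python
import numpy as np

# Hypothetical J-R curve: crack extension da [mm] vs J-resistance [kJ/m^2].
da = np.array([0.2, 0.5, 1.0, 1.5, 2.0])
J_R = np.array([120.0, 180.0, 240.0, 285.0, 320.0])

# Fit the common power-law resistance curve J = C * da**m in log-log space.
m, logC = np.polyfit(np.log(da), np.log(J_R), 1)
C = np.exp(logC)

# Material tearing modulus T_mat = (E / sigma0^2) * dJ/da at a chosen extension.
# With J in kJ/m^2 and da in mm, dJ/da is already numerically in MPa.
E = 200e3        # Young's modulus, MPa (assumed)
sigma0 = 400.0   # flow stress, MPa (assumed)
da_eval = 1.0    # crack extension at which stability is checked, mm
dJ_da = C * m * da_eval ** (m - 1.0)
T_mat = (E / sigma0**2) * dJ_da

# Growth is stable while the applied tearing modulus stays below T_mat.
T_app = 50.0     # assumed applied tearing modulus from a compliance analysis
print(f"T_mat = {T_mat:.1f}, stable: {T_app < T_mat}")
```

The same comparison, repeated along a loading path, yields the stability diagram mentioned in the abstract.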

  3. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

    Full Text Available This paper investigates a framework of real-time formation of autonomous vehicles by using potential field and variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by potential field can meet the former requirement which results in a nonlinear system. Stability analysis of such nonlinear system is difficult. Our methodology of stability analysis is discussed in error dynamic system. Transformation of coordinates from inertial frame to body frame can help the stability analysis focus on the structure instead of particular coordinates. Then, the Jacobian of reduced system can be calculated. It can be proved that the formation is stable at the equilibrium point of error dynamic system with the effect of damping force. For consideration of calculation, variational integrator is introduced. It is equivalent to solving algebraic equations. Forced Euler-Lagrange equation in discrete expression is used to construct a forced variational integrator for vehicles in potential field and obstacle environment. By applying forced variational integrator on computation of vehicles' motion, real-time formation of vehicles in obstacle environment can be implemented. Algorithm based on forced variational integrator is designed for a leader-follower formation.
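The forced variational integrator described above can be sketched for a single vehicle (the leader-follower coupling is omitted). The discrete forced Euler-Lagrange update below is the Störmer-Verlet form; the potential field combines quadratic attraction to a goal with short-range obstacle repulsion, and all positions, gains, mass and step size are illustrative assumptions.

```python
import numpy as np

goal, obstacle = np.array([5.0, 5.0]), np.array([2.5, 2.5])
k_att, k_rep, c_damp, mass, h = 1.0, 4.0, 1.5, 1.0, 0.01

def grad_V(q):
    """Gradient of quadratic attraction to the goal plus 1/r^2 obstacle repulsion."""
    d = q - obstacle
    r = np.linalg.norm(d)
    return k_att * (q - goal) - k_rep * d / r**4

# Discrete forced Euler-Lagrange update (Stormer-Verlet form):
#   mass * (q_next - 2 q + q_prev) / h^2 = -grad_V(q) - c_damp * v
q_prev = np.array([0.0, 0.3])   # start slightly off the goal-obstacle axis
q = q_prev.copy()               # zero initial velocity
for _ in range(5000):
    v = (q - q_prev) / h
    f = -grad_V(q) - c_damp * v
    q_prev, q = q, 2.0 * q - q_prev + (h**2 / mass) * f

# The damping force drives the vehicle around the obstacle to the stable
# equilibrium near the goal, as in the paper's stability analysis.
print(np.round(q, 2))
```

Each step only evaluates the field gradient and solves an explicit algebraic update, which is what makes the scheme attractive for real-time formation control.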

  4. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

Highlights: • Solar energy integrated with natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analyses of the two processes are conducted. • The proposed process can cut down the greenhouse gas emission. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as raw material has a short processing route and well developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar energy power generation produces electricity for the reforming unit and system consumption in the solar energy integrated natural-gas-to-methanol system (SGTM). Performance analyses of the conventional natural-gas-to-methanol process and the solar energy integrated process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that the solar energy integrated natural-gas-to-methanol process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing natural gas consumption by an amount equal to 9.2% of the total consumed natural gas. However, it is not economical at the current technology readiness level, compared with the conventional natural-gas-to-methanol process.

  5. Brain Network Analysis: Separating Cost from Topology Using Cost-Integration

    Science.gov (United States)

    Ginestet, Cedric E.; Nichols, Thomas E.; Bullmore, Ed T.; Simmons, Andrew

    2011-01-01

    A statistically principled way of conducting brain network analysis is still lacking. Comparison of different populations of brain networks is hard because topology is inherently dependent on wiring cost, where cost is defined as the number of edges in an unweighted graph. In this paper, we evaluate the benefits and limitations associated with using cost-integrated topological metrics. Our focus is on comparing populations of weighted undirected graphs that differ in mean association weight, using global efficiency. Our key result shows that integrating over cost is equivalent to controlling for any monotonic transformation of the weight set of a weighted graph. That is, when integrating over cost, we eliminate the differences in topology that may be due to a monotonic transformation of the weight set. Our result holds for any unweighted topological measure, and for any choice of distribution over cost levels. Cost-integration is therefore helpful in disentangling differences in cost from differences in topology. By contrast, we show that the use of the weighted version of a topological metric is generally not a valid approach to this problem. Indeed, we prove that, under weak conditions, the use of the weighted version of global efficiency is equivalent to simply comparing weighted costs. Thus, we recommend the reporting of (i) differences in weighted costs and (ii) differences in cost-integrated topological measures with respect to different distributions over the cost domain. We demonstrate the application of these techniques in a re-analysis of an fMRI working memory task. We also provide a Monte Carlo method for approximating cost-integrated topological measures. Finally, we discuss the limitations of integrating topology over cost, which may pose problems when some weights are zero, when multiplicities exist in the ranks of the weights, and when one expects subtle cost-dependent topological differences, which could be masked by cost-integration. 
PMID:21829437
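The key result above, that cost-integration controls for any monotonic transformation of the weight set, can be illustrated directly: thresholding a weighted graph at a series of cost levels depends only on the rank order of the edge weights, so cubing the weights leaves the cost-integrated global efficiency unchanged. The graph size, random weights and uniform cost grid below are illustrative, not from the paper.

```python
import numpy as np
from collections import deque

def global_efficiency(adj):
    """Unweighted global efficiency: mean of 1/d(i,j) over node pairs, via BFS."""
    n = len(adj)
    total = 0.0
    for s in range(n):
        dist = [-1] * n
        dist[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] < 0:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for d in dist if d > 0)
    return total / (n * (n - 1))

def cost_integrated_efficiency(W, costs):
    """Average global efficiency over graphs thresholded at each cost level,
    where cost = fraction of strongest edges kept (uniform distribution)."""
    n = W.shape[0]
    iu, ju = np.triu_indices(n, 1)
    order = np.argsort(W[iu, ju])[::-1]       # edge ranks, strongest first
    values = []
    for c in costs:
        k = int(round(c * len(order)))
        adj = np.zeros((n, n), dtype=int)
        adj[iu[order[:k]], ju[order[:k]]] = 1
        adj = adj | adj.T                     # symmetrize
        values.append(global_efficiency(adj))
    return float(np.mean(values))

rng = np.random.default_rng(0)
A = rng.random((12, 12))
W = np.triu(A, 1) + np.triu(A, 1).T           # symmetric weights, zero diagonal
costs = np.linspace(0.1, 0.9, 9)
e1 = cost_integrated_efficiency(W, costs)
e2 = cost_integrated_efficiency(W**3, costs)  # monotonic transform of weights
print(e1 == e2)                               # prints True
```

By contrast, a weighted efficiency computed on W and on W**3 would generally differ, which is the paper's argument against weighted versions of topological metrics.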

  6. Brain network analysis: separating cost from topology using cost-integration.

    Directory of Open Access Journals (Sweden)

    Cedric E Ginestet

Full Text Available A statistically principled way of conducting brain network analysis is still lacking. Comparison of different populations of brain networks is hard because topology is inherently dependent on wiring cost, where cost is defined as the number of edges in an unweighted graph. In this paper, we evaluate the benefits and limitations associated with using cost-integrated topological metrics. Our focus is on comparing populations of weighted undirected graphs that differ in mean association weight, using global efficiency. Our key result shows that integrating over cost is equivalent to controlling for any monotonic transformation of the weight set of a weighted graph. That is, when integrating over cost, we eliminate the differences in topology that may be due to a monotonic transformation of the weight set. Our result holds for any unweighted topological measure, and for any choice of distribution over cost levels. Cost-integration is therefore helpful in disentangling differences in cost from differences in topology. By contrast, we show that the use of the weighted version of a topological metric is generally not a valid approach to this problem. Indeed, we prove that, under weak conditions, the use of the weighted version of global efficiency is equivalent to simply comparing weighted costs. Thus, we recommend the reporting of (i) differences in weighted costs and (ii) differences in cost-integrated topological measures with respect to different distributions over the cost domain. We demonstrate the application of these techniques in a re-analysis of an fMRI working memory task. We also provide a Monte Carlo method for approximating cost-integrated topological measures.
Finally, we discuss the limitations of integrating topology over cost, which may pose problems when some weights are zero, when multiplicities exist in the ranks of the weights, and when one expects subtle cost-dependent topological differences, which could be masked by cost-integration.

  7. Double path-integral migration velocity analysis: a real data example

    International Nuclear Information System (INIS)

    Costa, Jessé C; Schleicher, Jörg

    2011-01-01

    Path-integral imaging forms an image with no knowledge of the velocity model by summing over the migrated images obtained for a set of migration velocity models. Double path-integral imaging migration extracts the stationary velocities, i.e. those velocities at which common-image gathers align horizontally, as a byproduct. An application of the technique to a real data set demonstrates that quantitative information about the time migration velocity model can be determined by double path-integral migration velocity analysis. Migrated images using interpolations with different regularizations of the extracted velocities prove the high quality of the resulting time-migration velocity information. The so-obtained velocity model can then be used as a starting model for subsequent velocity analysis tools like migration tomography or other tomographic methods

  8. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    Science.gov (United States)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  9. IMG 4 version of the integrated microbial genomes comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Pillay, Manoj; Ratner, Anna; Huang, Jinghua; Woyke, Tanja; Huntemann, Marcel; Anderson, Iain; Billis, Konstantinos; Varghese, Neha; Mavromatis, Konstantinos; Pati, Amrita; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2014-01-01

    The Integrated Microbial Genomes (IMG) data warehouse integrates genomes from all three domains of life, as well as plasmids, viruses and genome fragments. IMG provides tools for analyzing and reviewing the structural and functional annotations of genomes in a comparative context. IMG’s data content and analytical capabilities have increased continuously since its first version released in 2005. Since the last report published in the 2012 NAR Database Issue, IMG’s annotation and data integration pipelines have evolved while new tools have been added for recording and analyzing single cell genomes, RNA Seq and biosynthetic cluster data. Different IMG datamarts provide support for the analysis of publicly available genomes (IMG/W: http://img.jgi.doe.gov/w), expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er) and teaching and training in the area of microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu). PMID:24165883

  10. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)

  11. IMG 4 version of the integrated microbial genomes comparative analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Chen, I-Min A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Palaniappan, Krishna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Chu, Ken [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Szeto, Ernest [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Pillay, Manoj [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Ratner, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Huang, Jinghua [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Woyke, Tanja [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Huntemann, Marcel [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Anderson, Iain [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Billis, Konstantinos [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Varghese, Neha [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). 
Microbial Genome and Metagenome Program; Mavromatis, Konstantinos [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Pati, Amrita [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Ivanova, Natalia N. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Kyrpides, Nikos C. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program

    2013-10-27

    The Integrated Microbial Genomes (IMG) data warehouse integrates genomes from all three domains of life, as well as plasmids, viruses and genome fragments. IMG provides tools for analyzing and reviewing the structural and functional annotations of genomes in a comparative context. IMG’s data content and analytical capabilities have increased continuously since its first version released in 2005. Since the last report published in the 2012 NAR Database Issue, IMG’s annotation and data integration pipelines have evolved while new tools have been added for recording and analyzing single cell genomes, RNA Seq and biosynthetic cluster data. Finally, different IMG datamarts provide support for the analysis of publicly available genomes (IMG/W: http://img.jgi.doe.gov/w), expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er) and teaching and training in the area of microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu).

  12. Implementation of the structural integrity analysis for PWR primary components and piping

    International Nuclear Information System (INIS)

    Pellissier-Tanon, A.

    1982-01-01

    The trends on the definition, the assessment and the application of fracture strength evaluation methodology, which have arisen through experience in the design, construction and operation of French 900-MW plants are reviewed. The main features of the methodology proposed in a draft of Appendix ZG of the RCC-M code of practice for the design verification of fracture strength of primary components are presented. The research programs are surveyed and discussed from four viewpoints, first implementation of the LEFM analysis, secondly implementation of the fatigue crack propagation analysis, thirdly analysis of vessel integrity during emergency core cooling, and fourthly methodology for tear fracture analysis. (author)

  13. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    LENUS (Irish Health Repository)

    Haase, Trutz

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare.

  14. SIGMA: A System for Integrative Genomic Microarray Analysis of Cancer Genomes

    Directory of Open Access Journals (Sweden)

    Davies Jonathan J

    2006-12-01

Full Text Available Abstract Background The prevalence of high resolution profiling of genomes has created a need for the integrative analysis of information generated from multiple methodologies and platforms. Although the majority of data in the public domain are gene expression profiles, and expression analysis software is available, the increase of array CGH studies has enabled integration of high throughput genomic and gene expression datasets. However, tools for direct mining and analysis of array CGH data are limited. Hence, there is a great need for analytical and display software tailored to cross platform integrative analysis of cancer genomes. Results We have created a user-friendly java application to facilitate sophisticated visualization and analysis such as cross-tumor and cross-platform comparisons. To demonstrate the utility of this software, we assembled array CGH data representing Affymetrix SNP chip, Stanford cDNA arrays and whole genome tiling path array platforms for cross comparison. This cancer genome database contains 267 profiles from commonly used cancer cell lines representing 14 different tissue types. Conclusion In this study we have developed an application for the visualization and analysis of data from high resolution array CGH platforms that can be adapted for analysis of multiple types of high throughput genomic datasets. Furthermore, we invite researchers using array CGH technology to deposit both their raw and processed data, as this will be a continually expanding database of cancer genomes. This publicly available resource, the System for Integrative Genomic Microarray Analysis (SIGMA) of cancer genomes, can be accessed at http://sigma.bccrc.ca.

  15. Assessment of the TRINO reactor pressure vessel integrity: theoretical analysis and NDE

    Energy Technology Data Exchange (ETDEWEB)

Milella, P P; Pini, A [ENEA, Rome (Italy)]

    1988-12-31

    This document presents the method used for the capability assessment of the Trino reactor pressure vessel. The vessel integrity assessment is divided into the following parts: transients evaluation and selection, fluence estimate for the projected end of life of the vessel, characterization of unirradiated and irradiated materials, thermal and stress analysis, fracture mechanics analysis and eventually fracture input to Non Destructive Examination (NDE). For each part, results are provided. (TEC).

  16. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    Science.gov (United States)

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined rational algorithmic analysis workflows or standardized batch processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration with other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a

  17. Man-Machine Integrated Design and Analysis System (MIDAS): Functional Overview

    Science.gov (United States)

    Corker, Kevin; Neukom, Christian

    1998-01-01

The included series of screen print-outs illustrates the structure and function of the Man-Machine Integrated Design and Analysis System (MIDAS). Views into the use of the system and its editors are featured. The use-case in this set of graphs includes the development of a simulation scenario.

  18. Online analysis of oxygen inside silicon-glass microreactors with integrated optical sensors

    DEFF Research Database (Denmark)

    Ehgartner, Josef; Sulzer, Philipp; Burger, Tobias

    2016-01-01

    A powerful online analysis set-up for oxygen measurements within microfluidic devices is presented. It features integration of optical oxygen sensors into microreactors, which enables contactless, accurate and inexpensive readout using commercially available oxygen meters via luminescent lifetime...... monitoring of enzyme transformations, including d-alanine or d-phenylalanine oxidation by d-amino acid oxidase, and glucose oxidation by glucose oxidase....

  19. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.
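The mass balance closure idea above can be sketched in a few lines: sum the independently measured constituent fractions for a sample and report how much of the dry weight remains unassigned. The constituent values below are illustrative, not from the LAP.

```python
# Hypothetical constituent measurements for one algal biomass sample,
# each expressed as % of dry weight (illustrative values only).
constituents = {
    "ash": 8.1,
    "total protein": 27.4,
    "total lipids": 18.9,
    "total carbohydrates": 41.2,
    "chlorophyll": 1.6,
}
closure = sum(constituents.values())   # summative mass, % of dry weight
unassigned = 100.0 - closure           # gap to full mass balance closure
print(f"mass balance closure: {closure:.1f}% ({unassigned:.1f}% unassigned)")
```

A closure well below 100% flags either an unmeasured constituent class or a bias in one of the integrated analytical procedures.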

20. Painlevé analysis and integrability of two-coupled non-linear ...

    Indian Academy of Sciences (India)

the Painlevé property. In this case the system is expected to be integrable. In recent years more attention is paid to the study of coupled non-linear oscillators ... Painlevé analysis. To be self-contained, in §2 we briefly outline the salient features.

  1. Integral cost-benefit analysis of Maglev technology under market imperfections

    NARCIS (Netherlands)

    Elhorst, J. Paul; Oosterhaven, Jan; Romp, Ward E.

    2001-01-01

The aim of this article is to assess a proposed new mode of guided high speed ground transportation, the magnetic levitation rail system (Maglev), and to compare the results of a partial cost-benefit analysis with those of an integral CBA. We deal with an urban conglomeration as well as a

  2. Explaining Technology Integration in K-12 Classrooms: A Multilevel Path Analysis Model

    Science.gov (United States)

    Liu, Feng; Ritzhaupt, Albert D.; Dawson, Kara; Barron, Ann E.

    2017-01-01

    The purpose of this research was to design and test a model of classroom technology integration in the context of K-12 schools. The proposed multilevel path analysis model includes teacher, contextual, and school related variables on a teacher's use of technology and confidence and comfort using technology as mediators of classroom technology…

  3. Integrative analysis of histone ChIP-seq and transcription data using Bayesian mixture models

    DEFF Research Database (Denmark)

    Klein, Hans-Ulrich; Schäfer, Martin; Porse, Bo T

    2014-01-01

    Histone modifications are a key epigenetic mechanism to activate or repress the transcription of genes. Datasets of matched transcription data and histone modification data obtained by ChIP-seq exist, but methods for integrative analysis of both data types are still rare. Here, we present a novel...

  4. Integrative Genomic Analysis of Cholangiocarcinoma Identifies Distinct IDH-Mutant Molecular Profiles

    DEFF Research Database (Denmark)

    Farshidfar, Farshad; Zheng, Siyuan; Gingras, Marie-Claude

    2017-01-01

    Cholangiocarcinoma (CCA) is an aggressive malignancy of the bile ducts, with poor prognosis and limited treatment options. Here, we describe the integrated analysis of somatic mutations, RNA expression, copy number, and DNA methylation by The Cancer Genome Atlas of a set of predominantly intrahep...

  5. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  6. Fuzzy Decision Analysis for Integrated Environmental Vulnerability Assessment of the Mid-Atlantic Region

    Science.gov (United States)

    Liem T. Tran; C. Gregory Knight; Robert V. O'Neill; Elizabeth R. Smith; Kurt H. Riitters; James D. Wickham

    2002-01-01

    A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams,...
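
The AHP step described above can be sketched numerically: given a pairwise comparison matrix on Saaty's 1-9 scale, the priority (weight) vector is the normalized principal eigenvector. The 3×3 matrix below is hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three ecological indicators
# (Saaty scale: entry [i, j] is how strongly indicator i is preferred to j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

def ahp_priorities(A):
    """AHP priority vector: normalized principal eigenvector of A."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)          # index of the principal eigenvalue
    w = np.abs(vecs[:, k].real)       # eigenvector sign is arbitrary
    return w / w.sum()

w = ahp_priorities(A)
print(w)  # weights sum to 1; here the first indicator dominates
```

In practice a consistency check (comparing the principal eigenvalue to the matrix size) would precede use of the weights; the fuzzy ranking component of the cited method is a separate step layered on top.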

  7. Multi-color fluorescent DNA analysis in an integrated optofluidic lab-on-a-chip

    NARCIS (Netherlands)

    Dongre, C.; van Weerd, J.; van Weeghel, R.; Martinez-Vazquez, R.; Osellame, R.; Cerullo, G.; Besselink, G.A.J.; van den Vlekkert, H.H.; Hoekstra, Hugo; Pollnau, Markus

    Sorting and sizing of DNA molecules within the human genome project has enabled the genetic mapping of various illnesses. By employing tiny lab-on-a-chip devices for such DNA analysis, integrated DNA sequencing and genetic diagnostics have become feasible. However, such diagnostic chips typically

  8. Flipping the Audience Script: An Activity That Integrates Research and Audience Analysis

    Science.gov (United States)

    Lam, Chris; Hannah, Mark A.

    2016-01-01

    This article describes a flipped classroom activity that requires students to integrate research and audience analysis. The activity uses Twitter as a data source. In the activity, students identify a sample, collect customer tweets, and analyze the language of the tweets in an effort to construct knowledge about an audience's values, needs, and…

  9. 3-D fracture analysis using a partial-reduced integration scheme

    International Nuclear Information System (INIS)

    Leitch, B.W.

    1987-01-01

    This paper presents details of 3-D elastic-plastic analyses of an axially orientated external surface flaw in an internally pressurized thin-walled cylinder and discusses the variation of the J-integral values around the crack tip. A partial-reduced-integration-penalty method is introduced to minimize this variation of the J-integral near the crack tip. Utilizing 3-D symmetry, an eighth segment of a tube containing an elliptically shaped external surface flaw is modelled using 20-noded isoparametric elements. The crack-tip elements are collapsed to form a 1/r stress singularity about the curved crack front. The finite element model is subjected to internal pressure and axial pressure-generated loads. The virtual crack extension method is used to determine linear elastic stress intensity factors from the J-integral results at various points around the crack front. Despite the different material constants and the thinner wall thickness in this analysis, the elastic results compare favourably with those obtained by other researchers. The nonlinear stress-strain behaviour of the tube material is modelled using an incremental theory of plasticity. Variations of the J-integral values around the curved crack front of the 3-D flaw were seen. These variations could neither be resolved by neglecting the immediate crack-tip elements' J-integral results in favour of the more remote contour paths nor smoothed out when all the path results were averaged. Numerical incompatibilities in the 20-noded 3-D finite elements used to model the surface flaw were found. A partial-reduced integration scheme, using a combination of full and reduced integration elements, is proposed to determine J-integral results for 3-D fracture analyses. This procedure is applied to the analysis of an external semicircular surface flaw projecting halfway into the tube wall thickness. Examples of the J-integral values, before and after the partial-reduced integration method is employed, are given around the

  10. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  11. A hybrid approach to device integration on a genetic analysis platform

    International Nuclear Information System (INIS)

    Brennan, Des; Justice, John; Aherne, Margaret; Galvin, Paul; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Macek, Milan

    2012-01-01

    Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization. (paper)

  12. Analysis of Elastic-Plastic J Integrals for 3-Dimensional Cracks Using Finite Element Alternating Method

    International Nuclear Information System (INIS)

    Park, Jai Hak

    2009-01-01

    SGBEM(Symmetric Galerkin Boundary Element Method)-FEM alternating method has been proposed by Nikishkov, Park and Atluri. In the proposed method, arbitrarily shaped three-dimensional crack problems can be solved by alternating between the crack solution in an infinite body and the finite element solution without a crack. In the previous study, the SGBEM-FEM alternating method was extended further in order to solve elastic-plastic crack problems and to obtain elastic-plastic stress fields. For the elastic-plastic analysis the algorithm developed by Nikishkov et al. is used after modification. In the algorithm, the initial stress method is used to obtain elastic-plastic stress and strain fields. In this paper, elastic-plastic J integrals for three-dimensional cracks are obtained using the method. For that purpose, accurate values of displacement gradients and stresses are necessary on an integration path. In order to improve the accuracy of stress near crack surfaces, coordinate transformation and partitioning of integration domain are used. The coordinate transformation produces a transformation Jacobian, which cancels the singularity of the integrand. Using the developed program, simple three-dimensional crack problems are solved and elastic and elastic-plastic J integrals are obtained. The obtained J integrals are compared with the values obtained using a handbook solution. It is noted that J integrals obtained from the alternating method are close to the values from the handbook
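
The role of the transformation Jacobian in cancelling the crack-tip singularity can be shown on a one-dimensional model integral, ∫₀¹ r^(-1/2) dr = 2. Plain Gauss-Legendre quadrature handles the singular integrand poorly, while the substitution r = t² removes the singularity exactly. This is illustrative only, not the paper's 3-D integration scheme.

```python
import numpy as np

# 20-point Gauss-Legendre rule on [-1, 1], mapped to (0, 1).
nodes, weights = np.polynomial.legendre.leggauss(20)
r = 0.5 * (nodes + 1.0)
w = 0.5 * weights

# Direct quadrature of the singular integrand 1/sqrt(r): inaccurate,
# because no polynomial rule captures the blow-up at r = 0.
naive = np.sum(w / np.sqrt(r))

# Substitution r = t**2 gives dr = 2*t dt; the Jacobian 2*t cancels
# the 1/sqrt(r) = 1/t singularity, leaving the constant integrand 2.
transformed = np.sum(w * 2.0)

print(naive, transformed)  # the exact value is 2
```

The same principle underlies the coordinate transformation in the paper: once the Jacobian cancels the singular factor, standard quadrature of the displacement-gradient and stress terms on the integration path becomes accurate.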

  13. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data.

    Science.gov (United States)

    Tomescu, Oana A; Mattanovich, Diethard; Thallinger, Gerhard G

    2014-01-01

    Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition has shown its potential in the analysis of two transcriptome data sets. Integrative Biclustering applies biclustering to gene and protein data. Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO Terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life cycle stage specific GO terms: Sporozoites are associated with transcription and transport; merozoites with entry into host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages
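
Of the three techniques, co-inertia analysis is the easiest to sketch: after column-centring both tables, the SVD of their cross-covariance matrix yields paired axes that maximize the covariance between the gene-side and protein-side projections. The matrices below are random stand-ins with six rows for six life cycle stages, not the Plasmodium data.

```python
import numpy as np

rng = np.random.default_rng(1)
stages = 6                                  # six life cycle stages
X = rng.normal(size=(stages, 10))           # stage x gene abundances
# Protein table partly driven by the gene table, plus noise.
Y = X[:, :8] @ rng.normal(size=(8, 7)) + 0.1 * rng.normal(size=(stages, 7))

# Column-centre both tables, then SVD of the cross-covariance matrix.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc.T @ Yc / stages, full_matrices=False)

# Co-inertia axes: project each table onto its own singular vectors;
# the paired scores place the stages in a common low-dimensional plane.
gene_scores = Xc @ U[:, 0]
prot_scores = Yc @ Vt[0]

# RV coefficient: a 0..1 summary of how much structure the tables share.
rv = np.sum((Xc.T @ Yc) ** 2) / np.sqrt(
    np.sum((Xc.T @ Xc) ** 2) * np.sum((Yc.T @ Yc) ** 2))
print(rv)
```

Plotting `gene_scores` against `prot_scores` per stage is the kind of 2D configuration the paper interprets; GSVD and integrative biclustering impose additional structure that this sketch omits.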

  14. TRAC-CFD code integration and its application to containment analysis

    International Nuclear Information System (INIS)

    Tahara, M.; Arai, K.; Oikawa, H.

    2004-01-01

    Several safety systems utilizing natural driving force have been recently adopted for operating reactors, or applied to next-generation reactor design. Examples of these safety systems are the Passive Containment Cooling System (PCCS) and the Drywell Cooler (DWC) for removing decay heat, and the Passive Auto-catalytic Recombiner (PAR) for removing flammable gas in reactor containment during an accident. DWC is used in almost all Boiling Water Reactors (BWR) in service. PAR has been introduced for some reactors in Europe and will be introduced for Japanese reactors. PCCS is a safety device of next-generation BWR. The functional mechanism of these safety systems is closely related to the transient of the thermal-hydraulic condition of the containment atmosphere. The performance depends on the containment atmospheric condition, which is eventually affected by the mass and energy changes caused by the safety system. Therefore, the thermal fluid dynamics in the containment vessel should be appropriately considered in detail to properly estimate the performance of these systems. A computational fluid dynamics (CFD) code is useful for evaluating detailed thermal hydraulic behavior related to this equipment. However, it also requires a considerable amount of computational resources when it is applied to whole containment system transient analysis. The paper describes the method and structure of the integrated analysis tool, and discusses the results of its application to the start-up behavior analysis of a containment cooling system, a drywell local cooler. The integrated analysis code was developed and applied to estimate the DWC performance during a severe accident. The integrated analysis tool is composed of three codes, TRAC-PCV, CFD-DW and TRAC-CC, and analyzes the interaction of the natural convection and steam condensation of the DWC as well as analyzing the thermal hydraulic transient behavior of the containment vessel during a severe accident in detail. The

  15. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    Full Text Available The article investigates the integration activity of business structures in the regions of Russia. A wide variety of approaches to the study of the problems and prospects of economic integration, and the current dispute on the role of integration processes in regional economic development, have made it necessary to clarify the concepts “integration” and “integration activities” in order to develop objective conditions for analysing the integration activity of business structures in the Russian regions. The monitoring of the current legal system of the Russian Federation carried out in the area of statistics and compiling statistical databases on mergers and acquisitions has shown the absence of a formal executive authority dealing with the compiling and collection of information on integration activity at the regional level. In this connection, the data of Russian information and analytical agencies form the information and analytical base of the study. As the research tools, the methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the integration activity structure and dynamics in the subjects of the Russian Federation based on the statistical data for the period from 2003 to 2012 has revealed the increasing heterogeneity of the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing

  16. Human papilloma viruses and cervical tumours: mapping of integration sites and analysis of adjacent cellular sequences

    International Nuclear Information System (INIS)

    Klimov, Eugene; Vinokourova, Svetlana; Moisjak, Elena; Rakhmanaliev, Elian; Kobseva, Vera; Laimins, Laimonis; Kisseljov, Fjodor; Sulimova, Galina

    2002-01-01

    In cervical tumours the integration of human papilloma virus (HPV) DNA often results in the generation of transcripts that consist of hybrids of viral and cellular sequences. Mapping data using a variety of techniques has demonstrated that HPV integration occurred without obvious specificity into the human genome. However, these techniques could not demonstrate whether integration resulted in the generation of transcripts encoding viral or viral-cellular sequences. The aim of this work was to map the integration sites of HPV DNA and to analyse the adjacent cellular sequences. Amplification of the INTs was done by the APOT technique. The APOT products were sequenced according to standard protocols. The analysis of the sequences was performed using the BLASTN program and public databases. To localise the INTs, PCR-based screening of the GeneBridge4-RH-panel was used. Twelve cellular sequences adjacent to integrated HPV16 (INT markers) expressed in squamous cell cervical carcinomas were isolated. For 11 INT markers homologous human genomic sequences were readily identified and 9 of these showed significant homologies to known genes/ESTs. Using the known locations of homologous cDNAs and the RH-mapping techniques, mapping studies showed that the INTs are distributed among different human chromosomes for each tumour sample and are located in regions with high levels of expression. Integration of HPV genomes occurs into different human chromosomes but into regions that contain highly transcribed genes. One interpretation of these studies is that integration of HPV occurs into decondensed regions, which are more accessible for integration of foreign DNA

  17. Integrating enzyme fermentation in lignocellulosic ethanol production: life-cycle assessment and techno-economic analysis.

    Science.gov (United States)

    Olofsson, Johanna; Barta, Zsolt; Börjesson, Pål; Wallberg, Ola

    2017-01-01

    Cellulase enzymes have been reported to contribute a significant share of the total costs and greenhouse gas emissions of lignocellulosic ethanol production today. A potential future alternative to purchasing enzymes from an off-site manufacturer is to integrate enzyme and ethanol production, using microorganisms and part of the lignocellulosic material as feedstock for enzymes. This study modelled two such integrated process designs for ethanol from logging residues from spruce production, and compared it to an off-site case based on existing data regarding purchased enzymes. Greenhouse gas emissions and primary energy balances were studied in a life-cycle assessment, and cost performance in a techno-economic analysis. The base case scenario suggests that greenhouse gas emissions per MJ of ethanol could be significantly lower in the integrated cases than in the off-site case. However, the difference between the integrated and off-site cases is reduced with alternative assumptions regarding enzyme dosage and the environmental impact of the purchased enzymes. The comparison of primary energy balances did not show any significant difference between the cases. The minimum ethanol selling price, to reach break-even costs, was from 0.568 to 0.622 EUR L⁻¹ for the integrated cases, as compared to 0.581 EUR L⁻¹ for the off-site case. An integrated process design could reduce greenhouse gas emissions from lignocellulose-based ethanol production, and the cost of an integrated process could be comparable to purchasing enzymes produced off-site. This study focused on the environmental and economic assessment of an integrated process, and in order to strengthen the comparison to the off-site case, more detailed and updated data regarding industrial off-site enzyme production are especially important.

  18. Requirement analysis and architecture of data communication system for integral reactor

    International Nuclear Information System (INIS)

    Jeong, K. I.; Kwon, H. J.; Park, J. H.; Park, H. Y.; Koo, I. S.

    2005-05-01

    When digitalizing the Instrumentation and Control (I and C) systems in Nuclear Power Plants (NPP), a communication network is required for exchanging the digitalized data between I and C equipment in a NPP. A requirements analysis and an analysis of design elements and techniques are required for the design of a communication network. Through the requirements analysis of the code and regulation documents such as NUREG/CR-6082, section 7.9 of NUREG 0800, IEEE Standard 7-4.3.2 and IEEE Standard 603, the extracted requirements can be used as a design basis and design concept for a detailed design of a communication network in the I and C system of an integral reactor. Design elements and techniques such as a physical topology, protocol, transmission media and interconnection devices should be considered for designing a communication network. Each design element and technique should be analyzed and evaluated as a portion of the integrated communication network design. In this report, the basic design requirements related to the design of the communication network are investigated by using the code and regulation documents, and an analysis of the design elements and techniques is performed. Based on these investigations and analyses, the overall architecture including the safety communication network and the non-safety communication network is proposed for an integral reactor

  19. Advanced GPR imaging of sedimentary features: integrated attribute analysis applied to sand dunes

    Science.gov (United States)

    Zhao, Wenke; Forte, Emanuele; Fontolan, Giorgio; Pipan, Michele

    2018-04-01

    We evaluate the applicability and the effectiveness of integrated GPR attribute analysis to image the internal sedimentary features of the Piscinas Dunes, SW Sardinia, Italy. The main objective is to explore the limits of GPR techniques to study sediment-body geometry and to provide a non-invasive high-resolution characterization of the different subsurface domains of dune architecture. For this purpose, we exploit the high-quality Piscinas data-set to extract and test different attributes of the GPR trace. Composite displays of multi-attributes related to amplitude, frequency, similarity and textural features are displayed with overlays and RGB mixed models. A multi-attribute comparative analysis is used to characterize different radar facies to better understand the characteristics of internal reflection patterns. The results demonstrate that the proposed integrated GPR attribute analysis can provide enhanced information about the spatial distribution of sediment bodies, allowing an enhanced and more constrained data interpretation.
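
Among the attributes mentioned, instantaneous amplitude (the reflection-strength envelope) is the simplest to compute: it is the magnitude of the analytic signal of each trace. A numpy-only sketch on a synthetic trace follows; the frequencies and times are illustrative, not from the Piscinas survey.

```python
import numpy as np

def envelope(trace):
    """Instantaneous-amplitude attribute: magnitude of the analytic
    signal, via an FFT implementation of the Hilbert transform
    (assumes an even-length, regularly sampled real trace)."""
    n = len(trace)
    spec = np.fft.fft(trace)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0
    return np.abs(np.fft.ifft(spec * h))

# Synthetic trace: a 100 MHz carrier under a Gaussian reflection package.
t = np.linspace(0.0, 50e-9, 512)                # 50 ns of two-way time
amp = np.exp(-(((t - 20e-9) / 8e-9) ** 2))      # "reflector" at ~20 ns
trace = amp * np.sin(2 * np.pi * 100e6 * t)

env = envelope(trace)
print(t[env.argmax()])  # the envelope peaks near the reflector time
```

Instantaneous frequency and similarity attributes are derived from the same analytic signal; RGB composites then map three such attributes to colour channels for joint display.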

  20. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  1. Environmental sustainable decision making – The need and obstacles for integration of LCA into decision analysis

    DEFF Research Database (Denmark)

    Dong, Yan; Miraglia, Simona; Manzo, Stefano

    2018-01-01

    Decision analysis is often used to help decision makers choose among alternatives, based on the expected utility associated with each alternative as a function of its consequences and potential impacts. Environmental impacts are not always among the prioritized concerns of traditional decision making ... systems, revealing potential problem shifting between life cycle stages. Through the integration with traditional risk based decision analysis, LCA may thus facilitate a better informed decision process. In this study we explore how environmental impacts are taken into account in different fields ... of interest for decision makers to identify the need, potential and obstacles for integrating LCA into conventional approaches to decision problems. Three application areas are used as examples: transportation planning, flood management, and food production and consumption. The analysis of these cases shows ...

  2. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    Science.gov (United States)

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.
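
The enrichment analysis such platforms offer is typically a one-sided hypergeometric (Fisher) test: how surprising is the overlap between a prioritized gene list and an annotation term? A stdlib-only sketch with hypothetical counts, not Lynx's actual implementation:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for a hypergeometric draw: n genes selected from a
    universe of N genes, K of which carry the annotation of interest."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical counts: 20 of 50 prioritized genes carry a GO term that
# annotates 400 of 20000 genes in the universe (expected overlap: ~1).
p = enrichment_p(N=20000, K=400, n=50, k=20)
print(p)  # vanishingly small: the term is strongly over-represented
```

In a real pipeline the p-values would be corrected for the number of terms tested (e.g. Benjamini-Hochberg) before ranking annotations.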

  3. Integrated design and performance analysis of the KO HCCR TBM for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Integrated analysis is performed with the conventional CFD code (ANSYS-CFX). • Overall pressure drop and coolant flow scheme are investigated. • Manifold design is being performed considering flow distribution. - Abstract: To develop tritium breeding technology for a Fusion Reactor, Korea has participated in the Test Blanket Module (TBM) program in ITER. The He Cooled Ceramic Reflector (HCCR) TBM consists of functional components such as First Wall (FW), Breeding Zone (BZ), Side Wall (SW), and Back Manifold (BM) and it was designed based on the separate analyses for each component in 2012. Based on the each component analysis model, the integrated model is prepared and thermal-hydraulic analysis for the HCCR TBM is performed in the present study. The coolant flow distribution from BM and SW to FW and BZ, and resulted structure temperatures are obtained with the integrated model. It is found that the non-uniform flow rate occurs at FW and BZ and it causes excess of the design limit (550 °C) at some region. Based on this integrated model, we will perform the design optimization for obtaining uniform flow distribution for satisfying the design requirements.

  4. Strategies for Integrated Analysis of Genetic, Epigenetic, and Gene Expression Variation in Cancer

    DEFF Research Database (Denmark)

    Thingholm, Louise B; Andersen, Lars; Makalic, Enes

    2016-01-01

    The development and progression of cancer, a collection of diseases with complex genetic architectures, is facilitated by the interplay of multiple etiological factors. This complexity challenges the traditional single-platform study design and calls for an integrated approach to data analysis ... to integration strategies used for analyzing genetic risk factors for cancer. We critically examine the ability of these strategies to handle the complexity of the human genome and also accommodate information about the biological and functional interactions between the elements that have been measured ...

  5. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    International Nuclear Information System (INIS)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L.

    2008-01-01

    Using the recently developed Autoregressive Distributed Lag (ARDL) bound testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3) 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of -0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)
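
The reported elasticities come from a constant-elasticity (log-log) demand specification, ln Q = a - 0.47 ln P + 0.36 ln Y, so they translate directly into proportional demand responses. A small numeric check with hypothetical price and income changes:

```python
# Constant-elasticity gasoline demand with the study's point estimates.
price_elasticity, income_elasticity = -0.47, 0.36

# Hypothetical scenario: price rises 10%, real income rises 3%.
dP, dY = 0.10, 0.03

# Exact log-linear response (not the first-order approximation):
growth = (1 + dP) ** price_elasticity * (1 + dY) ** income_elasticity - 1
print(f"{100 * growth:.2f}% change in demand")  # a modest decline
```

Because both elasticities are below one in absolute value, demand is inelastic: even a 10% price increase reduces consumption by only a few percent.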

  6. Chicken hepatic response to chronic heat stress using integrated transcriptome and metabolome analysis.

    Directory of Open Access Journals (Sweden)

    Sara F Jastrebski

    Full Text Available The liver plays a central role in metabolism and is important in maintaining homeostasis throughout the body. This study integrated transcriptomic and metabolomic data to understand how the liver responds under chronic heat stress. Chickens from a rapidly growing broiler line were heat stressed for 8 hours per day for one week and liver samples were collected at 28 days post hatch. Transcriptome analysis reveals changes in genes responsible for cell cycle regulation, DNA replication, and DNA repair along with immune function. Integrating the metabolome and transcriptome data highlighted multiple pathways affected by heat stress including glucose, amino acid, and lipid metabolism along with glutathione production and beta-oxidation.

  7. The demand for gasoline in South Africa. An empirical analysis using co-integration techniques

    Energy Technology Data Exchange (ETDEWEB)

    Akinboade, Oludele A.; Ziramba, Emmanuel; Kumo, Wolassa L. [Department of Economics, University of South Africa, P.O.Box 392, Pretoria 0003 (South Africa)

    2008-11-15

    Using the recently developed Autoregressive Distributed Lag (ARDL) bounds testing approach to co-integration, suggested by Pesaran et al. (Pesaran, M.H., Shin, Y., Smith, R.J. Bounds Testing Approaches to the Analysis of Level Relationships. Journal of Applied Econometrics 2001; 16(3) 289-326), we empirically analyzed the long-run relationship among the variables in the aggregate gasoline demand function over the period 1978-2005. Our study confirms the existence of a co-integrating relationship. The estimated price and income elasticities of -0.47 and 0.36 imply that gasoline demand in South Africa is price and income inelastic. (author)

  8. Strategies for Integrated Analysis of Genetic, Epigenetic, and Gene Expression Variation in Cancer: Addressing the Challenges

    DEFF Research Database (Denmark)

    Thingholm, Louise Bruun; Andersen, Lars; Makalic, Enes

    2016-01-01

    to integration strategies used for analyzing genetic risk factors for cancer. We critically examine the ability of these strategies to handle the complexity of the human genome and also accommodate information about the biological and functional interactions between the elements that have been measured......The development and progression of cancer, a collection of diseases with complex genetic architectures, is facilitated by the interplay of multiple etiological factors. This complexity challenges the traditional single-platform study design and calls for an integrated approach to data analysis...

  9. Application of Sensitivity Analysis in Design of Integrated Building Concepts

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik; Hesselholt, Allan Tind

    2007-01-01

    analysis makes it possible to identify the most important parameters in relation to building performance and to focus design and optimization of integrated building concepts on these fewer, but most important parameters. The sensitivity analyses will typically be performed at a reasonably early stage...... the design requirements and objectives. In the design of integrated building concepts it is beneficial to identify the most important design parameters in order to more efficiently develop alternative design solutions or more efficiently perform an optimization of the building performance. The sensitivity...
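
    The screening described in the abstract, ranking design parameters by their influence on building performance, can be sketched as a crude one-at-a-time (Morris-style) elementary-effects ranking. The building-performance model and parameter names below are hypothetical stand-ins, not taken from the paper:

```python
import random

def rank_parameters(model, spans, n=200):
    """Crude Morris-style screening: from random base points, perturb one
    parameter at a time and rank parameters by mean absolute elementary effect."""
    names = list(spans)
    effects = {k: [] for k in names}
    for _ in range(n):
        point = {k: random.uniform(*spans[k]) for k in names}
        y0 = model(point)
        for k in names:
            step = 0.05 * (spans[k][1] - spans[k][0])  # 5% of the design range
            moved = dict(point, **{k: point[k] + step})
            effects[k].append(abs(model(moved) - y0) / step)
    return sorted(names, key=lambda k: -sum(effects[k]) / n)

# Hypothetical annual-energy model: strongly driven by envelope U-value,
# weakly by window fraction and air-change rate.
model = lambda p: 120 * p["u_value"] + 4 * p["window_frac"] - 0.5 * p["ach"]
spans = {"u_value": (0.1, 0.4), "window_frac": (0.2, 0.6), "ach": (0.5, 2.0)}
print(rank_parameters(model, spans))  # most important parameter first
```

    For this linear surrogate the ranking is exact; on a real building simulation the same loop simply wraps the simulator call, which is why such screening is done at an early design stage where run counts matter.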

  10. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    International Nuclear Information System (INIS)

    Festa, G; Andreani, C; Pietropaolo, A; Grazzi, F; Scherillo, A; Barzagli, E; Sutton, L F; Bognetti, L; Bini, A; Schooneveld, E

    2013-01-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation on materials. The aim of this study is to show the potential of applying multiple, integrated neutron-based techniques to musical instruments. Such samples, in the broad scenario of cultural heritage, constitute an exciting research field and an interesting link between disciplines such as nuclear physics, metallurgy and acoustics. (paper)

  11. Analysis and design of substrate integrated waveguide using efficient 2D hybrid method

    CERN Document Server

    Wu, Xuan Hui

    2010-01-01

    Substrate integrated waveguide (SIW) is a new type of transmission line. It implements a waveguide on a piece of printed circuit board by emulating the side walls of the waveguide with two rows of metal posts. It inherits the merits both of the microstrip, for compact size and easy integration, and of the waveguide, for low radiation loss, and thus opens another door to designing efficient microwave circuits and antennas at low cost. This book presents a two-dimensional full-wave analysis method to investigate an SIW circuit composed of metal and dielectric posts. It combines the cylindrical

  12. Integrative Analysis of Gene Expression Data Including an Assessment of Pathway Enrichment for Predicting Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Pingzhao Hu

    2006-01-01

    Full Text Available Background: Microarray technology has been previously used to identify genes that are differentially expressed between tumour and normal samples in a single study, as well as in syntheses involving multiple studies. When integrating results from several Affymetrix microarray datasets, previous studies summarized probeset-level data, which may potentially lead to a loss of information available at the probe-level. In this paper, we present an approach for integrating results across studies while taking probe-level data into account. Additionally, we follow a new direction in the analysis of microarray expression data, namely to focus on the variation of expression phenotypes in predefined gene sets, such as pathways. This targeted approach can be helpful for revealing information that is not easily visible from the changes in the individual genes. Results: We used a recently developed method to integrate Affymetrix expression data across studies. The idea is based on a probe-level based test statistic developed for testing for differentially expressed genes in individual studies. We incorporated this test statistic into a classic random-effects model for integrating data across studies. Subsequently, we used a gene set enrichment test to evaluate the significance of enriched biological pathways in the differentially expressed genes identified from the integrative analysis. We compared statistical and biological significance of the prognostic gene expression signatures and pathways identified in the probe-level model (PLM with those in the probeset-level model (PSLM. Our integrative analysis of Affymetrix microarray data from 110 prostate cancer samples obtained from three studies reveals thousands of genes significantly correlated with tumour cell differentiation. The bioinformatics analysis, mapping these genes to the publicly available KEGG database, reveals evidence that tumour cell differentiation is significantly associated with many
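
    The abstract does not spell out which "classic random-effects model" pools the probe-level test statistics across studies; a standard choice for this kind of integration is the DerSimonian-Laird estimator, sketched here with made-up study values (the function and the three effect sizes are illustrative, not from the paper):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects
    model: estimate between-study variance tau^2 from Cochran's Q, then do
    an inverse-variance weighted average with tau^2 added to each variance."""
    k = len(effects)
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se

# Three hypothetical studies reporting a gene's standardized expression difference.
est, se = dersimonian_laird([0.2, 0.9, 0.5], [0.04, 0.04, 0.04])
print(round(est, 3), round(se, 3))  # heterogeneity inflates the pooled SE
```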

  13. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to its PDF and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. 
This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
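
    Steps 6-8 of the procedure above (assign PDFs, sample, run repeatedly, extract percentile bands) can be sketched generically. The surrogate "plant model" and parameter names below are hypothetical, and n_runs=59 is the classic Wilks sample size for a one-sided 95%/95% tolerance limit commonly used in this kind of analysis:

```python
import random

def uncertainty_band(model, param_pdfs, n_runs=59, percentile=0.95):
    """Monte Carlo propagation: sample each uncertain parameter from its PDF,
    run the model once per sample, and return an empirical upper percentile
    of the figure of merit (steps 6-8 of the abstract, in miniature)."""
    results = []
    for _ in range(n_runs):
        sample = {name: pdf() for name, pdf in param_pdfs.items()}
        results.append(model(sample))
    results.sort()
    return results[min(int(percentile * n_runs), n_runs - 1)]

# Hypothetical surrogate for peak cladding temperature (K) driven by two
# uncertain inputs; a real analysis would call the full code here instead.
random.seed(1)
pdfs = {"gap_conductance": lambda: random.uniform(0.8, 1.2),
        "decay_heat": lambda: random.normalvariate(1.0, 0.05)}
model = lambda p: 1000 + 150 * p["decay_heat"] / p["gap_conductance"]
print(round(uncertainty_band(model, pdfs), 1))  # upper uncertainty bound, K
```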

  14. Integration of TGS and CTEN assays using the CTENFIT analysis and databasing program

    International Nuclear Information System (INIS)

    Estep, R.

    2000-01-01

    The CTENFIT program, written for Windows 9x/NT in C++, performs databasing and analysis of combined thermal/epithermal neutron (CTEN) passive and active neutron assay data and integrates these with isotopics results and gamma-ray data from methods such as tomographic gamma scanning (TGS). The binary database is reflected in a companion Excel database that allows extensive customization via Visual Basic for Applications macros. Automated analysis options make the analysis of the data transparent to the assay system operator. Various record browsers and information displays simplify record-keeping tasks.

  15. Thermodynamic Analysis of a Woodchips Gasification Integrated with Solid Oxide Fuel Cell and Stirling Engine

    DEFF Research Database (Denmark)

    Rokni, Masoud

    2013-01-01

    An integrated gasification Solid Oxide Fuel Cell (SOFC) and Stirling engine system for combined heat and power applications is analysed. The target for electricity production is 120 kW. Woodchips are used as gasification feedstock to produce syngas, which is utilized to feed the SOFC stacks for electricity...... and suggested. Thermodynamic analysis shows that a thermal efficiency of 42.4% based on LHV (lower heating value) can be achieved. Different parameter studies are performed to analyse system behaviour under different conditions. The analyses show that increasing the fuel mass flow from the design point results......

  16. PIXE analysis of tree leaves as a possible comparative integral monitor of particulates in urban areas

    International Nuclear Information System (INIS)

    Zucchiati, A.; Annegarm, H.J.; Chisci, R.

    1988-01-01

    The possibility of obtaining integral comparative data for particulate distribution in urban areas from PIXE analysis of tree leaves is discussed in relation to the leaf gross anatomy, to the diffusion of selected tree species in such areas and to the implementation of the experimental techniques necessary to make PIXE analysis effective. Multielemental scans were performed on a small set of samples; the results are compared to PIXE analysis of typical urban aerosols. The validity of the method and the criteria for yearly relative comparisons of different areas are discussed.

  17. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  18. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during...... software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses...... a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  19. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    Science.gov (United States)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in SIMULINK MATLAB to evaluate the process response. Additionally, process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE) and Integral Square Error (ISE). The statistical analysis also identifies DS as the best tuning method, as it exhibits the highest process stability and capability.
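
    The two error indices used to compare the tuning rules are simple integrals of the control error e(t): ISE = ∫e² dt and ITAE = ∫t·|e| dt, with ITAE penalizing errors that persist late in the response. A minimal sketch of scoring two sampled step responses (the idealized first-order responses stand in for the paper's actual closed-loop simulations):

```python
import math

def step_response_errors(response, setpoint, dt):
    """Compute ISE and ITAE for a sampled closed-loop step response:
    ISE = sum(e^2 * dt), ITAE = sum(t * |e| * dt)."""
    ise = itae = 0.0
    for i, y in enumerate(response):
        e = setpoint - y
        t = i * dt
        ise += e * e * dt
        itae += t * abs(e) * dt
    return ise, itae

# Hypothetical responses from a "fast" and a "slow" tuning of the same loop;
# the faster tuning should score lower on both indices.
times = [i * 0.1 for i in range(200)]
fast = [1 - math.exp(-t / 1.0) for t in times]
slow = [1 - math.exp(-t / 3.0) for t in times]
ise_f, itae_f = step_response_errors(fast, 1.0, 0.1)
ise_s, itae_s = step_response_errors(slow, 1.0, 0.1)
print(ise_f < ise_s and itae_f < itae_s)  # faster tuning wins on both
```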

  20. The Integrated Microbial Genomes (IMG) System: An Expanding Comparative Analysis Resource

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2009-09-13

    The integrated microbial genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG contains both draft and complete microbial genomes integrated with other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through regular releases. Several companion IMG systems have been set up in order to serve domain specific needs, such as expert review of genome annotations. IMG is available at .