WorldWideScience

Sample records for wellbore integrity analysis

  1. Wellbore Integrity Network

    Energy Technology Data Exchange (ETDEWEB)

    Carey, James W. [Los Alamos National Laboratory; Bachu, Stefan [Alberta Innovates

    2012-06-21

    In this presentation, we review the current state of knowledge on wellbore integrity as developed in the IEA Greenhouse Gas Programme's Wellbore Integrity Network. Wells are one of the primary risks to the successful implementation of CO2 storage programs. Experimental studies show that wellbore materials react with CO2 (carbonation of cement and corrosion of steel) but the impact on zonal isolation is unclear. Field studies of wells in CO2-bearing fields show that CO2 does migrate external to casing. However, rates and amounts of CO2 have not been quantified. At the decade time scale, wellbore integrity is driven by construction quality and geomechanical processes. Over longer time scales (> 100 years), chemical processes (cement degradation and corrosion) become more important, but competing geomechanical processes may preserve wellbore integrity.

  2. Wellbore integrity analysis of a natural CO2 producer

    KAUST Repository

    Crow, Walter

    2010-03-01

    Long-term integrity of existing wells in a CO2-rich environment is essential for ensuring that geological sequestration of CO2 will be an effective technology for mitigating greenhouse gas-induced climate change. The potential for wellbore leakage depends in part on the quality of the original construction as well as geochemical and geomechanical stresses that occur over its life-cycle. Field data are essential for assessing the integrated effect of these factors and their impact on wellbore integrity, defined as the maintenance of isolation between subsurface intervals. In this report, we investigate a 30-year-old well from a natural CO2 production reservoir using a suite of downhole and laboratory tests to characterize isolation performance. These tests included mineralogical and hydrological characterization of 10 core samples of casing/cement/formation, wireline surveys to evaluate well conditions, fluid samples and an in situ permeability test. We find evidence for CO2 migration in the occurrence of carbonated cement and calculate that the effective permeability of an 11-ft interval of the wellbore barrier system was between 0.5 and 1 millidarcy. Despite these observations, we find that the amount of fluid migration along the wellbore was probably small because of several factors: the amount of carbonation decreased with distance from the reservoir, cement permeability was low (0.3-30 microdarcy), the cement-casing and cement-formation interfaces were tight, the casing was not corroded, fluid samples lacked CO2, and the pressure gradient between reservoir and caprock was maintained. We conclude that the barrier system has ultimately performed well over the last 3 decades. These results will be used as part of a broader effort to develop a long-term predictive simulation tool to assess wellbore integrity performance in CO2 storage sites. © 2009 Elsevier Ltd. All rights reserved.
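The effective-permeability figure reported above lends itself to a quick order-of-magnitude check via Darcy's law. The sketch below is illustrative only: the annular cross-section, pressure drop, and brine viscosity are assumptions for the example, not values from the study.

```python
# Hedged sketch: Darcy flow along a wellbore barrier interval.
# Geometry and fluid properties are illustrative assumptions.

MD_TO_M2 = 9.869233e-16  # 1 millidarcy in m^2

def darcy_flow_rate(k_md, area_m2, dp_pa, length_m, visc_pa_s):
    """Volumetric flow rate q = k*A*dP/(mu*L) (Darcy's law)."""
    k = k_md * MD_TO_M2
    return k * area_m2 * dp_pa / (visc_pa_s * length_m)

# Assumed: 0.01 m^2 annular area, 1 MPa drive across the ~11-ft
# (~3.35 m) interval, brine viscosity ~1e-3 Pa*s, k = 1 mD.
q = darcy_flow_rate(k_md=1.0, area_m2=0.01, dp_pa=1.0e6,
                    length_m=3.35, visc_pa_s=1.0e-3)
print(f"flow rate ~ {q:.2e} m^3/s")
```

At millidarcy-scale permeability the resulting flow is on the order of nanolitres per second times a thousand, consistent with the paper's conclusion that migration was probably small.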

  3. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
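The PEST calibration step described above, minimizing squared differences between simulated and measured quantities while regularization limits variance within a lithology, can be illustrated with a toy linear forward model standing in for MODFLOW. The sensitivity matrix and conductivity values below are synthetic, not from AnalyzeHOLE.

```python
import numpy as np

# Hedged sketch of Tikhonov-regularized least squares, the idea
# behind the PEST step in AnalyzeHOLE. Toy linear model, not MODFLOW.

rng = np.random.default_rng(0)
n_layers = 5
G = rng.uniform(0.5, 1.5, size=(8, n_layers))   # toy sensitivity matrix
k_true = np.array([1.0, 1.2, 5.0, 1.1, 0.9])    # "true" conductivities
obs = G @ k_true                                 # synthetic observations

# Regularization: penalize departure from a preferred uniform value,
# limiting within-lithology variance as the abstract describes.
alpha = 0.1
k_pref = np.full(n_layers, 1.5)
A = np.vstack([G, alpha * np.eye(n_layers)])
b = np.concatenate([obs, alpha * k_pref])
k_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(k_est, 2))
```

With light regularization the high-conductivity layer is still recovered; increasing `alpha` pulls the estimates toward the preferred homogeneous value, which is the trade-off PEST's regularization mode controls.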

  4. Problems in the wellbore integrity of a shale gas horizontal well and corresponding countermeasures

    Directory of Open Access Journals (Sweden)

    Zhonglan Tian

    2015-12-01

    In the Changning–Weiyuan national shale gas demonstration area, SW Sichuan Basin, wellbore integrity damage has occurred in some shale gas wells and directly affects the gas production rate of single horizontal wells. After a statistical analysis of wellbore integrity problems such as casing damage, casing running difficulty and cement sheath blow-by, a multi-factor coupling casing stress calculation and evaluation model was established. The influence mechanism of multi-factor coupling (temperature effect, casing bending and axial pressure) on casing damage was then studied. The shale slip mechanism and its relationship with casing shear failure were analyzed using the Mohr–Coulomb criterion. Inversion analysis was performed on the main controlling factors of casing friction using the developed casing hook load prediction and friction analysis software. Finally, based on the characteristics of shale gas horizontal wells, wellbore integrity control measures were proposed for the design and construction process so as to improve drilling quality (DQ). More specifically, the casing design calculation method and check standard for shale gas wells were modified; the well structure and full borehole trajectory design were optimized; drilling quality was improved; cement properties were optimized; and cement sealing integrity during the fracturing process was checked. These research findings are significant for the design and management of borehole integrity in future shale gas wells.
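The Mohr–Coulomb slip analysis the abstract applies to bedding shale reduces to a simple check: resolve the principal stresses onto the bedding plane and compare the shear stress against the plane's frictional strength. The sketch below uses the 2D Mohr-circle relations; the stress magnitudes and strength parameters are illustrative, not from the study.

```python
import math

# Hedged sketch: Mohr-Coulomb slip check on a weak bedding plane.
# All numbers are illustrative assumptions.

def resolve_stress(sigma1, sigma3, beta_deg):
    """Normal and shear stress on a plane whose normal makes angle
    beta with the sigma1 direction (2D Mohr circle relations)."""
    beta = math.radians(beta_deg)
    sn = 0.5*(sigma1 + sigma3) + 0.5*(sigma1 - sigma3)*math.cos(2*beta)
    tau = 0.5*(sigma1 - sigma3)*math.sin(2*beta)
    return sn, tau

def slips(sigma1, sigma3, beta_deg, cohesion, friction_deg):
    """True if shear stress exceeds the Mohr-Coulomb strength
    tau_max = c + sn * tan(phi)."""
    sn, tau = resolve_stress(sigma1, sigma3, beta_deg)
    return tau > cohesion + sn * math.tan(math.radians(friction_deg))

# A weak bedding plane (low cohesion/friction) slips under stresses
# that an intact-rock strength would comfortably carry.
print(slips(60.0, 30.0, 30.0, cohesion=0.5, friction_deg=10.0))  # bedding
print(slips(60.0, 30.0, 30.0, cohesion=20.0, friction_deg=35.0)) # intact
```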

  5. Systematic assessment of wellbore integrity for geologic carbon storage projects using regulatory and industry information

    Energy Technology Data Exchange (ETDEWEB)

    Moody, Mark [Battelle Memorial Institute, Columbus, OH (United States); Sminchak, J.R. [Battelle Memorial Institute, Columbus, OH (United States)

    2015-11-01

    Under this three-year project, the condition of legacy oil and gas wells in the Midwest United States was evaluated through analysis of well records, well plugging information, cement bond log (CBL) evaluation, sustained casing pressure (SCP) field testing, and analysis of hypothetical CO2 test areas to provide a realistic description of wellbore integrity factors. The research included a state-wide review of oil and gas well records for Ohio and Michigan, along with more detailed testing of wells in Ohio. Results showed that oil and gas wells are clustered within fields. Well records vary in quality, and there may be wells that have not been identified in records, but there are options for surveying unknown wells. Many of the deep saline formations being considered for CO2 storage have few wells that penetrate the storage zone or confining layers. The research suggests that a variety of well construction and plugging approaches have been used over time in the region. The project concluded that wellbore integrity is an important issue for CO2 storage applications in the Midwest United States. Realistic CO2 storage projects may cover an area in the subsurface with several hundred legacy oil and gas wells. However, closer inspection may often establish that most of the wells do not penetrate the confining layers or storage zone; therefore, addressing well integrity may be manageable. Field monitoring of SCP also indicated that tested wells provided zonal isolation of the reservoirs they were designed to isolate. Most of these wells appeared to exhibit gas pressure originating from intermediate zones. Based on these results, more flexibility in terms of cementing wells to surface, allowing well testing, and monitoring wells may aid operators in completing CO2 storage projects. Several useful products were developed under this project for examining wellbore integrity for CO2 storage applications including a

  6. Wellbore stability analysis and its application in the Fergana basin, central Asia

    Science.gov (United States)

    Chuanliang, Yan; Jingen, Deng; Baohua, Yu; Hailong, Liu; Fucheng, Deng; Zijian, Chen; Lianbo, Hu; Haiyan, Zhu; Qin, Han

    2014-02-01

    Wellbore instability is one of the major problems hampering drilling speed in the Fergana basin. Comprehensive analysis of the geological and engineering data in this area indicates that the Fergana basin is characterized by high in-situ stress and abundant natural fractures, especially in formations that are rich in bedding structure and contain several high-pressure systems. Complex incidents such as wellbore collapse, pipe sticking, well kick and lost circulation happen frequently. Tests and theoretical analysis reveal that wellbore instability in the Fergana basin is influenced by multiple interactive mechanisms dominated by the instability of the bedding shale. Selecting a proper drilling fluid density and improving the sealing characteristics of the applied drilling fluid are the keys to preventing wellbore instability in the Fergana basin. The mechanical mechanism of wellbore instability in the Fergana basin was analysed and a method to determine the proper drilling fluid density was proposed. The research results were successfully used to guide the drilling of the Jida-4 well; compared with the Jida-3 well, the drilling cycle of the Jida-4 well was reduced by 32%.
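Selecting a "proper drilling fluid density" as described above amounts to keeping the hydrostatic wellbore pressure inside a safe window between the collapse pressure and the fracture pressure at each depth. The sketch below converts that window into a mud-density range; the pressure values and depth are illustrative assumptions, not Fergana field data.

```python
# Hedged sketch: safe mud-weight window from collapse and fracture
# pressures. All input values are illustrative.

G = 9.81  # gravitational acceleration, m/s^2

def mud_pressure_mpa(density_kg_m3, depth_m):
    """Hydrostatic wellbore pressure (MPa) from mud density."""
    return density_kg_m3 * G * depth_m / 1e6

def density_window(collapse_mpa, fracture_mpa, depth_m):
    """Mud densities (kg/m^3) bounding the safe window at a depth:
    below `lo` the wall collapses, above `hi` the formation fractures."""
    lo = collapse_mpa * 1e6 / (G * depth_m)
    hi = fracture_mpa * 1e6 / (G * depth_m)
    return lo, hi

lo, hi = density_window(collapse_mpa=52.0, fracture_mpa=68.0, depth_m=3500.0)
print(f"safe mud density: {lo:.0f}-{hi:.0f} kg/m^3")
```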

  7. Wellbore Seal Repair Using Nanocomposite Materials

    Energy Technology Data Exchange (ETDEWEB)

    Stormont, John [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-08-31

    Nanocomposite wellbore repair materials have been developed, tested, and modeled through an integrated program of laboratory testing and numerical modeling. Numerous polymer-cement nanocomposites were synthesized as candidate wellbore repair materials using various combinations of base polymers and nanoparticles. Based on tests of bond strength to steel and cement, ductility, stability, flowability, and penetrability into openings of 50 microns and less, we identified Novolac epoxy reinforced with multi-walled carbon nanotubes and/or alumina nanoparticles as a superior wellbore seal material compared to conventional microfine cements. A system was developed for testing damaged and repaired wellbore specimens comprised of a cement sheath cast on a steel casing. The system allows independent application of confining pressures and casing pressures while gas flow is measured through the specimens along the wellbore axis. Repair with the nanocomposite epoxy base material was successful in dramatically reducing the flow through flaws of various sizes and types, restoring the specimen to a condition comparable to intact. In contrast, repair of damaged specimens with microfine cement was less effective, and the repair degraded with the application of stress. Post-test observations confirm the complete penetration and sealing of flaws using the nanocomposite epoxy base material. A number of modeling efforts have supported the material development and testing. We modeled the steel-repair material interface behavior in detail during slant shear tests, which we used to characterize the bond strength of candidate repair materials. A numerical model of the laboratory testing of damaged wellbore specimens was developed. This investigation found that microannulus permeability can satisfactorily be described by a joint model. Finally, a wellbore model has been developed that can be used to evaluate the response of the wellbore system (casing, cement, and microannulus

  8. Risks to Drinking Water from Oil and Gas Wellbore Construction and Integrity: Case Studies and Lessons Learned

    Science.gov (United States)

    This presentation examines various published reports from two drinking water contamination cases, and discuss the potential roles of wellbore construction and integrity and hydraulic fracturing in the resultant drinking water contamination.

  9. Geomechanical Modeling of CO2 Injection Site to Predict Wellbore Stresses and Strains for the Design of Wellbore Seal Repair Materials

    Science.gov (United States)

    Sobolik, S. R.; Gomez, S. P.; Matteo, E. N.; Stormont, J.

    2015-12-01

    This paper will present the results of large-scale three-dimensional calculations simulating the hydrological-mechanical behavior of a CO2 injection reservoir and the resulting effects on wellbore casings and sealant repair materials. A critical aspect of designing effective wellbore seal repair materials is predicting thermo-mechanical perturbations in local stress that can compromise seal integrity. The DOE-NETL project "Wellbore Seal Repair Using Nanocomposite Materials" is interested in the stress-strain history of abandoned wells, as well as changes in local pressure, stress, and temperature conditions that accompany carbon dioxide injection or brine extraction. Two distinct computational models comprise the current modeling effort. The first is a field-scale model that uses the stratigraphy, material properties, and injection history from a pilot CO2 injection operation in Cranfield, MS to develop a stress-strain history for wellbore locations from 100 to 400 meters from an injection well. The results from the field-scale model are used as input to a more detailed model of a wellbore casing. The 3D wellbore model examines the impacts of various loading scenarios on a casing structure. This model has been developed in conjunction with bench-top experiments of an integrated seal system in an idealized scaled wellbore mock-up being used to test candidate seal repair materials. The results from these models will be used to estimate the mechanical properties needed for a successful repair material. This material is based upon work supported by the US Department of Energy (DOE) National Energy Technology Laboratory (NETL) under Grant Number DE-FE0009562. This project is managed and administered by the Storage Division of the NETL and funded by DOE/NETL and cost-sharing partners.
This work was funded in part by the Center for Frontiers of Subsurface Energy Security, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science

  10. Wellbore Microannulus Characterization and Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Matteo, Edward N; Sobolik, Steven R.; Stormont, John C.; Taha, Mahmoud Reda; Gomez, Steven Paul

    2016-05-01

    Subsurface geologic formations used for extracting resources such as oil and gas can subsequently be used as a storage reservoir for the common greenhouse gas CO2, a concept known as Carbon Capture and Storage (CCS). Pre-existing wellbores penetrate the reservoirs where supercritical CO2 is to be injected. These wellbores can potentially be a pathway for contamination if CO2 leaks through wellbore flaws to an overlying aquifer or the atmosphere. Characterizing wellbore integrity and providing zonal isolation by repairing these wellbore flaws is of critical importance to the long-term isolation of CO2 and success of CCS. This research aims to characterize the microannulus region of the cement sheath-steel casing interface in terms of its compressibility and permeability. A mock-up of a wellbore system was used for lab-scale testing. Specimens, consisting of a cement sheath cast on a steel casing with microannuli, were subjected to confining pressures and casing pressures in a pressure vessel that allows simultaneous measurement of gas flow along the axis of the specimen. The flow was interpreted as the hydraulic aperture of the microannuli. Numerical models are used to analyze stress and displacement conditions along the casing-cement interface. These numerical results provide good agreement with closed-form elastic solutions. Numerical models incorporating flaws of varying dimensions along the casing-cement interface were then developed to describe the microannulus region. A joint model is used to describe the hydraulic aperture of the microannulus region, whose mechanical stiffness is altered in response to the imposed stress state across the joint interface. The aperture-stress behavior is based upon laboratory measurements of hydraulic aperture as a function of imposed stress conditions. This investigation found that microannulus permeability can satisfactorily be described by a joint model and that the constitutive
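The joint model described above summarizes microannulus flow with a hydraulic aperture whose value depends on the imposed stress. A common way to express this, sketched below with purely illustrative numbers (the stiffness, initial aperture, and flow geometry are assumptions, not the report's measured values), is a linearized joint-closure law combined with the parallel-plate cubic law, in which transmissivity scales with the cube of the aperture.

```python
# Hedged sketch: stress-dependent hydraulic aperture + cubic law.
# All parameter values are illustrative assumptions.

def cubic_law_flow(aperture_m, width_m, dp_pa, length_m, visc_pa_s):
    """Volumetric flow through a parallel-plate gap:
    q = w * b^3 * dP / (12 * mu * L)  (the cubic law)."""
    return width_m * aperture_m**3 * dp_pa / (12.0 * visc_pa_s * length_m)

def aperture_under_stress(b0_m, stress_pa, stiffness_pa_per_m):
    """Simple joint closure: aperture shrinks linearly with normal
    stress down to a small residual (a linearized joint model)."""
    b_res = 1e-6
    return max(b_res, b0_m - stress_pa / stiffness_pa_per_m)

# Assumed: 50 um initial aperture, 5 MPa normal stress, gas viscosity.
b = aperture_under_stress(b0_m=50e-6, stress_pa=5e6, stiffness_pa_per_m=2e11)
q = cubic_law_flow(b, width_m=0.5, dp_pa=1e5, length_m=1.0, visc_pa_s=1.8e-5)
print(f"aperture {b*1e6:.0f} um, gas flow {q:.2e} m^3/s")
```

Because flow scales as the aperture cubed, halving the aperture cuts the flow by a factor of eight, which is why the measured aperture-stress behavior dominates the permeability of the microannulus region.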

  11. Time-Lapse Measurement of Wellbore Integrity

    Science.gov (United States)

    Duguid, A.

    2017-12-01

    estimate of the cement isolating capacity. Cased-hole sidewall cores in the steel and fiberglass casing sections allowed analysis of bulk cement and the cement at the casing- and formation-interface. This presentation will cover how time-lapse logging was conducted, how the results may be applicable to other wells, and how monitoring well design may affect wellbore integrity.

  12. Shale-Gas Experience as an Analog for Potential Wellbore Integrity Issues in CO2 Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Carey, James W. [Los Alamos National Laboratory; Simpson, Wendy S. [Los Alamos National Laboratory; Ziock, Hans-Joachim [Los Alamos National Laboratory

    2011-01-01

    Shale-gas development in Pennsylvania since 2003 has resulted in about 19 documented cases of methane migration from the deep subsurface (~7,000 ft) to drinking water aquifers, soils, domestic water wells, and buildings, including one explosion. In all documented cases, the methane leakage was due to inadequate wellbore integrity, possibly aggravated by hydraulic fracturing. The leakage of methane is instructive on the potential for CO2 leakage from sequestration operations. Although there are important differences between the two systems, both involve migrating, buoyant gas with wells being a primary leakage pathway. The shale-gas experience demonstrates that gas migration from faulty wells can be rapid and can have significant impacts on water quality and human health and safety. Approximately 1.4% of the 2,200 wells drilled into Pennsylvania's Marcellus Formation for shale gas have been implicated in methane leakage. These have resulted in damage to over 30 domestic water supplies and have required significant remediation via well repair and homeowner compensation. The majority of the wellbore integrity problems are a result of over-pressurization of the wells, meaning that high-pressure gas has migrated into an improperly protected wellbore annulus. The pressurized gas leaks from the wellbore into the shallow subsurface, contaminating drinking water or entering structures. The effects are localized to a few thousand feet, or perhaps two to three miles. The degree of mixing between the drinking water and methane is sufficient that significant chemical impacts are created in terms of elevated Fe and Mn and the formation of black precipitates (metal sulfides), as well as effervescence in tap water. Thus it appears likely that leaking CO2 could also result in deteriorated water quality by a similar mixing process. The problems in Pennsylvania highlight the critical importance of obtaining background data on water quality as well as on problems associated with

  13. WELLBORE INSTABILITY: CAUSES AND CONSEQUENCES

    Directory of Open Access Journals (Sweden)

    Borivoje Pašić

    2007-12-01

    Wellbore instability is one of the main problems engineers encounter during drilling. The causes of wellbore instability are often classified as either mechanical (for example, failure of the rock around the hole because of high stresses, low rock strength, or inappropriate drilling practice) or chemical effects, which arise from damaging interaction between the rock (generally shale) and the drilling fluid. Field instances of instability are often the result of a combination of both chemical and mechanical effects. This problem can cause serious complications in the well and in some cases can lead to expensive operational problems. The increasing demand for wellbore stability analyses during the planning stage of a field arises from economic considerations and the increasing use of deviated, extended-reach and horizontal wells. This paper presents the causes, indicators and diagnosis of wellbore instability as well as a wellbore stress model.
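Wellbore stress models of the kind the paper presents typically start from the classical Kirsch solution for an elastic medium around a circular hole. The sketch below evaluates the effective hoop stress at the wellbore wall (isotropic elasticity, no pore-pressure or thermal terms); the stress magnitudes are illustrative assumptions.

```python
import math

# Hedged sketch: Kirsch hoop stress at the wall of a vertical wellbore.
# Isotropic elasticity, no pore pressure; values are illustrative.

def kirsch_wall_hoop_stress(sh_max, sh_min, p_well, theta_deg):
    """Hoop stress at the wellbore wall (r = a), theta measured from
    the direction of the maximum horizontal stress sh_max:
    sigma_theta = (sH + sh) - 2*(sH - sh)*cos(2*theta) - p_well."""
    t = math.radians(theta_deg)
    return sh_max + sh_min - 2.0*(sh_max - sh_min)*math.cos(2.0*t) - p_well

# Hoop stress peaks at theta = 90 deg (azimuth of sh_min), where
# compressive breakouts form, and is smallest at theta = 0, where
# tensile (drilling-induced) fractures can initiate.
s90 = kirsch_wall_hoop_stress(60.0, 40.0, 30.0, 90.0)  # MPa
s0 = kirsch_wall_hoop_stress(60.0, 40.0, 30.0, 0.0)
print(s90, s0)
```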

  14. Heating production fluids in a wellbore

    Science.gov (United States)

    Orrego, Yamila; Jankowski, Todd A.

    2016-07-12

    A method for heating a production fluid in a wellbore. The method can include heating, using a packer fluid, a working fluid flowing through a first medium disposed in a first section of the wellbore, where the first medium transfers heat from the packer fluid to the working fluid. The method can also include circulating the working fluid into a second section of the wellbore through a second medium, where the second medium transfers heat from the working fluid to the production fluid. The method can further include returning the working fluid to the first section of the wellbore through the first medium.

  15. Steam injection for heavy oil recovery: Modeling of wellbore heat efficiency and analysis of steam injection performance

    International Nuclear Information System (INIS)

    Gu, Hao; Cheng, Linsong; Huang, Shijun; Li, Bokai; Shen, Fei; Fang, Wenchao; Hu, Changhao

    2015-01-01

    Highlights:
    • A comprehensive mathematical model was established to estimate the wellbore heat efficiency of steam injection wells.
    • A simplified approach to predicting steam pressure in wellbores was proposed.
    • High wellhead injection rate and wellhead steam quality can improve wellbore heat efficiency.
    • High wellbore heat efficiency does not necessarily mean good heavy oil recovery performance.
    • Using excellent insulation materials is a good way to save water and fuel.

    Abstract: The aims of this work are to present a comprehensive mathematical model for estimating wellbore heat efficiency and to analyze the performance of steam injection for heavy oil recovery. In this paper, we first introduce the steam injection process briefly. Secondly, a simplified approach to predicting steam pressure in wellbores is presented and a complete expression for steam quality is derived. More importantly, both direct and indirect methods are adopted to determine the wellbore heat efficiency. The mathematical model is then solved using an iterative technique. After the model is validated with measured field data, we study the effects of wellhead injection rate and wellhead steam quality on steam injection performance reflected in wellbores. Next, taking cyclic steam stimulation as an example, we analyze steam injection performance reflected in reservoirs with a numerical reservoir simulation method. Finally, the significant role of improving wellbore heat efficiency in saving water and fuel is discussed in detail. The results indicate that wellbore heat efficiency can be improved by enhancing wellhead injection rate or steam quality. However, high wellbore heat efficiency does not necessarily mean satisfactory steam injection performance in reservoirs or good heavy oil recovery performance. Moreover, the paper shows that using excellent insulation materials is a good way to save water and fuel due to the enhancement of wellbore heat efficiency
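The "direct method" notion of wellbore heat efficiency above can be sketched as the fraction of injected heat that actually reaches the bottomhole, computed from wet-steam enthalpies at the wellhead and bottomhole. The enthalpy values below are rounded saturated-steam figures at roughly 3 MPa, and the quality values and reference enthalpy are illustrative assumptions, not the paper's correlations.

```python
# Hedged sketch: wellbore heat efficiency from wet-steam enthalpies.
# Steam-table numbers are rounded textbook values; quality values
# and the reference enthalpy are illustrative assumptions.

def steam_enthalpy(quality, h_liquid, h_latent):
    """Specific enthalpy of wet steam: h = h_f + x * h_fg (kJ/kg)."""
    return h_liquid + quality * h_latent

def wellbore_heat_efficiency(x_wellhead, x_bottomhole,
                             h_liquid, h_latent, h_ref):
    """Fraction of usable injected heat reaching the bottomhole,
    referenced to the enthalpy h_ref at reservoir temperature."""
    h_in = steam_enthalpy(x_wellhead, h_liquid, h_latent)
    h_out = steam_enthalpy(x_bottomhole, h_liquid, h_latent)
    return (h_out - h_ref) / (h_in - h_ref)

# Saturated steam at ~3 MPa: h_f ~ 1008 kJ/kg, h_fg ~ 1795 kJ/kg.
# Quality drops from 0.80 at the wellhead to 0.65 at the bottomhole.
eta = wellbore_heat_efficiency(x_wellhead=0.8, x_bottomhole=0.65,
                               h_liquid=1008.0, h_latent=1795.0,
                               h_ref=200.0)
print(f"heat efficiency ~ {eta:.2f}")
```

This also shows the abstract's caveat: a well can deliver heat efficiently (high eta) yet still recover oil poorly if the steam that arrives is poorly distributed in the reservoir.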

  16. Final Scientific/Technical Report for "Nanite" for Better Well-Bore Integrity and Zonal Isolation

    Energy Technology Data Exchange (ETDEWEB)

    Veedu, Vinod [Oceanit Laboratories, Inc., Honolulu, HI (United States); Hadmack, Michael [Oceanit Laboratories, Inc., Honolulu, HI (United States); Pollock, Jacob [Oceanit Laboratories, Inc., Honolulu, HI (United States); Pernambuco-Wise, Paul [Oceanit Laboratories, Inc., Honolulu, HI (United States); Ah Yo, Derek [Oceanit Laboratories, Inc., Honolulu, HI (United States)

    2017-05-30

    Nanite™ is a cementitious material that contains a proprietary formulation of functionalized nanomaterial additive to transform conventional cement into a smart material responsive to pressure (or stress), temperature, and any intrinsic changes in composition. This project has identified optimal sensing modalities of smart well cement and demonstrated how real-time sensing of Nanite™ can improve long-term wellbore integrity and zonal isolation in shale gas and applicable oil and gas operations. Oceanit has explored Nanite’s electrical sensing properties in depth and has advanced the technology from laboratory proof-of-concept to sub-scale testing in preparation for field trials.

  17. Numerical analysis of wellbore instability in gas hydrate formation during deep-water drilling

    Science.gov (United States)

    Zhang, Huaiwen; Cheng, Yuanfang; Li, Qingchao; Yan, Chuanliang; Han, Xiuting

    2018-02-01

    Gas hydrate formations may be encountered during deep-water drilling because of the large amount and wide distribution of gas hydrates under the shallow seabed of the South China Sea. Hydrates are extremely sensitive to temperature and pressure changes, and drilling through a gas hydrate formation may cause dissociation of hydrates, accompanied by changes in wellbore temperatures, pore pressures, and stress states, thereby leading to wellbore plastic yield and instability. Considering the coupled effects of drilling fluid seepage into the hydrate formation, heat conduction between drilling fluid and formation, hydrate dissociation, and transformation of the formation framework, this study established a multi-field coupled mathematical model of the wellbore in the hydrate formation. The influences of drilling fluid temperature, density, and soaking time on the instability of the hydrate formation were then calculated and analyzed. Results show that the greater the temperature difference between the drilling fluid and the hydrate formation, the faster the hydrate dissociates, the wider the plastic dissociation range, and the greater the failure width. When the temperature difference exceeds 7°C, the maximum plastic deformation around the wellbore is more than 10%, oriented along the direction of the minimum horizontal in-situ stress and associated with instability and damage of the surrounding rock. Hydrate dissociation is insensitive to drilling fluid density, implying that changing the density of the drilling fluid has a minimal effect on dissociation. Drilling fluids absorbed into the hydrate formation cause fast dissociation at the initial stage. As time elapses, hydrate dissociation slows down, but the risk of wellbore instability is aggravated by the prolonged exposure to drilling fluids. For the sake of the stability of the wellbore in deep

  18. In-situ Mechanical Manipulation of Wellbore Cements as a Solution to Leaky Wells

    Science.gov (United States)

    Kupresan, D.; Radonjic, M.; Heathman, J.

    2013-12-01

    Wellbore cement provides casing support, zonal isolation, and casing protection from corrosive fluids, all of which are essential for wellbore integrity. Cement can undergo one or more forms of failure, such as debonding at the cement/formation and cement/casing interfaces, fracturing, and defects within the cement matrix. Failures and defects within cement ultimately lead to inter-zonal fluid migration and premature well abandonment. There are over 27,000 abandoned oil and gas wells in the Gulf of Mexico alone (some dating from the late 1940s) with no gas leakage monitoring. Cement degradation linked with carbon sequestration can potentially lead to contamination of fresh water aquifers with CO2. Gas leaks are particularly observed in deviated wells used for hydraulic fracturing (60% leakage rate as they age), as high-pressure fracturing increases the potential for migration pathways. The experimental method utilized in this study enables the formation of impermeable seals at the interfaces present in a wellbore by mechanically manipulating wellbore cement. Preliminary measurements obtained in bench-scale experiments demonstrate that impermeable cement/formation and cement/casing interfaces can be obtained. In post-modified cement, nitrogen gas flow-through experiments showed complete zonal isolation and no permeability in samples with a pre-engineered microannulus. Material characterization experiments on modified cement revealed altered microstructural properties as well as changes in mineralogical composition. Calcium-silicate-hydrate (CSH), the dominant mineral in hydrated cement and the source of its low permeability, was modified as a result of cement pore water displacement, resulting in denser structures. Calcium hydroxide (CH), which is associated with low resistance of cement to acidic fluids and is therefore detrimental in most wellbore cements, was almost completely displaced and/or integrated into CSH as a result of

  19. THE EFFECT OF WELL-BORE REVERSE FLOW OF FLUID ON ...

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

    Mar 1, 1980 ... ABSTRACT. Well-bore storage may dominate the bottom-hole pressure profile of ... Type-curve matching is however only accurate when the storage factor ... numerical integration technique ... existence of a measure of well-

  20. Reaction-driven casing expansion : potential for wellbore leakage mitigation

    NARCIS (Netherlands)

    Wolterbeek, Timotheus K. T.; van Noort, Reinier; Spiers, Christopher J.

    It is generally challenging to predict the post-abandonment behaviour and integrity of wellbores. Leakage is, moreover, difficult to mitigate, particularly between the steel casing and outer cement sheath. Radially expanding the casing with some form of internal plug, thereby closing annular voids

  1. Wellbore stability in shales considering chemo-poroelastic effects

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Ewerton M.P.; Pastor, Jorge A.S.C.; Fontoura, Sergio A.B.; Rabe, Claudio [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Civil. Grupo de Tecnologia e Engenharia de Petroleo

    2004-07-01

    Undercompaction and low geothermal gradients are characteristic of deep-water settings. Both generate considerable thicknesses of smectite-rich shales. These rocks are the major source of wellbore stability problems because they are susceptible to adverse physico-chemical reactions when in contact with inadequate drilling fluids. Because shales are low-permeability rocks, diffusion processes dominate the changes in pore pressure around the wellbore. Diffusion of fluids, ions and temperature occurs in shales during drilling and demands fully coupled modelling that takes these factors into account. Despite the importance of temperature, in this paper wellbore stability in shales is analyzed through a model that considers only the coupling between poroelastic and physico-chemical effects. The coupled equations are solved analytically and have been implemented in a computational simulator with a user-friendly interface. Time-dependent simulations of wellbore stability in shales are presented for a typical deep-water scenario. The results show that physico-chemical effects change the pore pressure around the wellbore and have a high impact on wellbore stability. (author)

  2. Geomechanical analysis to predict the oil leak at the wellbores in Big Hill Strategic Petroleum Reserve

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon

    2014-02-01

    Oil leaks were found in the wellbores of Caverns 105 and 109 at the Big Hill Strategic Petroleum Reserve site. According to field observations, two instances of casing damage occurred at the depth of the interbed between the caprock bottom and the salt top. A three-dimensional finite element model, which contains wellbore element blocks and allows each cavern to be configured individually, was constructed to investigate the wellbore damage mechanism. The model also contains element blocks representing the interfaces between lithologies and a shear zone, so that interbed behavior is captured realistically. The damaged casing segments result from vertical and horizontal movements of the interbed between the caprock and the salt dome. The salt top subsides because the volume of the caverns below it decreases with time due to salt creep closure, while the caprock subsides at a slower rate because it is thick and stiff. This differential movement deforms the well; the deformed wellbore may eventually fail, producing an oil leak. A possible leak date for each well is determined using an equivalent plastic strain failure criterion, and a well grading system for a remediation plan is developed based on the predicted leak dates.
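    The leak-date prediction above hinges on the equivalent plastic strain (EQPS) failure criterion. A minimal sketch of how EQPS is accumulated from plastic strain increments and compared to a threshold; the increment values and the threshold below are hypothetical, not the report's calibrated numbers:

    ```python
    import numpy as np

    def equivalent_plastic_strain(increments):
        """Accumulate equivalent plastic strain from a sequence of plastic
        strain-increment tensors (3x3 arrays), using the von Mises measure:
        d_eqps = sqrt(2/3 * dep : dep)."""
        eqps = 0.0
        for dep in increments:
            eqps += np.sqrt(2.0 / 3.0 * np.tensordot(dep, dep))
        return eqps

    # Hypothetical failure threshold and volume-preserving increments
    EQPS_FAIL = 0.02
    increments = [np.diag([0.004, -0.002, -0.002])] * 5
    eqps = equivalent_plastic_strain(increments)
    failed = eqps >= EQPS_FAIL
    ```

    In the report's workflow the time step at which the accumulated EQPS first reaches the threshold gives the predicted leak date for each wellbore.
    
    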

  3. Drilling subsurface wellbores with cutting structures

    Science.gov (United States)

    Mansure, Arthur James; Guimerans, Rosalvina Ramona

    2010-11-30

    A system for forming a wellbore includes a drill tubular. A drill bit is coupled to the drill tubular. One or more cutting structures are coupled to the drill tubular above the drill bit. The cutting structures remove at least a portion of formation that extends into the wellbore formed by the drill bit.

  4. Self-healing polymer cement composites for geothermal wellbore applications

    Science.gov (United States)

    Rod, K. A.; Fernandez, C.; Childers, I.; Koech, P.; Um, W.; Roosendaal, T.; Nguyen, M.; Huerta, N. J.; Chun, J.; Glezakou, V. A.

    2017-12-01

    Cement is vital for controlling leaks from wellbores employed in oil, gas, and geothermal operations by sealing the annulus between the wellbore casing and the geologic formation. Wellbore cement failure due to physical and chemical stresses is common and can result in significant environmental consequences and ultimately significant financial costs due to remediation efforts. To date, numerous alternative cement blends have been proposed for the oil and gas industry; most possess poor mechanical properties or are not designed to work in high-temperature environments. This research investigates novel polymer-cement composites that can function at most geothermal temperatures. The thermal stability and mechanical strength of the composite are attributed to chemical interactions between the polymer and the cement matrix, including covalent bonds, hydrogen bonding, and van der Waals interactions. It has been demonstrated that the bonding between cement and casing is more predictable when polymer is added to the cement, and that adhesion breaks caused by stresses such as thermal shock heal more readily. Fractures have also been healed, effectively reducing permeability, at apertures up to 0.3-0.5 mm, two orders of magnitude larger than typical wellbore fractures. Additionally, tomography was used to determine the internal structure of the cement-polymer composite; imaging reveals that the polymer fills fractures in the cement and between the cement and casing. By plugging fractures that occur in wellbore cement and reducing fracture permeability, both the environmental safety and the economics of subsurface operations will be improved for geothermal energy and oil and gas production.

  5. Cementing a wellbore using cementing material encapsulated in a shell

    Energy Technology Data Exchange (ETDEWEB)

    Aines, Roger D.; Bourcier, William L.; Duoss, Eric B.; Spadaccini, Christopher M.; Cowan, Kenneth Michael

    2016-08-16

    A system for cementing a wellbore penetrating an earth formation into which a pipe extends. Cement material is positioned in the space between the wellbore and the pipe by circulating capsules containing the cement material through the pipe into that space. The capsules consist of the cementing material encapsulated in a shell. The capsules are added to a fluid, and the fluid with capsules is circulated through the pipe into the space between the wellbore and the pipe. The shell is breached once the capsules containing the cementing material are in position in the space between the wellbore and the pipe.

  6. Cementing a wellbore using cementing material encapsulated in a shell

    Energy Technology Data Exchange (ETDEWEB)

    Aines, Roger D.; Bourcier, William L.; Duoss, Eric B.; Floyd, III, William C.; Spadaccini, Christopher M.; Vericella, John J.; Cowan, Kenneth Michael

    2017-03-14

    A system for cementing a wellbore penetrating an earth formation into which a pipe extends. Cement material is positioned in the space between the wellbore and the pipe by circulating capsules containing the cement material through the pipe into that space. The capsules consist of the cementing material encapsulated in a shell. The capsules are added to a fluid, and the fluid with capsules is circulated through the pipe into the space between the wellbore and the pipe. The shell is breached once the capsules containing the cementing material are in position in the space between the wellbore and the pipe.

  7. Flexible cement improves wellbore integrity for steam assisted gravity drainage (SAGD) wells

    Energy Technology Data Exchange (ETDEWEB)

    DeBruijn, G.; Whitton, S.; Redekopp, D. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Schlumberger Canada Ltd., Calgary, AB (Canada); Siso, C. [ConocoPhillips Canada Resources Corp., Calgary, AB (Canada); Reinheimer, D. [Schlumberger Canada Ltd., Calgary, AB (Canada)

    2008-10-15

    Cement sheath integrity is an important factor in ensuring the zonal isolation of wells. Significant stresses are placed on the cement sheaths of wells during steam assisted gravity drainage (SAGD), as expansion forces from the heating of the well are transferred to the cement sheath, placing a tensile load on the cement at the sheath's outer edge. In this study, a computerized simulation was conducted to examine stresses in a novel flexible cement sheath system during a SAGD heat-up cycle. Wellbore temperature was increased from 10 degrees C to 250 degrees C over a period of 720 minutes, and pressure was increased from 0 MPa to 5 MPa. The finite element model was used to predict microannulus formation, cement failure in compression, and cement failure in tension. A sensitivity analysis was used to estimate the effect of different parameters, including the Young's modulus of the shale. Results of the study showed that temperature and pressure dynamics have a significant impact on stresses in the cement sheath: an extended heat-up period reduced stresses on the sheath, as did lower operating pressures. It was concluded that pressure and temperature increases should be extended over as long a period as possible in order to reduce stresses, and that a flexible cement system with a low Young's modulus is suitable for SAGD wells. 8 refs., 2 tabs., 6 figs.
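    The heat-up cycle in this abstract is a linear ramp (10 to 250 degrees C over 720 minutes), and the load on the cement comes from the restrained thermal expansion of the casing. A minimal sketch of both pieces; the steel expansion coefficient is an assumed textbook value, not a parameter from the paper:

    ```python
    def ramp(v0, v1, t, t_total):
        """Linear ramp schedule, clamped to [v0, v1] (e.g. 10 -> 250 C over 720 min)."""
        frac = min(max(t / t_total, 0.0), 1.0)
        return v0 + (v1 - v0) * frac

    def casing_thermal_hoop_strain(alpha_steel, dT):
        """Unrestrained thermal hoop strain of the casing, alpha * dT; when the
        casing is restrained by the cement sheath, this strain is what loads
        the cement in tension at its outer edge."""
        return alpha_steel * dT

    T_mid = ramp(10.0, 250.0, t=360.0, t_total=720.0)          # mid-ramp temperature
    strain = casing_thermal_hoop_strain(1.2e-5, T_mid - 10.0)  # assumed alpha for steel
    ```

    Slowing the ramp does not change the final strain, but it gives the sheath time to relax stress, which is consistent with the study's conclusion that extended heat-up periods reduce sheath stresses.
    
    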

  8. A coupled conductive-convective thermo-poroelastic solution and implications for wellbore stability

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yarlong [Petro-Geotech Inc., Suite no.300, 840-6th Avenue, S.W., Calgary, AB (Canada) T2P 3E5; Dusseault, Maurice B. [Porous Media Research Institute, Department of Earth Sciences, University of Waterloo, Waterloo, ON (Canada) N2L 3G1

    2003-06-01

    Steam injection is widely used in heavy oil reservoirs to enhance oil recovery; elevated temperatures increase fluid mobility in several ways, but can also generate damage through shearing, crushing of weak grains, and casing impairment by shear, collapse, or buckling. Disposal of cold produced water by injection can generate thermally induced extensional fracturing, increasing the effective wellbore radius. Drilling with long open-hole sections can lead to rock temperature changes as large as 30-40 C at the casing shoe, through mud heating at depth and upward mud circulation, dramatically impacting wellbore stability. Clearly, thermal stress analysis of open and cased boreholes is of primary interest for drilling and completion planning, as bottom-hole temperature changes can have as large an impact as bottom-hole pressure changes. Local wellbore stresses are the sum of far-field, pore-pressure-induced, and thermally induced stresses; they may be highly inhomogeneous because of differing rock properties and heat transport processes. These stresses, combined with thermal weakening and pore pressure changes, may lead to phenomena such as formation damage, sand production, shale shrinkage, and various modes of instability (shearing, spalling, fracturing, etc.). Previous studies of thermally induced stresses were primarily based on assumptions of low permeability and heat conduction only; this is inadequate when high-permeability formations are encountered. To analyze induced stresses and formation damage, a geomechanics model fully coupled to diffusive transport processes is employed. By assuming constant wellbore pressure and temperature boundary conditions, a closed-form solution including heat conduction and convection is obtained for the stresses near a cylindrical wellbore. The stability of an open hole subject to non-isothermal, non-hydrostatic in situ loading and various conditions is then investigated. Our studies indicate that maximum tangential stresses are
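    The thermal part of the near-wellbore stress sum above has a simple closed form at the borehole wall. A sketch of the standard plane-strain thermoelastic hoop stress (a textbook expression, not the authors' full coupled conduction-convection solution), with assumed shale-like properties:

    ```python
    def thermal_hoop_stress(alpha, E, nu, dT):
        """Thermoelastic tangential (hoop) stress induced at the borehole wall
        by a temperature change dT under plane strain:
          sigma_theta = alpha * E * dT / (1 - nu)
        Heating (dT > 0) adds compression with the sign convention used here."""
        return alpha * E * dT / (1.0 - nu)

    # Assumed properties: alpha = 1e-5 1/K, E = 20 GPa, nu = 0.25,
    # and the 35 C change quoted for the casing shoe in the abstract
    sigma_T = thermal_hoop_stress(alpha=1.0e-5, E=20e9, nu=0.25, dT=35.0)
    print(sigma_T / 1e6, "MPa")
    ```

    Even this simplified estimate gives several MPa for a 30-40 C change, supporting the abstract's point that bottom-hole temperature changes can rival bottom-hole pressure changes in their effect on stability.
    
    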

  9. The successful use of transverse hydraulic fractures from horizontal wellbores

    Energy Technology Data Exchange (ETDEWEB)

    Crosby, D. G.; Yang, Z.; Rahman, S. S. [New South Wales Univ., NSW (Australia)

    1998-12-31

    Since a significant proportion of the world's recoverable hydrocarbon resources exist in reservoirs possessing permeabilities of less than one milli-Darcy (mD), some form of permeability enhancement or stimulation is necessary if the hydrocarbons are to be exploited economically. Multi-stage, transversely fractured horizontal wellbores are shown to have the potential to greatly increase production from low permeability formations. To overcome the problems caused by near-wellbore tortuosity, common to wells with multiple fractures initiated from the same perforated interval, a criterion was devised which predicts the wellbore pressures required to initiate secondary multiple transverse hydraulic fractures in close proximity to primary fractures. The criterion, confirmed by laboratory experiments, demonstrates that transversely fractured horizontal wellbores have limited capacity to resist the initiation of multiple fractures from adjacent perforations. This characteristic can be used in designing hydraulic fracture treatments to establish injection pressure limits or threshold pressures, above which additional multiple fractures will initiate and propagate from the wellbore. 23 refs., 1 tab., 10 figs.
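    For context on initiation-pressure criteria of the kind this record describes, the classical textbook breakdown-pressure expression is a useful baseline. This sketch implements that standard Hubbert-Willis-type criterion for an impermeable wellbore wall, not the authors' secondary-fracture criterion, with illustrative stress values:

    ```python
    def breakdown_pressure(s_hmin, s_hmax, pore_p, tensile_strength):
        """Classical fracture-initiation (breakdown) pressure at a wellbore wall
        in an impermeable elastic rock:
          Pb = 3*s_hmin - s_hmax - Pp + T0
        where s_hmin/s_hmax are the min/max in-plane principal stresses,
        Pp is pore pressure, and T0 is the rock tensile strength."""
        return 3.0 * s_hmin - s_hmax - pore_p + tensile_strength

    # Illustrative values (Pa): 30/45 MPa stresses, 20 MPa pore pressure, 5 MPa T0
    pb = breakdown_pressure(s_hmin=30e6, s_hmax=45e6, pore_p=20e6,
                            tensile_strength=5e6)
    print(pb / 1e6, "MPa")
    ```

    Treatments designed per the abstract would keep injection pressure below such a threshold at adjacent perforations to suppress unwanted secondary fractures.
    
    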

  10. A Transient Analytical Model for Predicting Wellbore/Reservoir Temperature and Stresses during Drilling with Fluid Circulation

    Directory of Open Access Journals (Sweden)

    Bisheng Wu

    2017-12-01

    Accurate characterization of heat transfer in a wellbore during drilling, including fluid circulation, is important for wellbore stability analysis. In this work, a pseudo-3D model is developed to simultaneously calculate the heat exchange between the flowing fluid and the surrounding media (drill pipe and rock formation) and the in-plane thermoelastic stresses. The cold drilling fluid descends through the drill pipe at a constant injection rate and returns to the ground surface via the annulus. The fluid circulation decreases the temperature at the wellbore bottom and reduces the high near-wellbore compressive stress, potentially leading to tensile fracturing of the well. The governing equations for the coupled heat transfer-stress problem are formulated so that the most important parameters are taken into account. The wellbore is subject to a non-hydrostatic in situ far-field stress field. In modeling the heat exchange between the fluid and the surrounding media, the heat transfer coefficients depend on fluid properties and flow behavior. Analytical solutions in the Laplace space are obtained for the temperatures of the fluid in both the drill pipe and the annulus and for the temperature and stress changes in the formation; the numerical results in the time domain are obtained using an efficient inversion approach. In particular, the near-well stresses are compared for fixed and time-dependent cooling wellbore conditions. This comparison indicates that using a fixed-temperature wellbore condition may over- or under-estimate the bottom-hole stress change, potentially leading to wellbore stability problems.
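    The abstract's Laplace-space solutions require a numerical inversion back to the time domain. The paper does not name its inversion method, so as a sketch of one widely used choice for this class of problem, here is the Gaver-Stehfest algorithm, checked against a known transform pair:

    ```python
    from math import factorial, log

    def stehfest_weights(N=12):
        """Gaver-Stehfest weights V_k for numerical Laplace inversion (N even)."""
        V = []
        for k in range(1, N + 1):
            s = 0
            for j in range((k + 1) // 2, min(k, N // 2) + 1):
                s += (j ** (N // 2) * factorial(2 * j)
                      // (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                          * factorial(k - j) * factorial(2 * j - k)))
            V.append((-1) ** (k + N // 2) * s)
        return V

    def stehfest_invert(F, t, N=12):
        """Approximate f(t) from its Laplace transform F(s) at time t > 0."""
        V = stehfest_weights(N)
        a = log(2.0) / t
        return a * sum(V[k - 1] * F(a * k) for k in range(1, N + 1))

    # Sanity check against a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
    f1 = stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0)
    ```

    Gaver-Stehfest works well for the smooth, monotone temperature and stress histories typical of these wellbore problems, though it degrades for oscillatory transforms.
    
    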

  11. Experimental determination of wellbore diameter and shape (4D imaging of wellbore) by using ultrasonic caliper within different fluids for real-time drilling application

    Energy Technology Data Exchange (ETDEWEB)

    Elahifar, Behzad; Esmaeili, Abdolali; Thonhauser, Gerhard [Montanuniversitaet Leoben (Austria); Fruhwirth, Rudolf K. [TDE Thonhauser Data Engineering GmbH, Leoben (Austria)

    2013-03-15

    Drilling programs continue to push into new and more complicated environments. As a result, accurate measurement, interpretation and analysis of drilling data in real time are becoming more critical. One of the key measurement devices for drilling, cementing and formation evaluation is the borehole caliper. An ultrasonic sensor caliper tool is thereby a key measurement device for determining the borehole diameter in MWD or LWD tools. Another use of ultrasonic caliper tools is to offer a method for calculating borehole volumes. Real-time application of ultrasonic caliper tools can also support the early detection of borehole instability. This paper describes the experiments related to the accuracy of the ultrasonic sensor for measuring wellbore diameter by performing the tests in different fluids, comparing the results and determining the weak points of the sensor for detecting echoes. In addition the wellbore profiles were simulated and the simulated results were compared with the recorded data. Different tests related to the position of the caliper tool inside the wellbore were performed as well as the evaluation of the accuracy of the ultrasonic sensor by simulating dog-legs and latches. (orig.)
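    The caliper measurement in this record reduces to a pulse-echo time-of-flight conversion: the sensed standoff is half the two-way travel time multiplied by the sound speed in the borehole fluid. A minimal sketch with an assumed sound speed for water-based mud (the paper's point is precisely that this speed, and hence the accuracy, varies with the fluid):

    ```python
    def radius_from_echo(time_of_flight_s, sound_speed_m_s, tool_radius_m):
        """Wellbore radius from a pulse-echo measurement: the pulse travels
        tool -> wall -> tool, so the standoff is c * t / 2."""
        return tool_radius_m + sound_speed_m_s * time_of_flight_s / 2.0

    # Assumed: c = 1480 m/s (water-like mud), 100 us two-way travel time,
    # 6 cm sensor radius
    r = radius_from_echo(time_of_flight_s=100e-6, sound_speed_m_s=1480.0,
                         tool_radius_m=0.06)
    print(r, "m")
    ```

    Repeating the conversion around the circumference and along depth yields the borehole shape and volume; an error in the assumed sound speed maps linearly into a radius error, which is why calibration in the actual drilling fluid matters.
    
    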

  12. Numerical Investigation of the Influences of Wellbore Flow on Compressed Air Energy Storage in Aquifers

    Directory of Open Access Journals (Sweden)

    Yi Li

    2017-01-01

    Full Text Available With the blossoming of intermittent energy, compressed air energy storage (CAES has attracted much attention as a potential large-scale energy storage technology. Compared with caverns as storage vessels, compressed air energy storage in aquifers (CAESA has the advantages of wide availability and lower costs. The wellbore can play an important role as the energy transfer mechanism between the surroundings and the air in CAESA system. In this paper, we investigated the influences of the well screen length on CAESA system performance using an integrated wellbore-reservoir simulator (T2WELL/EOS3. The results showed that the well screen length can affect the distribution of the initial gas bubble and that a system with a fully penetrating wellbore can obtain acceptably stable pressurized air and better energy efficiencies. Subsequently, we investigated the impact of the energy storage scale and the target aquifer depth on the performance of a CAESA system using a fully penetrating wellbore. The simulation results demonstrated that larger energy storage scales exhibit better performances of CAESA systems. In addition, deeper target aquifer systems, which could decrease the energy loss by larger storage density and higher temperature in surrounding formation, can obtain better energy efficiencies.

  13. Pulse testing in the presence of wellbore storage and skin effects

    Energy Technology Data Exchange (ETDEWEB)

    Ogbe, D.O.; Brigham, W.E.

    1984-08-01

    A pulse test is conducted by creating a series of short-time pressure transients in an active (pulsing) well and recording the observed pressure response at an observation (responding) well. Using the pressure response and flow rate data, the transmissivity and storativity of the tested formation can be determined. Like any other pressure transient data, the pulse-test response is significantly influenced by wellbore storage and skin effects. The purpose of this research is to examine the influence of wellbore storage and skin effects on interference testing in general and on pulse-testing in particular, and to present the type curves and procedures for designing and analyzing pulse-test data when wellbore storage and skin effects are active at either the responding well or the pulsing well. A mathematical model for interference testing was developed by solving the diffusivity equation for radial flow of a single-phase, slightly compressible fluid in an infinitely large, homogeneous reservoir. When wellbore storage and skin effects are present in a pulse test, the observed response amplitude is attenuated and the time lag is inflated. Consequently, neglecting wellbore storage and skin effects in a pulse test causes the calculated storativity to be over-estimated and the transmissivity to be under-estimated. The error can be as high as 30%. New correlations and procedures are developed for correcting the pulse response amplitude and time lag for wellbore storage effects. Using these correlations, it is possible to correct the wellbore storage-dominated response amplitude and time lag to within 3% of their expected values without wellbore storage, and in turn to calculate the corresponding transmissivity and storativity. Worked examples are presented to illustrate how to use the new correction techniques. 45 references.
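    The diffusivity-equation model underlying this interference analysis, before storage and skin corrections, is the classic line-source (exponential-integral) solution. A sketch of that baseline drawdown with illustrative SI values, not the authors' storage-and-skin correlations:

    ```python
    import math

    def expint_e1(x):
        """Exponential integral E1(x) via its convergent series (good for x < 1)."""
        s = -0.5772156649015329 - math.log(x)
        term = 1.0
        for n in range(1, 40):
            term *= -x / n
            s -= term / n
        return s

    def line_source_drawdown(q, mu, k, h, phi, ct, r, t):
        """Drawdown at radius r and time t for a constant-rate well in an
        infinite homogeneous reservoir (single-phase, slightly compressible
        fluid, SI units):
          dp = (q*mu / (4*pi*k*h)) * E1(phi*mu*ct*r^2 / (4*k*t))"""
        x = phi * mu * ct * r * r / (4.0 * k * t)
        return q * mu / (4.0 * math.pi * k * h) * expint_e1(x)

    # Illustrative values (assumed): k = 100 mD, 100 m well spacing
    dp = line_source_drawdown(q=1e-3, mu=1e-3, k=1e-13, h=10.0,
                              phi=0.2, ct=1e-9, r=100.0, t=1e5)
    ```

    Matching observed pulse amplitudes and time lags against this solution yields transmissivity (k*h/mu) and storativity (phi*ct*h); the record's contribution is correcting the observed amplitude and lag for wellbore storage and skin before that match.
    
    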

  14. The model coupling fluid flow in reservoir with flow in horizontal wellbore

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiangping; Jiang, Zhixiang [RIPED-TEXACO Horizontal Well Technology Laboratory (United States)

    1998-12-31

    Three-dimensional pressure distributions of oil flow in a reservoir with a horizontal well were derived, and a new formula to calculate the pressure drop along the horizontal wellbore was developed based on the principles of conservation of mass and momentum. The formula accounts for the effect of influx into the horizontal wellbore from the reservoir on the pressure drop in the wellbore. A mathematical model coupling fluid flow in the reservoir with flow in the horizontal wellbore is presented. Model results and experimental data showed good agreement, and the results showed the influence of wellbore pressure drop on well performance. 13 refs., 2 tabs., 7 figs.
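    The influx effect this record highlights can be sketched with a per-segment momentum balance: radial inflow adds mass that must be accelerated to the axial velocity, so the pressure drop exceeds the friction-only value. This is a simplified stand-in for the paper's formula, with an assumed laminar friction factor and heavy-oil-like properties:

    ```python
    import math

    def dp_segment(rho, mu, D, L, q_in, q_influx):
        """Pressure drop over one horizontal-wellbore segment of length L,
        including the acceleration term caused by radial influx:
          dp = f * (rho*v^2/2) * (L/D) + rho * (v_out^2 - v_in^2)
        A laminar friction factor f = 64/Re is assumed for this sketch."""
        A = math.pi * D * D / 4.0
        v_in = q_in / A
        v_out = (q_in + q_influx) / A
        v = 0.5 * (v_in + v_out)        # mean segment velocity
        Re = rho * v * D / mu
        f = 64.0 / Re
        dp_fric = f * (rho * v * v / 2.0) * (L / D)
        dp_acc = rho * (v_out ** 2 - v_in ** 2)
        return dp_fric + dp_acc

    # Assumed properties: rho = 900 kg/m3, mu = 50 mPa.s, D = 0.1 m
    dp_with = dp_segment(900.0, 0.05, 0.1, 10.0, q_in=1e-3, q_influx=5e-4)
    dp_without = dp_segment(900.0, 0.05, 0.1, 10.0, q_in=1e-3, q_influx=0.0)
    ```

    Marching such a balance segment by segment from toe to heel, with influx at each segment coupled to the local reservoir drawdown, is the essence of a coupled reservoir-wellbore model.
    
    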

  15. Gas and Oil Flow through Wellbore Flaws

    Science.gov (United States)

    Hatambeigi, M.; Anwar, I.; Reda Taha, M.; Bettin, G.; Chojnicki, K. N.; Stormont, J.

    2017-12-01

    We have measured gas and oil flow through laboratory samples that represent two important potential flow paths in wellbores associated with the Strategic Petroleum Reserve (SPR): cement-steel interfaces (microannuli) and cement fractures. Cement fractures were created by tensile splitting of cement cores. Samples representing microannuli were created by placing thin steel sheets within split cement cores so that flow is channeled along the cement-steel interface. The test sequence included alternating gas and oil flow measurements. The test fluids were nitrogen and silicone oil with properties similar to a typical crude oil stored in the SPR. After correcting for non-linear (inertial) flow when necessary, flows were interpreted as effective permeability and hydraulic aperture using the cubic law. For both the samples with cement fractures and those with cement-steel interfaces, initial gas and oil permeabilities were comparable. Once a sample was saturated with oil, a displacement pressure had to be overcome to establish gas flow, and the subsequent gas permeability was reduced by more than 50% compared to its initial value. Keywords: wellbore integrity, leakage, fracture, microannulus, SPR. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of NTESS/Honeywell, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2017-8168 A
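    The cubic-law interpretation used in this record treats a fracture or microannulus as a smooth parallel-plate channel, so a measured flow rate at a known pressure drop can be inverted for a hydraulic aperture. A sketch with illustrative geometry and fluid values (not the experiment's actual numbers):

    ```python
    def cubic_law_flow(aperture, width, length, mu, dP):
        """Volumetric flow through a smooth parallel-plate fracture (cubic law):
          Q = w * b^3 * dP / (12 * mu * L)
        b = aperture, w = fracture width, L = flow length, SI units."""
        return width * aperture ** 3 * dP / (12.0 * mu * length)

    def hydraulic_aperture(Q, width, length, mu, dP):
        """Invert the cubic law for the hydraulic aperture b."""
        return (12.0 * mu * length * Q / (width * dP)) ** (1.0 / 3.0)

    # Illustrative: 50 um aperture, 5 cm wide, 10 cm long, water-like fluid, 1 bar
    Q = cubic_law_flow(aperture=50e-6, width=0.05, length=0.1, mu=1e-3, dP=1e5)
    b = hydraulic_aperture(Q, width=0.05, length=0.1, mu=1e-3, dP=1e5)
    ```

    The forward and inverse forms round-trip exactly, which is the consistency check one wants before applying the inversion to measured flows; real fracture roughness makes the hydraulic aperture an effective, not geometric, quantity.
    
    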

  16. Final Research Performance Progress Report: Geothermal Resource Development with Zero Mass Withdrawal, Engineered Convection, and Wellbore Energy Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard [Louisiana State Univ., Baton Rouge, LA (United States); Tyagi, Mayank [Louisiana State Univ., Baton Rouge, LA (United States); Radonjic, Mileva [Louisiana State Univ., Baton Rouge, LA (United States); Dahi, Arash [Louisiana State Univ., Baton Rouge, LA (United States); Wang, Fahui [Louisiana State Univ., Baton Rouge, LA (United States); John, Chacko [Louisiana State Univ., Baton Rouge, LA (United States); Kaiser, Mark [Louisiana State Univ., Baton Rouge, LA (United States); Snyder, Brian [Louisiana State Univ., Baton Rouge, LA (United States); Sears, Stephen [Louisiana State Univ., Baton Rouge, LA (United States)

    2017-07-07

    This project is intended to demonstrate the technical and economic feasibility, and environmental and social attractiveness of a novel method of heat extraction from geothermal reservoirs. The emphasis is on assessing the potential for a heat extraction method that couples forced and free convection to maximize extraction efficiency. The heat extraction concept is enhanced by considering wellbore energy conversion, which may include only a boiler for a working fluid, or perhaps a complete boiler, turbine, and condenser cycle within the wellbore. The feasibility of this system depends on maintaining mechanical and hydraulic integrity of the wellbore, so the material properties of the casing-cement system are examined both experimentally and with well design calculations. The attractiveness depends on mitigation of seismic and subsidence risks, economic performance, environmental impact, and social impact – all of which are assessed as components of this study.

  17. Hydrated Ordinary Portland Cement as a Carbonic Cement: The Mechanisms, Dynamics, and Implications of Self-Sealing and CO2 Resistance in Wellbore Cements

    Energy Technology Data Exchange (ETDEWEB)

    Guthrie, George Drake Jr. [Los Alamos National Laboratory; Pawar, Rajesh J. [Los Alamos National Laboratory; Carey, James William [Los Alamos National Laboratory; Karra, Satish [Los Alamos National Laboratory; Harp, Dylan Robert [Los Alamos National Laboratory; Viswanathan, Hari S. [Los Alamos National Laboratory

    2017-07-28

    This report analyzes the dynamics and mechanisms of the interactions of carbonated brine with hydrated Portland cement. The analysis is based on a recent set of comprehensive reactive-transport simulations, and it relies heavily on the synthesis of the body of work on wellbore integrity that we have conducted for the Carbon Storage Program over the past decade.

  18. Polymer-cement interactions towards improved wellbore cement fracture sealants

    Science.gov (United States)

    Beckingham, B. S.; Iloejesi, C.; Minkler, M. J.; Schindler, A. K.; Beckingham, L. E.

    2017-12-01

    Carbon capture, utilization, and storage (CCUS) in deep geologic formations is a promising means of reducing point-source emissions of CO2. In these systems, CO2 is captured at the source and then injected, to be utilized (e.g., in enhanced oil recovery or as a working fluid in enhanced geothermal energy plants) or stored in geologic formations such as depleted oil and gas reservoirs or saline aquifers. While CCUS in subsurface systems could aid in reducing atmospheric CO2 emissions, the potential for CO2 leakage from these systems to overlying formations remains a major limitation and poses a significant risk to the security of injected CO2. Thus, improved materials are sought both for initial wellbore isolation and for repairing leakage pathways that develop over time. One approach to repairing cement fractures in wellbore (and other) systems is injection of polymer materials into the fracture with a subsequent environmentally dependent (temperature, pressure, pH, etc.) densification or solidification. Here, we investigate novel polymer materials for repairing leaking wellbores in the context of CCUS. We synthesize and fully characterize a series of novel polymer materials and use a suite of analysis techniques to examine polymer-cement interactions over a range of conditions (namely temperature, pressure and pH). Initial findings will be leveraged to design novel polymer materials for further evaluation in polymer-cement composite cores, cement fracture healing, and the aging behavior of healed cements.

  19. Reduced-Order Model for Leakage Through an Open Wellbore from the Reservoir due to Carbon Dioxide Injection

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Lehua [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-26

    Potential CO2 leakage through existing open wellbores is one of the most significant hazards that must be addressed in geologic carbon sequestration (GCS) projects. In the framework of the National Risk Assessment Partnership (NRAP), which requires fast computations for uncertainty analysis, rigorous simulation of the coupled wellbore-reservoir system is not practical. We have developed a 7,200-point look-up table reduced-order model (ROM) for estimating the potential leakage rate up open wellbores in response to nearby CO2 injection. The ROM is based on coupled simulations using T2Well/ECO2H, which was run repeatedly for representative conditions relevant to NRAP to create a look-up table response-surface ROM. The ROM applies to a wellbore that fully penetrates a 20-m thick reservoir used for CO2 storage. The radially symmetric reservoir is assumed to have initially uniform pressure, temperature, gas saturation, and brine salinity, and these conditions are held constant at the far-field boundary (100 m from the wellbore). In such a system, the leakage can quickly reach quasi-steady state. The ROM table can be used to estimate both the free-phase CO2 and brine leakage rates through an open well as a function of wellbore and reservoir conditions. Results show that injection-induced pressure and reservoir gas saturation play important roles in controlling leakage. Caution must be used in applying this ROM because well leakage is formally transient and the look-up table was populated using quasi-steady simulation output after 1,000 time steps, which may correspond to different physical times for the various parameter combinations of the coupled wellbore-reservoir system.
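    A look-up-table ROM of the kind described here amounts to multidimensional interpolation over precomputed simulation output. A toy two-axis stand-in with made-up leakage values (the real NRAP ROM spans more parameters and 7,200 points) showing the bilinear-interpolation mechanics:

    ```python
    import numpy as np

    # Hypothetical 2-D slice of a leakage-rate table:
    # rows = reservoir pressure (MPa), cols = gas saturation (-)
    pressures = np.array([10.0, 15.0, 20.0])
    saturations = np.array([0.0, 0.25, 0.5])
    leak_rate = np.array([[0.0, 0.2, 0.5],     # kg/s, illustrative values only
                          [0.1, 0.6, 1.2],
                          [0.3, 1.1, 2.4]])

    def rom_lookup(p, sg):
        """Bilinear interpolation in the look-up table."""
        i = int(np.clip(np.searchsorted(pressures, p) - 1, 0, len(pressures) - 2))
        j = int(np.clip(np.searchsorted(saturations, sg) - 1, 0, len(saturations) - 2))
        tp = (p - pressures[i]) / (pressures[i + 1] - pressures[i])
        ts = (sg - saturations[j]) / (saturations[j + 1] - saturations[j])
        return ((1 - tp) * (1 - ts) * leak_rate[i, j]
                + tp * (1 - ts) * leak_rate[i + 1, j]
                + (1 - tp) * ts * leak_rate[i, j + 1]
                + tp * ts * leak_rate[i + 1, j + 1])

    q = rom_lookup(12.5, 0.125)   # mid-cell query
    ```

    Each table evaluation costs microseconds versus hours for a coupled T2Well run, which is what makes Monte Carlo uncertainty analysis over many wellbores feasible.
    
    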

  20. Geomechanical analyses to investigate wellbore/mine interactions in the Potash Enclave of Southeastern New Mexico.

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L.; Bean, James E. (Sandia Staffing Alliance, LLC, Albuquerque, NM); Arguello, Jose Guadalupe, Jr.; Stone, Charles Michael

    2010-04-01

    Geomechanical analyses have been performed to investigate potential mine interactions with wellbores that could occur in the Potash Enclave of Southeastern New Mexico. Two basic models were used in the study: (1) a global model that simulates the mechanics associated with mining and subsidence, and (2) a wellbore model that examines the resulting interaction impacts on the wellbore casing. The first is a 2D plane-strain approximation of a potash mine for mine depths of 304.8 m (1000 ft) and 609.6 m (2000 ft). A 3D wellbore model then considers the impact of bedding plane slippage across single- and double-cased wells cemented through the Salado formation. The wellbore model establishes the allowable slippage to prevent casing yield.

  1. Geomechanics of fracture caging in wellbores

    NARCIS (Netherlands)

    Weijermars, R.; Zhang, X.; Schultz-Ela, D.

    2013-01-01

    This study highlights the occurrence of so-called ‘fracture cages’ around underbalanced wellbores, where fractures cannot propagate outwards due to unfavourable principal stress orientations. The existence of such cages is demonstrated here by independent analytical and numerical methods. We explain

  2. Numerical simulation on streaming potentials in a wellbore; Koseinai no ryudo den`i ni kansuru suchi simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ikeda, N [Kyushu University, Fukuoka (Japan)

    1996-05-01

    This paper reports numerical computation of the streaming potentials generated by transient pressure waves propagating in the vicinity of the wellbore wall immediately after removal of the mud cake formed on the wall. An existing analytical solution for heat conduction was adapted, with changed parameters, to derive the fluid pressure inside the formation. Calculations were carried out with an existing (partly rewritten) three-dimensional finite difference code, based on the constitutive relationship between fluid pressure and streaming potential. The paper presents calculated streaming potentials for models of mud-filled wellbores in a cubic formation fully saturated with formation water. The following cases were computed: low formation permeability with low- and high-resistivity fluid; high permeability under the same two conditions; and a case in which a small area of the borehole wall is covered with a high-resistivity rubber pad under low-resistivity conditions. 8 refs., 5 figs.
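    The constitutive link between fluid pressure and streaming potential used in such simulations is commonly the Helmholtz-Smoluchowski coupling coefficient. A sketch with assumed electrokinetic properties (zeta potential, fluid conductivity); these are illustrative values, not the paper's model parameters:

    ```python
    def streaming_coupling_coefficient(zeta_V, eps_rel, mu, sigma):
        """Helmholtz-Smoluchowski streaming-potential coupling coefficient
          C = dV/dP = eps0 * eps_rel * zeta / (mu * sigma)   [V/Pa]
        zeta in volts, mu in Pa.s, sigma (fluid conductivity) in S/m."""
        eps0 = 8.8541878128e-12
        return eps0 * eps_rel * zeta_V / (mu * sigma)

    # Assumed: zeta = -20 mV, eps_rel = 80 (water), mu = 1 mPa.s, sigma = 0.1 S/m
    C = streaming_coupling_coefficient(-0.02, 80.0, 1.0e-3, 0.1)
    dV = C * 1.0e6   # potential change for a 1 MPa pressure transient
    ```

    The sign and magnitude of C explain the paper's sensitivity to fluid resistivity: higher-conductivity (lower-resistivity) borehole fluid shrinks the coupling coefficient and hence the measurable potential.
    
    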

  3. Simulation of a SAGD well blowout using a reservoir-wellbore coupled simulator

    Energy Technology Data Exchange (ETDEWEB)

    Walter, J.; Vanegas, P.; Cunha, L.B. [Alberta Univ., Edmonton, AB (Canada); Worth, D.J. [C-FER Technologies, Edmonton, AB (Canada); Crepin, S. [Petrocedeno, Caracas (Venezuela)

    2008-10-15

    Single barrier completion systems are typically used in SAGD projects due to the lack of equipment suitable for high temperature SAGD downhole environments. This study used a wellbore and reservoir coupled thermal simulator tool to investigate the blowout behaviour of a steam assisted gravity drainage (SAGD) well pair when the safety barrier has failed. Fluid flow pressure drop through the wellbore and heat losses between the wellbore and the reservoir were modelled using a discretized wellbore option and a semi-analytical model. The fully coupled mechanistic model accounted for the simultaneous transient pressure and temperature variations along the wellbore and the reservoir. The simulations were used to predict flowing potential and fluid compositions of both wells in a SAGD well pair under various flowing conditions. Blowout scenarios were created for 3 different points in the well pair's life. Three flow paths during the blowout were evaluated for both the production and injection wells. Results of the study were used to conduct a comparative risk assessment between a double barrier and a single barrier completion. The modelling study confirmed that both the injection and production wells had the potential for blowouts lasting significant periods of time, with liquid rates over 50 times the normal production liquid rates. The model successfully predicted the blowout flow potential of the SAGD well pairs. 8 refs., 3 tabs., 18 figs.

  4. Optimization of SAGD wellbore completions : short production tubing string sensitivities

    Energy Technology Data Exchange (ETDEWEB)

    Cokar, M.; Graham, J. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Petro-Canada, Calgary, AB (Canada)

    2008-10-15

This study investigated the effects of changing the landing position of short production tubing strings near the heel of steam assisted gravity drainage (SAGD) production wells. A homogeneous discretized wellbore model was used to simulate the reservoir and wellbore simultaneously in order to study their interactions. The aim of the study was to develop a method of optimizing bitumen production and determining the most economical position for tubing strings. Simulations were conducted to examine the effects of shortening the production tubing string, and of extending it beyond the heel of the well, on bitumen production rates and the steam-oil ratio (SOR). Results showed that a shortened string decreased bitumen production rates, while the amount of steam produced through the tubing string increased. When the tubing string was extended past the heel of the well, bitumen production rates remained the same, but steam injection rates and the SOR decreased. A lower pressure differential between the injector and producer wells was also observed. The study showed that SAGD operators can re-position production tubing strings to adjust the ratio of produced liquids. It was concluded that although placing the short production tubing string close to the heel increased oil production, a longer tubing string improved production rates while lowering operating costs. 3 refs., 3 tabs., 35 figs.

  5. Parametric Sensitivity Study of Operating and Design Variables in Wellbore Heat Exchangers

    International Nuclear Information System (INIS)

    Nalla, G.; Shook, G.M.; Mines, G.L.; Bloomfield, K.K.

    2004-01-01

This report documents the results of an extensive sensitivity study conducted by the Idaho National Engineering and Environmental Laboratory. The study investigated the effects of various operating and design parameters on wellbore heat exchanger performance, to determine conditions for optimal thermal energy extraction and to evaluate the potential of wellbore heat exchangers for power generation. Variables studied included operational parameters such as circulation rates, wellbore geometries and working fluid properties, as well as regional properties including basal heat flux and formation rock type. Energy extraction is strongly affected by fluid residence time, heat transfer contact area, and formation thermal properties. Water appears to be the most appropriate working fluid. Aside from minimal tubing insulation, tubing properties are second-order effects. On the basis of the sensitivity study, a best-case model was simulated and the results compared against existing low-temperature power generation plants. Even assuming ideal conversion of work to electric power, a wellbore heat exchanger cannot generate 200 kW (682.4e+3 BTU/h) at the onset of pseudosteady state. Using realistic conversion efficiencies, the method is unlikely to generate 50 kW (170.6e+3 BTU/h).
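The power figures above can be sanity-checked with the basic energy balance for a circulating working fluid, P = ṁ·cp·ΔT. The sketch below uses assumed, illustrative numbers (flow rate, temperatures), not values from the INEEL study:

```python
# Illustrative energy balance for a wellbore heat exchanger (WBHX).
# Thermal power carried by the circulating fluid: P = m_dot * c_p * dT.
# All numbers are assumed for illustration, not taken from the report.

def wbhx_power(m_dot_kg_s, cp_j_kg_k, t_in_c, t_out_c):
    """Thermal power (W) extracted by the circulating fluid."""
    return m_dot_kg_s * cp_j_kg_k * (t_out_c - t_in_c)

# Water circulating at 2 kg/s, heated from 30 degC to 50 degC:
p_watts = wbhx_power(2.0, 4186.0, 30.0, 50.0)
print(round(p_watts / 1000.0, 1))  # -> 167.4 thermal kW, already below 200 kW
                                   #    before any work-conversion losses
```

Even this generous thermal figure sits below the 200 kW threshold; applying a realistic conversion efficiency to electric power would reduce it several-fold, consistent with the report's conclusion.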

  6. Evolution of Cement-Casing Interface in Wellbore Microannuli under Stress

    Science.gov (United States)

    Matteo, E. N.; Gomez, S. P.; Sobolik, S. R.; Taha, M. R.; Stormont, J.

    2017-12-01

Laboratory tests measured the compressibility and flow characteristics of wellbore microannuli. Specimens, consisting of a cement sheath cast on a steel casing with microannuli, were subjected to confining and casing pressures in a pressure vessel that allows simultaneous measurement of gas flow along the axis of the specimen. The flow was interpreted as the hydraulic aperture of the microannuli. We found that the hydraulic aperture decreases as confining stress is increased; the larger the initial hydraulic aperture, the more it decreases. The changes in measured hydraulic aperture correspond to changes of many orders of magnitude in the permeability of the wellbore system, suggesting that microannulus response to stress changes may have a significant impact on estimates of wellbore leakage. A finite element model of a wellbore system was developed that included elements representing the microannulus and incorporating a hyperbolic joint model; the thickness of the microannulus elements is equivalent to the hydraulic aperture. The normal stress across the microannulus calculated in the numerical implementation was found to be similar to the applied confining pressure in the laboratory tests. The microannulus elements reasonably reproduced laboratory behavior during loading from confining pressure increases. The calculated microannulus response to internal casing pressure changes was less stiff than measured, which may be due to hardening of the microannulus during testing. In particular, the microannulus model could be used to estimate CO2 leakage as a function of formation stress changes and/or displacements, or of loading from casing expansion or contraction during wellbore operations. Recommendations for future work include applying the joint model to a thermally active, large-scale reservoir coupled with pore-pressure changes caused by dynamic CO2 injection and the resulting effects on the microannulus region.
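Interpreting flow as a hydraulic aperture is commonly done with the cubic law for a smooth slot; the sketch below assumes that reduction (the abstract does not give the authors' exact formula) and illustrates why small aperture changes translate into order-of-magnitude changes in flow and permeability:

```python
# Cubic-law sketch for a microannulus idealized as a smooth planar slot
# of aperture b (m) and width w (m) under viscosity mu (Pa*s):
#   Q = w * b**3 / (12 * mu) * dP/dx    (laminar, incompressible)
#   k = b**2 / 12                       (intrinsic permeability of the slot)
# This reduction is assumed here, not quoted from the paper.

def slot_flow(b, w, mu, dp_dx):
    """Volumetric flow rate (m^3/s) through the slot, cubic law."""
    return w * b**3 / (12.0 * mu) * dp_dx

def slot_permeability(b):
    """Intrinsic permeability (m^2) of the slot, k = b^2 / 12."""
    return b**2 / 12.0

# Halving a 40-micron aperture cuts flow by a factor of 8:
q1 = slot_flow(40e-6, 0.5, 1.0e-3, 1.0e4)
q2 = slot_flow(20e-6, 0.5, 1.0e-3, 1.0e4)
print(round(q1 / q2, 1))  # -> 8.0
```

Because k scales as b^2 and Q as b^3, the stress-driven aperture reductions measured in the tests map directly onto the "many orders of magnitude" permeability changes the abstract describes.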

  7. Plugging wellbore fractures : limit equilibrium of a Bingham drilling mud cake in a tensile crack

    Energy Technology Data Exchange (ETDEWEB)

    Garagash, D.I. [Dalhousie Univ., Halifax, NS (Canada). Dept. of Civil and Resource Engineering

    2009-07-01

The proper selection of drilling muds is important for successfully drilling hydrocarbon wells: wellbore mud pressure must remain low enough to prevent circulation loss yet high enough to support the uncased wellbore against shear failure. This paper presented a mathematical model to study invasion of mud cake into a drilling-induced planar fracture at the edge of a wellbore perpendicular to the minimum in situ principal stress. The model assumed a planar edge-crack geometry loaded by the wellbore hoop stress, variable mud pressure along the invaded region adjacent to the wellbore, and uniform pore-fluid pressure along the rest of the crack. The invading mud was assumed to displace the pore fluid in the crack freely, without mixing with it. The case corresponding to a sufficiently permeable formation was considered. The solution provides a means to evaluate whether the mud cake can effectively plug the fracture, thereby preventing fracture propagation and the associated uncontrollable loss of wellbore drilling mud. The toughness or tensile strength is evaluated from the criterion for initiation of crack propagation, which may lead to uncontrollable loss of mud circulation in a well. The study provided information on the breakdown pressure as a function of the rock ambient stress, ambient pore pressure, pre-existing crack length, and mud cake properties. 12 refs., 6 figs.
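A rough feel for the plugging capacity of a mud cake comes from the textbook force balance for a Bingham plug in a slot, ΔP ≈ 2·τ_y·L/w. This is a simplified estimate, not the paper's full limit-equilibrium solution, and all numbers are illustrative:

```python
# Force balance on a Bingham plug of length L in a crack of aperture w:
# the yield stress tau_y acting on both crack faces balances the pressure
# differential, giving dP = 2 * tau_y * L / w. A textbook estimate only;
# the paper's limit-equilibrium analysis is considerably more detailed.

def plug_capacity_pa(tau_yield_pa, plug_length_m, aperture_m):
    """Maximum pressure differential (Pa) a Bingham plug can sustain."""
    return 2.0 * tau_yield_pa * plug_length_m / aperture_m

# A mud cake with 20 Pa yield stress plugging 0.1 m of a 0.5 mm crack:
dp = plug_capacity_pa(20.0, 0.1, 5e-4)
print(dp, "Pa")  # prints 8000.0 Pa
```

The linear scaling with plug length and inverse scaling with aperture show why a thin, long invaded region can arrest fracture propagation even for modest mud yield stresses.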

  8. Wellbore integrity analysis of a natural CO2 producer

    KAUST Repository

    Crow, Walter; Carey, J. William; Gasda, Sarah; Brian Williams, D.; Celia, Michael

    2010-01-01

Wellbore integrity is defined as the maintenance of isolation between subsurface intervals. In this report, we investigate a 30-year-old well from a natural CO2 production reservoir using a suite of downhole and laboratory tests to characterize isolation performance.

  9. A 3-D wellbore simulator (WELLTHER-SIM) to determine the thermal diffusivity of rock-formations

    Science.gov (United States)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J.

    2017-06-01

Acquiring the thermophysical properties of rock formations in geothermal systems is an essential task for well drilling and completion. Wellbore thermal simulators require such properties for predicting the thermal behavior of a wellbore and the formation under drilling and shut-in conditions. The estimation of static formation temperatures also requires these properties for the wellbore and formation materials (drilling fluids, pipes, cements, casings, and rocks). A numerical simulator (WELLTHER-SIM) has been developed for modeling the drilling fluid circulation and shut-in processes of geothermal wellbores, and for the in-situ determination of the thermal diffusivity of rocks. Bottomhole temperatures logged under shut-in conditions (BHTm), together with the thermophysical and transport properties of the drilling fluids, were used as the main input data. To model the thermal disturbance and recovery processes in the wellbore and rock formation, initial drilling fluid and static formation temperatures were used as initial and boundary conditions. WELLTHER-SIM uses these temperatures together with an initial thermal diffusivity for the rock formation to solve the governing equations of the heat transfer model. WELLTHER-SIM was programmed using the finite volume technique to solve the heat conduction equations under 3-D and transient conditions. Thermal diffusivities of rock formations were inversely computed by an iterative and efficient numerical simulation, in which simulated thermal recovery data sets (BHTs) were statistically compared with temperature measurements (BHTm) logged in geothermal wellbores. The simulator was validated using a well-documented case reported in the literature, for which the thermophysical properties of the rock formation are known with accuracy. The new numerical simulator has been successfully applied to two wellbores drilled in geothermal fields of Japan and Mexico. Details of the physical conceptual model and the numerical solution are also described.
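The inverse scheme described above (simulate thermal recovery for trial diffusivities, keep the one that best matches the measured BHTs) can be sketched as follows. A 1-D explicit finite-difference slab model stands in for WELLTHER-SIM's 3-D finite-volume model, and all geometry and property values are assumed:

```python
# Toy version of the inversion: a 1-D conduction model of the formation,
# initially disturbed to the drilling-fluid temperature near the well,
# recovering toward the static formation temperature during shut-in.

def simulate_recovery(alpha, t_fluid, t_formation, times, dx=0.05, nx=40):
    """Return simulated wall temperatures (BHTs) at the requested shut-in
    times for a trial thermal diffusivity alpha (m^2/s)."""
    dt = 0.4 * dx * dx / alpha           # stable explicit time step
    temps = [t_fluid] + [t_formation] * (nx - 1)
    out, t, k = [], 0.0, 0
    times = sorted(times)
    while k < len(times):
        if t >= times[k]:
            out.append(temps[0]); k += 1; continue
        new = temps[:]
        for i in range(1, nx - 1):       # explicit heat-conduction update
            new[i] = temps[i] + alpha * dt / dx**2 * (
                temps[i + 1] - 2 * temps[i] + temps[i - 1])
        new[0] = new[1]                  # wall relaxes with adjacent rock
        new[-1] = t_formation            # far field held at static temperature
        temps, t = new, t + dt
    return out

def best_alpha(measured, times, t_fluid, t_formation, candidates):
    """Grid-search the diffusivity minimizing the misfit to measured BHTs."""
    def rms(a):
        sim = simulate_recovery(a, t_fluid, t_formation, times)
        return sum((s - m) ** 2 for s, m in zip(sim, measured)) ** 0.5
    return min(candidates, key=rms)

# Synthetic "measured" BHTs generated with a true alpha of 1e-6 m^2/s:
times = [3600.0, 7200.0, 14400.0]
measured = simulate_recovery(1e-6, 60.0, 150.0, times)
est = best_alpha(measured, times, 60.0, 150.0, [5e-7, 1e-6, 2e-6])
print(est)  # -> 1e-06
```

The real simulator replaces the grid search with an efficient iterative scheme and the slab model with a full 3-D wellbore-formation geometry, but the simulate-compare-update structure is the same.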

  10. Numerical Simulation on Open Wellbore Shrinkage and Casing Equivalent Stress in Bedded Salt Rock Stratum

    Directory of Open Access Journals (Sweden)

    Jianjun Liu

    2013-01-01

Most salt rock in China is interbedded with mudstone. Owing to the enormous difference in mechanical properties between the mudstone interbeds and the salt rock, the stress-strain and creep behaviors of salt rock are significantly influenced by neighboring mudstone interbeds. In order to identify the rules governing wellbore shrinkage and casing equivalent stress in bedded salt rock strata, three-dimensional finite difference models were established. The effects of the thickness and elasticity modulus of the mudstone interbed on open-wellbore shrinkage and on the equivalent stress of the casing after cementing were studied. The results indicate that the shrinkage of the open wellbore and the equivalent stress of the casing decrease with increasing mudstone interbed thickness. Increasing the elasticity modulus also reduces the shrinkage of the open wellbore and the casing equivalent stress. These results can provide a scientific basis for the design of mud density and casing strength.

  11. Effect of fluid penetration on tensile failure during fracturing of an open-hole wellbore

    Science.gov (United States)

    Zeng, Fanhui; Cheng, Xiaozhao; Guo, Jianchun; Chen, Zhangxin; Tao, Liang; Liu, Xiaohua; Jiang, Qifeng; Xiang, Jianhua

    2018-06-01

It is widely accepted that a fracture can be induced at a wellbore surface when the fluid pressure overcomes the rock tensile strength. However, few models of this phenomenon account for the fluid penetration effect. A rock is a typical permeable, porous medium, and the transmission of pressure from a wellbore to the surrounding rock temporally and spatially perturbs the effective stresses. These induced stresses in turn influence the fracture initiation pressure. To gain a better understanding of the penetration effect on the initiation pressure of a permeable formation, a comprehensive formula is presented to study the effects of the in situ stresses, rock mechanical properties, injection rate, rock permeability, fluid viscosity, fluid compressibility and wellbore size on the magnitude of the initiation pressure during fracturing of an open-hole wellbore. In this context, the penetration effect is treated as a consequence of the interaction among these parameters by using Darcy's law of radial flow. A fully coupled analytical procedure is developed to show how the fracturing fluid infiltrates the rock around the wellbore and considerably reduces the magnitude of the initiation pressure. The calculated results are validated by hydraulic fracturing experiments on hydrostone. An exhaustive sensitivity study is performed, indicating that the local fluid pressure induced by seepage strongly influences the fracture evolution. For permeable reservoirs, a low injection rate and a low viscosity of the injected fluid have a significant impact on the fracture initiation pressure; in such cases, the Hubbert and Haimson equations are not valid for predicting it. The open-hole fracture initiation pressure increases with the fracturing fluid viscosity and fluid compressibility, while it decreases as the rock permeability, injection rate and wellbore size increase.
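For reference, the two classical end-member breakdown-pressure formulas that the paper argues against can be sketched as follows (one common form of each; sign conventions and poroelastic terms vary across the literature, and the numbers are illustrative):

```python
# Classical breakdown-pressure end members for a vertical open hole.
# Non-penetrating fluid (Hubbert-Willis type):
#   Pb = 3*s_hmin - s_hmax + T - Pp
# Penetrating fluid (Haimson-Fairhurst type, one common form):
#   Pb = (3*s_hmin - s_hmax + T - 2*eta*Pp) / (2*(1 - eta)),
# with poroelastic parameter eta = alpha*(1-2nu)/(2*(1-nu)), 0 <= eta <= 0.5.
# Forms and signs vary between references; treat this as illustrative only.

def breakdown_hubbert_willis(s_hmin, s_hmax, p_pore, t_tensile):
    """Breakdown pressure for a non-penetrating fracturing fluid."""
    return 3.0 * s_hmin - s_hmax + t_tensile - p_pore

def breakdown_haimson_fairhurst(s_hmin, s_hmax, p_pore, t_tensile, eta):
    """Breakdown pressure for a penetrating fracturing fluid."""
    return (3.0 * s_hmin - s_hmax + t_tensile
            - 2.0 * eta * p_pore) / (2.0 * (1.0 - eta))

# Example in MPa: s_hmin = 30, s_hmax = 45, Pp = 20, T = 5, eta = 0.3
pb_np = breakdown_hubbert_willis(30.0, 45.0, 20.0, 5.0)
pb_p = breakdown_haimson_fairhurst(30.0, 45.0, 20.0, 5.0, 0.3)
print(pb_np, round(pb_p, 2))  # prints: 30.0 27.14
```

The penetrating-fluid value falls below the non-penetrating one, consistent with the paper's finding that fluid infiltration reduces the initiation pressure; the paper's coupled model then adds rate, viscosity and compressibility effects that neither classical formula captures.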

  12. Quantifying drag on wellbore casings in moving salt sheets

    Science.gov (United States)

    Weijermars, R.; Jackson, M. P. A.; Dooley, T. P.

    2014-08-01

Frontier hydrocarbon development projects in the deepwater slopes of the Gulf of Mexico Basin, Santos Basin and Lower Congo Basin all require wells to cross ductile layers of autochthonous or allochthonous salt moving at peak rates of 100 mm/yr. The Couette-Poiseuille number is introduced here to help pinpoint the depth of shear stress reversal in such salt layers. For any well planned through salt, the probable range of creep forces of the moving salt needs to be taken into account when designing the safety margins and load-factor tolerance of the well casing. Drag forces increase with wellbore diameter, but more significantly with the effective viscosity and speed of the creeping salt layer. The potential drag forces on cased wellbores in moving salt sheets are estimated analytically using a range of salt viscosities (10^15-10^19 Pa s) and creep rates (0-10 mm/yr). Drag on a perfectly rigid casing of infinite strength may reach up to 13 GN per meter of wellbore length in salt with a viscosity of 10^19 Pa s. Well designers may delay stress accumulation due to salt drag when flexible casing accommodates some of the early displacement and strain. However, creeping salt could eventually displace, fracture and disconnect any well casing. The shear strength of typical heavy-duty well casing (about 1000 MPa) can be reached due to drag by moving salt. Internal flow of salt will then fracture the casing near the salt entry and exit points, but the structural damage is likely to remain unnoticed early in the well life, when the horizontal shift of the wellbore is still negligibly small (less than 1 cm/yr). Disruption of casing and production flow lines within the anticipated service lifetime of a well remains a significant risk factor within distinct zones of low-viscosity salt, which may reach ultrafast creep rates of 100 mm/yr.
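The scale of these drag forces can be illustrated with a crude shear-stress estimate, τ = μ·v/L, integrated over the casing perimeter. This is only an order-of-magnitude scaling with an assumed shear-zone length, not the paper's analytical solution:

```python
# Order-of-magnitude scaling for viscous drag on a rigid cased wellbore in
# creeping salt. The shear-zone length scale L over which salt shears past
# the casing is assumed; the paper derives the drag analytically.
import math

SECONDS_PER_YEAR = 3.15576e7

def drag_per_meter(mu_pa_s, creep_mm_yr, diameter_m, shear_zone_m):
    """Approximate drag per meter of casing (N/m):
    tau = mu * v / L, integrated over the perimeter pi * D."""
    v = creep_mm_yr * 1e-3 / SECONDS_PER_YEAR   # salt velocity, m/s
    tau = mu_pa_s * v / shear_zone_m            # viscous shear stress, Pa
    return tau * math.pi * diameter_m

# Salt at 1e17 Pa*s creeping 10 mm/yr past a 0.3 m casing over ~1 m:
f_drag = drag_per_meter(1e17, 10.0, 0.3, 1.0)
print(f"{f_drag:.3g} N/m")
```

Even this mid-range viscosity yields drags of order 10^7 N/m; pushing the viscosity toward 10^19 Pa s drives the estimate toward the giganewton-per-meter magnitudes quoted in the abstract.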

  13. Determination of transient temperature distribution inside a wellbore considering drill string assembly and casing program

    International Nuclear Information System (INIS)

    Yang, Mou; Zhao, Xiangyang; Meng, Yingfeng; Li, Gao; Zhang, Lin; Xu, Haiming; Tang, Daqian

    2017-01-01

Highlights: • Heat transfer models were developed for different wellbore conditions. • Drill string assembly and casing programs impact down-hole temperatures. • Thermal performance during the circulation and shut-in stages was investigated in depth. • The full-scale model coincided well with measured field data. - Abstract: Heat exchange efficiency between each region of the wellbore-formation system is influenced by the high thermal conductivity of the drill string and casing, which in turn affects the temperature distribution of the wellbore. Based on the energy conservation principle, Modified Raymond, Simplified and Full-scale models were developed and solved by a fully implicit finite difference method. The results indicated that wellbore and formation temperatures were significantly influenced at the connection points between the drill collar and drill pipe, as well as at the casing shoe. Apart from the near-surface region, little change was observed in the temperature distribution in the cemented section. In the open-hole section, the temperature rapidly decreased during the circulation stage and gradually increased during the shut-in stage. Most importantly, the simulated results from the full-scale model coincided with the measured field data better than the other numerical models. These findings not only confirm the effect of the drill string assembly and casing program on the wellbore and formation temperature distribution, but also contribute to resource exploration, drilling safety and reduced drilling costs.

  14. Micro Mechanics and Microstructures of Major Subsurface Hydraulic Barriers: Shale Caprock vs Wellbore Cement

    Science.gov (United States)

    Radonjic, M.; Du, H.

    2015-12-01

Shale caprocks and wellbore cements are two of the most common subsurface impermeable barriers in the oil and gas industry. Shale rocks constitute more than 60% of the effective natural seals for geologic hydrocarbon-bearing formations. Wellbore cements provide zonal isolation as an engineered hydraulic barrier to ensure controlled fluid flow from the reservoir to the production facilities. Shale caprocks were formed by the expulsion of excess formation water and by mineralogical transformations at different temperatures and pressures. In a similar process, wellbore cements are subjected to compression during expandable tubular operations, which leads to rapid pore water propagation and secondary mineral precipitation within the cement. The focus of this research was to investigate the effect of wellbore cement compression on its microstructure and mechanical properties, together with a preliminary comparison of shale caprocks and hydrated cement. The purpose of this comparative evaluation of engineered versus natural hydraulic barrier materials is to further improve wellbore cement durability in contact with geofluids. Micro-indentation was used to evaluate the change in cement mechanical properties caused by compression. Indentation experiments showed an overall increase in the hardness and Young's modulus of compressed cement. Furthermore, SEM imaging and electron probe microanalysis showed mineralogical alterations and a decrease in porosity. These can be correlated with cement rehydration caused by the microstructural changes resulting from compression. The mechanical properties were also quantitatively compared with those of shale caprock samples in order to investigate the similarities in hydraulic barrier features that could help improve the subsurface application of cement for zonal isolation. The comparison showed that the poro-mechanical characteristics of wellbore cement appear to improve as the inherent pore sizes shift toward smaller values.

  15. Wellbore enlargement investigation: Potential analogs to the Waste Isolation Pilot Plant during inadvertent intrusion of the repository

    International Nuclear Information System (INIS)

    Boak, D.M.; Dotson, L.; Aguilar, R.

    1997-01-01

This study involved the evaluation and documentation of cases in which petroleum wellbores were enlarged beyond the nominal hole diameter as a consequence of erosion during exploratory drilling, particularly as a function of gas flow into the wellbore during blowout conditions. A primary objective was to identify analogs to potential wellbore enlargement at the Waste Isolation Pilot Plant (WIPP) during inadvertent human intrusion. Secondary objectives were to identify drilling scenarios associated with enlargement, determine the physical extent of enlargement, and establish the physical properties of the formation in which the enlargement occurred. No analogs of sufficient quality to establish quantitative limits on wellbore enlargement at the WIPP disposal system were identified. However, some information was obtained regarding the frequency of petroleum well blowouts and the likelihood that such blowouts would bridge downhole, self-limiting the surface release of disposal-system material. Further work would be necessary, however, to determine the conditions under which bridging could occur and the extent to which the bridging might be applicable to WIPP. In addition, data on casing sizes of petroleum boreholes in the WIPP vicinity support the use of a 12-1/4 inch borehole size in WIPP performance assessment calculations. Finally, although data are limited, there was no evidence of significant wellbore enlargement in any of three blowouts that occurred in wellbores in the Delaware Basin (South Culebra Bluff Unit No. 1, Energy Research and Development Administration (ERDA) 6, and WIPP 12).

  16. Stick-slip and Torsional Friction Factors in Inclined Wellbores

    Directory of Open Access Journals (Sweden)

    Aarsnes Ulf Jakob F.

    2018-01-01

The model is shown to have a good match with the surface and downhole behavior of two deviated wellbores at depths ranging from 1500 to 3000 meters. In particular, the model replicates the amplitude and period of the oscillations, in both the topside torque and the downhole RPM, caused by along-string stick-slip. It is further shown that, by using the surface behavior of the drill string during rotational startup, an estimate of the static and dynamic friction factors along the wellbore can be obtained even during stick-slip oscillations, provided axial tension in the drill string is considered. This presents a possible method for estimating friction factors in the field when off-bottom stick-slip is encountered, and points toward avoiding stick-slip through the design of an appropriate torsional start-up procedure, without the need for an explicit friction test.

  17. Optimum position for wells producing at constant wellbore pressure

    Energy Technology Data Exchange (ETDEWEB)

    Camacho-Velazquez, R.; Rodriguez de la Garza, F. [Univ. Nacional Autonoma de Mexico, Mexico City (Mexico); Galindo-Nava, A. [Inst. Mexicanos del Petroleo, Mexico City (Mexico)]|[Univ. Nacional de Mexico, Mexico City (Mexico); Prats, M.

    1994-12-31

This paper deals with determining the optimum position of several wells, each producing at a different constant wellbore pressure from a two-dimensional closed-boundary reservoir, so as to maximize cumulative production or total flow rate. To achieve this objective, the authors use an improved version of the analytical solution recently proposed by Rodriguez and Cinco-Ley together with an optimization algorithm based on a quasi-Newton procedure with line search. At each iteration the algorithm approximates the negative of the objective function by a quadratic relation derived from a Taylor series. The solution of Rodriguez and Cinco-Ley is improved in four ways. First, an approximation is obtained that works better at early times (before the boundary-dominated period starts) than the previous solution. Second, the infinite sums present in the solution are expressed in condensed form, which is relevant for reducing computer time when the optimization algorithm is used. Third, the solution is modified to account for wells that start producing at different times, which makes it possible to determine the optimum positions for an infill drilling program. Last, the solution is extended to allow changing the wellbore pressure of, or stimulating, any of the wells at any time. When the wells produce at different wellbore pressures, the optimum position is found to be a function of time; otherwise the optimum position is fixed.
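The optimization loop described above can be illustrated with a toy surrogate: maximize an invented "cumulative production" function over well position inside a closed (unit-square) boundary. Plain gradient ascent with numerical gradients stands in for the paper's quasi-Newton procedure with line search, and the surrogate is not the Rodriguez and Cinco-Ley solution:

```python
# Toy well-placement optimization. The production surrogate below is
# invented for illustration: it vanishes at the closed boundary and is
# penalized near a fixed existing well at (0.25, 0.25).
import math

def toy_production(xy):
    x, y = xy
    boundary = x * (1 - x) * y * (1 - y)            # zero on the boundary
    interference = 1.0 - math.exp(
        -((x - 0.25) ** 2 + (y - 0.25) ** 2) / 0.05)  # existing-well penalty
    return boundary * interference

def numerical_grad(f, x, h=1e-6):
    """Central-difference gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def ascend(f, x0, step=0.05, iters=300):
    """Fixed-step gradient ascent (stand-in for quasi-Newton + line search)."""
    x = list(x0)
    for _ in range(iters):
        x = [xi + step * gi for xi, gi in zip(x, numerical_grad(f, x))]
    return x

best = ascend(toy_production, [0.6, 0.6])
print([round(v, 2) for v in best])
```

The optimum lands between the reservoir center (favored by the boundary term) and the far corner (favored by interference avoidance), mirroring the trade-off the paper resolves with its analytical pressure solution.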

  18. Coupling of the reservoir simulator TOUGH and the wellbore simulator WFSA

    Energy Technology Data Exchange (ETDEWEB)

    Hadgu, T.; Zimmerman, R.W.; Bodvarsson [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    1995-03-01

    The reservoir simulator TOUGH and the wellbore simulator WFSA have been coupled, so as to allow simultaneous modeling of the flow of geothermal brine in the reservoir as well as in the wellbore. A new module, COUPLE, allows WFSA to be called as a subroutine by TOUGH. The mass flowrate computed by WFSA now serves as a source/sink term for the TOUGH wellblocks. Sample problems are given to illustrate the use of the coupled codes. One of these problems compares the results of the new simulation method to those obtained using the deliverability option in TOUGH. The coupled computing procedure is shown to simulate more accurately the behavior of a geothermal reservoir under exploitation.

  19. Novel Experimental Techniques to Investigate Wellbore Damage Mechanisms

    Science.gov (United States)

    Choens, R. C., II; Ingraham, M. D.; Lee, M.; Dewers, T. A.

    2017-12-01

A new experimental technique with a unique geometry is presented for investigating deformation of simulated boreholes using standard axisymmetric triaxial deformation equipment. The Sandia WEllbore SImulation (SWESI) geometry uses right cylinders of rock 50 mm in diameter and 75 mm in length. An 11.3 mm hole is drilled perpendicular to the axis of the cylinder at the center of the sample to simulate a borehole. The hole is covered with a solid metal cover and sealed with polyurethane. The metal cover can be machined with a high-pressure port to introduce fluids of different chemistries into the borehole at controlled pressures. Samples are deformed in a standard load frame under confinement, allowing a broad range of possible stresses, load paths, and temperatures. Experiments in this study were loaded to the desired confining pressure, then deformed at a constant axial strain rate of 10^-5 s^-1. Two suites of experiments were conducted, on sedimentary and crystalline rock types. The first series was conducted on Mancos Shale, a finely laminated, transversely isotropic rock, with samples cored at three different orientations to the laminations. The second series was conducted on Sierra White granite with different fluid chemistries inside the borehole. Numerical modelling and experimental observations, including CT microtomography, demonstrate that stresses are concentrated around the simulated wellbore and recreate wellbore deformation mechanisms. Borehole strength and damage development depend on anisotropy orientation and fluid chemistry. Observed failure geometries, particularly for Mancos Shale, can be highly asymmetric. These results highlight uncertainties in in-situ stress measurements based on commonly applied borehole-breakout techniques in complicated borehole physico-chemical environments.

  20. Numerical simulation in steam injection wellbores by mechanistic approach; Simulacao numerica do escoamento de vapor em pocos por uma abordagem mecanicista

    Energy Technology Data Exchange (ETDEWEB)

    Souza Junior, J.C. de; Campos, W.; Lopes, D.; Moura, L.S.S. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Thomas, A. Clecio F. [Universidade Estadual do Ceara (UECE), CE (Brazil)

    2008-07-01

This work addresses the development of a mechanistic hydrodynamic and heat transfer model for steam flow in injection wellbores. The problem of two-phase steam flow in wellbores has previously been solved using empirical correlations from the petroleum industry (Lopes, 1986) and the nuclear industry (Moura, 1991). The good performance achieved by the mechanistic models of Ansari (1994), Hasan (1995), Gomez (2000) and Kaya (2001) supports the importance of the mechanistic approach for the steam flow problem in injection wellbores. In this study, the problem is solved by applying a numerical method to the governing equations of steam flow and a marching algorithm to determine the pressure and temperature distributions along the wellbore. A computer code was developed to obtain numerical results, enabling a comparative study against the main models found in the literature. Finally, when compared to available field data, the mechanistic model for downward vertical steam flow in wellbores gave better results than the empirical correlations. (author)
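The marching algorithm can be sketched in its simplest form: step down the wellbore, updating pressure segment by segment from gravity and friction. A single-phase, constant-property fluid stands in for the paper's mechanistic two-phase steam model, and all property values are assumed:

```python
# Minimal pressure-marching sketch for downward flow in an injection well.
# Per segment: dp = (rho*g - f*rho*v**2 / (2*D)) * dz, i.e. hydrostatic
# gain reduced by frictional losses. Constant single-phase properties are
# assumed; the paper's model handles two-phase steam with varying properties.

def march_pressure(p_surface_pa, depth_m, n_steps, rho, v, f_fric, d_pipe):
    """Return the pressure profile from surface to total depth (Pa)."""
    g = 9.81
    dz = depth_m / n_steps
    p = p_surface_pa
    profile = [p]
    for _ in range(n_steps):
        dp_dz = rho * g - f_fric * rho * v**2 / (2.0 * d_pipe)
        p += dp_dz * dz
        profile.append(p)
    return profile

# 1000 m of injection at assumed properties (density, velocity, friction):
prof = march_pressure(2.0e6, 1000.0, 100,
                      rho=800.0, v=3.0, f_fric=0.02, d_pipe=0.076)
print(round(prof[-1] / 1e6, 2), "MPa at bottomhole")  # prints 8.9 MPa ...
```

In the full mechanistic model, each marching step would also update the temperature from wellbore heat losses and re-evaluate the two-phase properties (steam quality, density, friction) before the next segment.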

  1. The Development of a Gas-Liquid Two-Phase Flow Sensor Applicable to CBM Wellbore Annulus.

    Science.gov (United States)

    Wu, Chuan; Wen, Guojun; Han, Lei; Wu, Xiaoming

    2016-11-18

    The measurement of wellbore annulus gas-liquid two-phase flow in CBM (coalbed methane) wells is of great significance for reasonably developing gas drainage and extraction processes, estimating CBM output, judging the operating conditions of CBM wells and analyzing stratum conditions. Hence, a specially designed sensor is urgently needed for real-time measurement of gas-liquid two-phase flow in CBM wellbore annulus. Existing flow sensors fail to meet the requirements of the operating conditions of CBM wellbore annulus due to such factors as an inapplicable measurement principle, larger size, poor sealability, high installation accuracy, and higher requirements for fluid media. Therefore, based on the principle of a target flowmeter, this paper designs a new two-phase flow sensor that can identify and automatically calibrate different flow patterns of two-phase flows. Upon the successful development of the new flow sensor, lab and field tests were carried out, and the results show that the newly designed sensor, with a measurement accuracy of ±2.5%, can adapt to the operating conditions of CBM wells and is reliable for long-term work.
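The target-flowmeter principle underlying the sensor can be sketched for the single-phase case: flow exerts a drag force on a target plate, and velocity is recovered by inverting the drag law. The coefficients below are assumed, and the paper's two-phase flow-pattern identification is not reproduced:

```python
# Target-flowmeter inversion for a single-phase stream: measure the drag
# force F on a target plate and recover velocity from F = 0.5*Cd*rho*v^2*A.
# Cd, densities and areas below are assumed for illustration only.
import math

def velocity_from_drag(force_n, cd, rho, a_target_m2):
    """Invert the drag law F = 0.5 * Cd * rho * v**2 * A for velocity v (m/s)."""
    return math.sqrt(2.0 * force_n / (cd * rho * a_target_m2))

def volumetric_flow(v, a_annulus_m2):
    """Volumetric flow rate (m^3/s) through the annulus cross-section."""
    return v * a_annulus_m2

# 0.5 N of drag on a 4 cm^2 target in water, Cd = 1.2, 30 cm^2 annulus:
v = velocity_from_drag(0.5, 1.2, 1000.0, 4e-4)
q = volumetric_flow(v, 3e-3)
print(round(v, 3), "m/s")  # prints 1.443 m/s
```

The two-phase measurement problem the paper solves is harder: the effective density and drag coefficient depend on the gas-liquid flow pattern, which is why the sensor identifies the pattern and calibrates itself before applying the inversion.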

  2. The Development of a Gas–Liquid Two-Phase Flow Sensor Applicable to CBM Wellbore Annulus

    Science.gov (United States)

    Wu, Chuan; Wen, Guojun; Han, Lei; Wu, Xiaoming

    2016-01-01

    The measurement of wellbore annulus gas–liquid two-phase flow in CBM (coalbed methane) wells is of great significance for reasonably developing gas drainage and extraction processes, estimating CBM output, judging the operating conditions of CBM wells and analyzing stratum conditions. Hence, a specially designed sensor is urgently needed for real-time measurement of gas–liquid two-phase flow in CBM wellbore annulus. Existing flow sensors fail to meet the requirements of the operating conditions of CBM wellbore annulus due to such factors as an inapplicable measurement principle, larger size, poor sealability, high installation accuracy, and higher requirements for fluid media. Therefore, based on the principle of a target flowmeter, this paper designs a new two-phase flow sensor that can identify and automatically calibrate different flow patterns of two-phase flows. Upon the successful development of the new flow sensor, lab and field tests were carried out, and the results show that the newly designed sensor, with a measurement accuracy of ±2.5%, can adapt to the operating conditions of CBM wells and is reliable for long-term work. PMID:27869708

  3. Bayou Choctaw Well Integrity Grading Component Based on Geomechanical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geotechnology & Engineering Dept.

    2016-09-08

This letter report provides a well grading system for the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) site based on geomechanical simulation. The analyses described in this letter were used to evaluate the caverns' geomechanical effect on wellbore integrity, an important component of the well integrity grading system recently developed by Roberts et al. [2015]. Based on these analyses, the wellbores for caverns BC-17 and BC-20 are expected to be significantly impacted by cavern geomechanics, those for BC-18 and BC-19 to be moderately impacted, and the other caverns to be less impacted.

  4. The Development of a Gas–Liquid Two-Phase Flow Sensor Applicable to CBM Wellbore Annulus

    Directory of Open Access Journals (Sweden)

    Chuan Wu

    2016-11-01

    Full Text Available The measurement of wellbore annulus gas–liquid two-phase flow in CBM (coalbed methane) wells is of great significance for reasonably developing gas drainage and extraction processes, estimating CBM output, judging the operating conditions of CBM wells and analyzing stratum conditions. Hence, a specially designed sensor is urgently needed for real-time measurement of gas–liquid two-phase flow in the CBM wellbore annulus. Existing flow sensors fail to meet the requirements of the operating conditions of the CBM wellbore annulus due to such factors as an inapplicable measurement principle, large size, poor sealability, demanding installation accuracy, and strict requirements on fluid media. Therefore, based on the principle of a target flowmeter, this paper designs a new two-phase flow sensor that can identify and automatically calibrate for different flow patterns of two-phase flows. Upon the successful development of the new flow sensor, lab and field tests were carried out, and the results show that the newly designed sensor, with a measurement accuracy of ±2.5%, can adapt to the operating conditions of CBM wells and is reliable for long-term work.

  5. Hydrophysical logging: A new wellbore technology for hydrogeologic and contaminant characterization of aquifers

    International Nuclear Information System (INIS)

    Pedler, W.H.; Williams, L.L.; Head, C.L.

    1992-01-01

    In the continuing search for improved groundwater characterization technologies, a new wellbore fluid logging method has recently been developed to provide accurate and cost-effective hydrogeologic and contaminant characterization of bedrock aquifers. This new technique, termed hydrophysical logging, provides critical information for contaminated site characterization and water supply studies and, in addition, offers advantages over existing industry standards for aquifer characterization. Hydrophysical logging is based on measuring induced electrical conductivity changes in the fluid column of a wellbore by employing advanced downhole water quality instrumentation specifically developed for the dynamic borehole environment. It contemporaneously identifies the locations of water-bearing intervals, the interval-specific inflow rate during pumping, and the in-situ hydrochemistry of the formation waters associated with each producing interval. In addition, by employing a discrete-point downhole fluid sampler during hydrophysical logging, this technique allows evaluation of contaminant concentrations and of the vertical migration of contaminants within the borehole. Recently, hydrophysical logging was applied in a deep bedrock wellbore at an industrial site in New Hampshire contaminated with dense nonaqueous phase liquids (DNAPLs). The results of the hydrophysical logging, conducted as part of a hydrogeologic site investigation and feasibility study, facilitated investigation of the site by indicating that the contamination had not penetrated into deeper bedrock fractures at concentrations of concern. This information was used to focus the pending Remedial Action Plan and to provide a more cost-effective remedial design.

  6. Numerical studies of CO2 and brine leakage into a shallow aquifer through an open wellbore

    Science.gov (United States)

    Wang, Jingrui; Hu, Litang; Pan, Lehua; Zhang, Keni

    2018-03-01

    Industrial-scale geological storage of CO2 in saline aquifers may cause CO2 and brine leakage from abandoned wells into shallow fresh aquifers. This leakage problem involves the flow dynamics in both the wellbore and the storage reservoir. T2Well/ECO2N, a coupled wellbore-reservoir flow simulator, was used to analyze CO2 and brine leakage under different conditions with a hypothetical simulation model in water-CO2-brine systems. Parametric studies on CO2 and brine leakage, including the salinity, excess pore pressure (EPP) and initially dissolved CO2 mass fraction, are conducted to understand the mechanism of CO2 migration. The results show that brine leakage rates increase proportionally with EPP and inversely with the salinity when EPP varies from 0.5 to 1.5 MPa; however, there is no CO2 leakage into the shallow freshwater aquifer if EPP is less than 0.5 MPa. The dissolved CO2 mass fraction shows an important influence on the CO2 plume, as part of the dissolved CO2 becomes a free phase. Scenario simulation shows that the gas lifting effect will significantly increase the brine leakage rate into the shallow freshwater aquifer under the scenario of 3.89% dissolved CO2 mass fraction. The equivalent porous media (EPM) approach used to model the wellbore flow has been evaluated and results show that the EPM approach could either under- or over-estimate brine leakage rates under most scenarios. The discrepancies become more significant if a free CO2 phase evolves. Therefore, a model that can correctly describe the complex flow dynamics in the wellbore is necessary for investigating the leakage problems.
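The equivalent-porous-media (EPM) idea discussed in this record can be illustrated with a deliberately simple single-phase, laminar sketch: an open wellbore of radius r behaves like a Darcy medium of permeability k = r^2/8 acting over the tube cross-section, which reproduces Hagen-Poiseuille flow exactly, and the leakage rate scales proportionally with the driving excess pressure, consistent with the abstract. The record's point is that this equivalence breaks down once a free CO2 phase evolves; all values below are illustrative, not taken from the study.

```python
import math

def poiseuille_rate(r, dp, mu, length):
    """Laminar volumetric flow rate (m^3/s) up an open wellbore of
    radius r (m) driven by excess pressure dp (Pa) over length (m)."""
    return math.pi * r**4 * dp / (8.0 * mu * length)

def epm_rate(r, dp, mu, length):
    """Same rate via the equivalent-porous-media trick: treat the tube
    as a Darcy medium with permeability k = r^2/8 over area pi*r^2."""
    k = r**2 / 8.0                       # equivalent permeability
    area = math.pi * r**2                # tube cross-section
    return k * area * dp / (mu * length)  # Darcy's law

# Illustrative numbers: 0.1 m radius well, 1 MPa EPP, water, 1000 m
q_tube = poiseuille_rate(0.1, 1.0e6, 1.0e-3, 1000.0)
q_epm = epm_rate(0.1, 1.0e6, 1.0e-3, 1000.0)
```

For single-phase laminar flow the two rates agree term by term; the discrepancies the authors report arise from multiphase and inertial effects that the EPM approach cannot capture.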

  7. Understanding acoustic physics in oil and gas wellbores with the presence of ubiquitous geometric eccentricity

    Science.gov (United States)

    Liu, Yang; D'Angelo, Ralph M.; Choi, Gloria; Zhu, Lingchen; Bose, Sandip; Zeroug, Smaine

    2018-04-01

    Once an oil and gas wellbore has been drilled, steel casings and cement slurry are placed to ensure structural support, protection from fluid invasion, and most importantly to provide zonal isolation. The actual wellbore and string structure is rarely concentric but rather is often an eccentric one, especially in deviated boreholes. The term "eccentricity" is used to describe how off-center a casing string is within another pipe or the open-hole. In a typical double-string configuration, the inner casing is eccentered with respect to the outer string which itself is also eccentered within the cylindrical hole. The annuli may or may not be filled with solid cement, and the cement may have liquid-filled channels or be disbonded over localized azimuthal ranges. The complexity of wave propagation along axial intervals is significant in that multiple modes can be excited and detected with characteristics that are affected by the various parameters, including eccentering, in a non-linear fashion. A successful diagnosis of cement flaws largely relies on a thorough understanding of the complex acoustic modal information. The present study employs both modeling and experiments to fully understand the acoustic wave propagation in the complex, fluid-solid nested, cylindrically layered structures, with geometric eccentricities. The experimental results show excellent agreement with the theoretical predictions from newly developed, borehole acoustic modeling approaches. As such, it provides the basis for better understanding the operative wave physics and providing the means for effective inspection methodologies to assess well integrity and zonal isolation of oil wells.

  8. Near Wellbore Hydraulic Fracture Propagation from Perforations in Tight Rocks: The Roles of Fracturing Fluid Viscosity and Injection Rate

    Directory of Open Access Journals (Sweden)

    Seyed Hassan Fallahzadeh

    2017-03-01

    Full Text Available Hydraulic fracture initiation and near-wellbore propagation are governed by complex failure mechanisms, especially in cased perforated wellbores. Various parameters affect such mechanisms, including fracturing fluid viscosity and injection rate. In this study, three different fracturing fluids with viscosities ranging from 20 to 600 Pa·s were used to investigate the effects of varying fracturing fluid viscosities and fluid injection rates on the fracturing mechanisms. Hydraulic fracturing tests were conducted in cased perforated boreholes made in tight 150 mm synthetic cubic samples. A true tri-axial stress cell was used to simulate real far-field stress conditions. In addition, dimensional analyses were performed to relate the results of the lab experiments to field-scale operations. The results indicated that by increasing the fracturing fluid viscosity and injection rate, the fracturing energy increased, and consequently, higher fracturing pressures were observed. However, when the fracturing energy was transferred to a borehole at a faster rate, the fracture initiation angle also increased. This resulted in more curved fracture planes. Accordingly, a new parameter, called fracturing power, was introduced to relate fracture geometry to fluid viscosity and injection rate. Furthermore, it was observed that the presence of casing in the wellbore impacted the stress distribution around the casing in such a way that the fracture propagation deviated from the wellbore vicinity.

  9. Regional-scale advective, diffusive, and eruptive dynamics of CO2 and brine leakage through faults and wellbores

    Science.gov (United States)

    Jung, Na-Hyun; Han, Weon Shik; Han, Kyungdoe; Park, Eungyu

    2015-05-01

    Regional-scale advective, diffusive, and eruptive transport dynamics of CO2 and brine within a natural analogue in the northern Paradox Basin, Utah, were explored by integrating numerical simulations with soil CO2 flux measurements. Deeply sourced CO2 migrates through steeply dipping fault zones to the shallow aquifers predominantly as an aqueous phase. Dense CO2-rich brine mixes with regional groundwater, enhancing CO2 dissolution. Linear stability analysis reveals that CO2 could be dissolved completely within only 500 years. Assigning lower permeability to the fault zones induces fault-parallel movement, feeds up-gradient aquifers with more CO2, and impedes down-gradient fluid flow, developing anticlinal CO2 traps at shallow depths (<300 m). The regional fault permeability that best reproduces the observed spatial CO2 flux variation is estimated to be 1 × 10^-17 ≤ kh < 1 × 10^-16 m^2 and 5 × 10^-16 ≤ kv < 1 × 10^-15 m^2. The anticlinal trap serves as an essential fluid source for eruption at Crystal Geyser. Geyser-like discharge responds sensitively to varying well permeability, radius, and CO2 recharge rate. The cyclic behavior of wellbore CO2 leakage decreases with time.

  10. Relative permeability of fractured wellbore cement: an experimental investigation using electrical resistivity monitoring for moisture content

    Science.gov (United States)

    Um, W.; Rod, K. A.; Strickland, C. E.

    2016-12-01

    Permeability is a critical parameter needed to understand flow in subsurface environments; it is particularly important in deep subsurface reservoirs where multiphase fluid flow is common, such as carbon sequestration and geothermal reservoirs. Cement is used in the annulus of wellbores, because of its low permeability, to seal aquifers and reduce leaks to adjacent strata. The extreme subsurface environments of CO2 storage and geothermal production will eventually reduce the cement integrity, propagating fracture networks and increasing the permeability for air and/or water. To date, no reproducible experimental investigations of relative permeability in fractured wellbore cement have been published. To address this gap, we conducted a series of experiments using fractured Portland cement monoliths with increasing fracture networks. The monolith cylinder sides were jacketed with heavy-duty moisture-seal heat-shrink tubing, then fractured using shear force applied via a hydraulic press. Fractures were generated with different severity for each of three monoliths. Stainless steel endcaps were fixed to the monoliths using the same shrink-wrapped jacket. Fracture characteristics were determined using X-ray microtomography and image analysis. Flow controllers were used to supply continuous water or water plus air, both of which were delivered through the influent end cap. Effluent air flow was monitored using a flow meter, and water flow was measured gravimetrically. To monitor the effective saturation of the fractures, an RCON2 concrete bulk electrical resistivity test device was attached across both endcaps, and a 0.1 M NaNO3 brine was used as the transport fluid to improve resistivity measurements. Water content correlated to resistivity measurements with an r^2 > 0.96. Data from the experiments were evaluated using two relative permeability models, including the Corey curve, often used for modeling relative permeability in porous media
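As an illustration of the Corey-curve approach named in this record, a minimal sketch follows; the endpoint saturations and exponents are placeholders, not values fitted to the fractured-cement data:

```python
import numpy as np

def corey_relperm(sw, swr=0.2, snwr=0.05, nw=4.0, nn=2.0):
    """Corey-type relative permeability curves for a wetting (water)
    and non-wetting (air) phase versus water saturation sw.
    swr/snwr are residual saturations; nw/nn are Corey exponents
    (all illustrative placeholders)."""
    # Effective (normalized) saturation, clipped to [0, 1]
    se = np.clip((sw - swr) / (1.0 - swr - snwr), 0.0, 1.0)
    krw = se ** nw            # wetting-phase relative permeability
    krn = (1.0 - se) ** nn    # non-wetting-phase relative permeability
    return krw, krn

krw, krn = corey_relperm(np.array([0.2, 0.6, 0.95]))
```

At residual water saturation the water curve is zero and the air curve is one; both vary monotonically in between, which is the behavior such experiments fit against measured saturation-flow data.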

  11. Adaptive forward-inverse modeling of reservoir fluids away from wellbores; TOPICAL

    International Nuclear Information System (INIS)

    Ziagos, J P; Gelinas, R J; Doss, S K; Nelson, R G

    1999-01-01

    This Final Report contains the deliverables of the DeepLook Phase I project entitled, "Adaptive Forward-Inverse Modeling of Reservoir Fluids Away from Wellbores". The deliverables are: (i) a description of 2-D test problem results, analyses, and technical descriptions of the techniques used, (ii) a listing of program setup commands that construct and execute the codes for selected test problems (these commands are in mathematical terminology, which reinforces technical descriptions in the text), and (iii) an evaluation and recommendation regarding continuance of this project, including considerations of possible extensions to 3-D codes, additional technical scope, and budget for the out-years. The far-market objective in this project is to develop advanced technologies that can help locate and enhance the recovery of oil from heterogeneous rock formations. The specific technical objective in Phase I was to develop proof-of-concept of new forward and inverse (F-I) modeling techniques [Gelinas et al., 1998] that seek to enhance estimates (images) of formation permeability distributions and fluid motion away from wellbore volumes. This goes to the heart of improving industry's ability to jointly image reservoir permeability and flow predictions of trapped and recovered oil versus time. The estimation of formation permeability away from borehole measurements is an "inverse" problem. It is an inseparable part of modeling fluid flows throughout the reservoir in efforts to increase the efficiency of oil recovery at minimum cost. Classic issues of non-uniqueness, mathematical instability, noise effects, and inadequate numerical solution techniques have historically impeded progress in reservoir parameter estimations.
Because information pertaining to fluid and rock properties is always sampled sparsely by wellbore measurements, a successful method for interpolating permeability and fluid data between the measurements must be: (i) physics-based, (ii) conditioned by signal

  12. Wellbore inertial navigation system (WINS) software development and test results

    Energy Technology Data Exchange (ETDEWEB)

    Wardlaw, R. Jr.

    1982-09-01

    The structure and operation of the real-time software developed for the Wellbore Inertial Navigation System (WINS) application are described. The procedure and results of a field test held in a 7000-ft well in the Nevada Test Site are discussed. Calibration and instrumentation error compensation are outlined, as are design improvement areas requiring further test and development. Notes on Kalman filtering and complete program listings of the real-time software are included in the Appendices. Reference is made to a companion document which describes the downhole instrumentation package.
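The Kalman filtering noted in the Appendices of this report can be illustrated with a generic linear predict/update cycle; this is a textbook sketch, not the WINS implementation, and the matrix names follow standard filter notation:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter.
    x: state estimate, P: state covariance, z: new measurement.
    F: state transition, Q: process noise, H: observation model,
    R: measurement noise (all standard textbook quantities)."""
    # Predict: propagate state and covariance through the model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend prediction with the measurement
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy 1-D example: noisy depth observations of a stationary state
F = np.array([[1.0]]); Q = np.array([[1e-4]])
H = np.array([[1.0]]); R = np.array([[0.1]])
x, P = kalman_step(np.array([0.0]), np.array([[1.0]]),
                   np.array([1.0]), F, Q, H, R)
```

After one update the estimate moves most of the way toward the measurement and the covariance shrinks, which is the mechanism a navigation filter uses to fuse inertial predictions with survey measurements.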

  13. Thermal effects on fluid flow and hydraulic fracturing from wellbores and cavities in low-permeability formations

    Energy Technology Data Exchange (ETDEWEB)

    Yarlong Wang [Petro-Geotech Inc., Calgary, AB (Canada); Papamichos, Euripides [IKU Petroleum Research, Trondheim (Norway)

    1999-07-01

    The coupled heat-fluid-stress problem of a circular wellbore or spherical cavity subjected to a constant temperature change and a constant fluid flow rate is considered. Transient analytical solutions for temperature, pore pressure and stress are developed by coupling conductive heat transfer with Darcy fluid flow in a poroelastic medium. They are applicable to lower-permeability porous media suitable for liquid-waste disposal, and also to simulating reservoirs for enhanced oil recovery, where conduction dominates the heat transfer process. A full range of solutions is presented showing separately the effects of temperature and fluid flow on pore pressure and stress development. It is shown that injection of warm fluid can be used to restrict fracture development around wellbores and cavities and generally to optimise a fluid injection operation. Both the limitations of the solutions and the convective flow effect are addressed. (Author)

  14. Exploring the hole cleaning parameters of horizontal wellbore using two-phase Eulerian CFD approach

    Directory of Open Access Journals (Sweden)

    Satish K Dewangan

    2016-03-01

    Full Text Available The present investigation deals with flow through a concentric annulus with the inner cylinder in rotation. This work is of importance to the petroleum industry in relation to wellbore drilling, where hole cleaning is a serious problem, especially in horizontal drilling. The effects of various parameters that affect hole cleaning, such as slurry flow velocity, inner cylinder rotational speed and inlet solid concentration, were discussed. Their influence on the pressure drop, wall shear stress, mixture turbulence kinetic energy, solid-phase velocity and slip velocity, which are responsible for the solid-phase distribution, was analyzed. The flow was considered to be steady, incompressible two-phase slurry flow with water as the carrier fluid and silica sand as the secondary phase. The Eulerian approach was used for modeling the slurry flow. The silica sand was considered to be of spherical shape with a particle size of 180 µm. ANSYS FLUENT software was used for modeling and solution. Plotting was done using Tecplot software and Microsoft Office.
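A rough feel for the solid-phase transport in such slurries comes from the terminal settling velocity of the 180 µm silica sand. The back-of-envelope Stokes sketch below (water at roughly 20 °C, sand density assumed 2650 kg/m³, values not from the study) is only indicative, since the resulting particle Reynolds number exceeds the strict Stokes validity limit of Re < 1:

```python
def stokes_settling_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity (m/s) of a sphere in Stokes flow:
    v = g * d^2 * (rho_p - rho_f) / (18 * mu).
    Strictly valid only for particle Reynolds number below ~1."""
    return g * d**2 * (rho_p - rho_f) / (18.0 * mu)

# 180 um silica sand (assumed 2650 kg/m^3) in water (1000 kg/m^3, 1 mPa.s)
v = stokes_settling_velocity(180e-6, 2650.0, 1000.0, 1.0e-3)
re = 1000.0 * v * 180e-6 / 1.0e-3   # particle Reynolds number check
```

The estimate comes out around 3 cm/s with Re of order 5, so a drag correlation beyond Stokes (as embedded in the Eulerian CFD model) is needed for quantitative work; the sketch only indicates why low annular velocities let cuttings form a bed in horizontal sections.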

  15. Impact of Casing Expansion on the Mechanical and Petro-Physical Properties of Wellbore Cements

    Science.gov (United States)

    Oyibo, A. E.

    2014-12-01

    The main objective of this research is to investigate the applicability of expandable casing technology as a remediation technique for leaky wells with gas migration problems. A microannulus is usually created at the cement-formation/cement-casing interface or within the cement matrix, either due to poor primary cementing or as a result of activities such as temperature and pressure variation or fracturing operations. Recent reports on gas migration in hydraulically fractured wellbores have raised concerns about the contamination of fresh water aquifers resulting from fluid migration through this flow path. A unique bench-scale physical model which utilizes expandable tubulars in the remediation of micro-annular gas flow has been used to simulate expansion of a previously cemented casing under field-like conditions. Three different cement slurry designs were used: a regular 16.4 lb/gal slurry, a 16.4 lb/gal base slurry foamed to 13 lb/gal, and a 16.4 lb/gal slurry with 10% salt concentration. A gas flow path (microannulus) was artificially created at the pipe-cement interface by rotating the inner pipe in a pipe-inside-pipe assembly, with cement in the annulus, within the first few hours of hydration to create debonding at the cement-casing interface. Nitrogen gas flow-through experiments were performed before and after the expansion to confirm the sealing of the microannulus. The results obtained confirmed the effectiveness of this technique in the complete closure of the gas leakage path, providing a seal-tight cement-formation interface free of microannulus. The manipulation of the cement sheath during the casing expansion resulted in improved porosity, permeability and strength of the cement sheath. SEM micrographs revealed a decrease in pore size and fracturing of unhydrated cement grains within the cement matrix. This technology has great potential to become one of the leading cement remediation techniques for leaks behind the casing if implemented. Keywords: Wellbore

  16. Polymer nanocomposites for sealing microannulus cracks in wellbores cement-steel interface

    Science.gov (United States)

    Genedy, M.; Fernandez, S. G.; Stormont, J.; Matteo, E. N.; Dewers, T. A.; Reda Taha, M.

    2017-12-01

    Seal integrity of production and storage wellbores has become a critical challenge with the increasing oil and gas leakage incidents. The general consensus is that one of the potential leakage pathways is micro-annuli at the cement-steel interface. In this paper, we examine the efficiency of proposed polymer nanocomposite to seal microannulus cracks at the cement-steel interface. The repair material efficiency is defined as the ability of the repair material to reduce or eliminate the gas permeability of the cement-steel interface. The flow rate of an inert gas (Nitrogen) at the cement-steel interface was investigated for three cases: 1) repaired test samples with traditional repair material (microfine cement), 2) polymer nanocomposites, and 3) unrepaired test samples. Flow rates were measured and compared for all three cases. The experimental results show up to 99.5% seal efficiency achieved by using polymer nanocomposites compared to 20% efficiency achieved in the case of microfine cement. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8094 A.

  17. Measurement of flowing water salinity within or behind wellbore casing

    International Nuclear Information System (INIS)

    Arnold, D.M.

    1981-01-01

    Water flowing within or behind a wellbore casing is irradiated with 14 MeV neutrons from a source in a downhole sonde. Gamma radiation from the isotope nitrogen-16, induced by the 16O(n,p)16N reaction, and from the products of either the 23Na(n,α)20F or the 37Cl(n,α)34P reaction, is measured in intensity and energy with detectors in the sonde. From the gamma radiation measurements, the relative abundance of oxygen to at least one of sodium or chlorine in the water is measured, and from this measurement the salinity of the water is determined. (author)
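The inversion from a gamma count ratio to salinity can be sketched for the chlorine channel under a deliberately simplified assumption that count rates are proportional to elemental mass in the water. The calibration constant k and the linear proportionality are hypothetical; real logging tools are calibrated empirically against known brines:

```python
def salinity_from_cl_o_ratio(r_counts, k=1.0):
    """Invert a chlorine-to-oxygen gamma count ratio into an NaCl mass
    fraction, assuming count rates proportional to elemental mass.
    k is a hypothetical tool calibration constant."""
    M_CL_NACL = 35.45 / 58.44   # mass fraction of Cl in NaCl
    M_O_H2O = 16.0 / 18.0       # mass fraction of O in water
    # Model: r = k * (s * M_CL_NACL) / ((1 - s) * M_O_H2O); solve for s
    a = (r_counts / k) * M_O_H2O / M_CL_NACL
    return a / (1.0 + a)
```

Under this toy model a 10% NaCl brine maps to a count ratio near 0.076 (for k = 1), and the inversion recovers the mass fraction exactly; the record's actual measurement additionally has to handle spectral overlap and borehole geometry.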

  18. Methodology to predict the initiation of multiple transverse fractures from horizontal wellbores

    Energy Technology Data Exchange (ETDEWEB)

    Crosby, D. G.; Yang, Z.; Rahman, S. S. [Univ. of New South Wales (Australia)

    2001-10-01

    The criterion based on Drucker and Prager which is designed to predict the pressure required to initiate secondary multiple transverse fractures in close proximity to primary fractures is discussed. Results based on this criterion compare favorably with those measured during a series of laboratory-scale hydraulic fracture interaction tests. It is concluded that the multiple fracture criterion and laboratory results demonstrate that transversely fractured horizontal wellbores have a limited capacity to resist the initiation of multiple fractures from adjacent perforations, or intersecting induced and natural fractures. 23 refs., 1 tab., 9 figs.

  19. Gas Migration Project: Risk Assessment Tool and Computational Analyses to Investigate Wellbore/Mine Interactions, Secretary's Potash Area, Southeastern New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, Steven R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geomechanics Dept.; Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis Dept.; Rechard, Robert P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Waste Disposal Research and Analysis Dept.

    2016-05-01

    The Bureau of Land Management (BLM), US Department of the Interior has asked Sandia National Laboratories (SNL) to perform scientific studies relevant to technical issues that arise in the development of co-located resources of potash and petroleum in southeastern New Mexico in the Secretary’s Potash Area. The BLM manages resource development, issues permits and interacts with the State of New Mexico in the process of developing regulations, in an environment where many issues are disputed by industry stakeholders. The present report is a deliverable of the study of the potential for gas migration from a wellbore to a mine opening in the event of wellbore leakage, a risk scenario about which there is disagreement among stakeholders and little previous site specific analysis. One goal of this study was to develop a framework that required collaboratively developed inputs and analytical approaches in order to encourage stakeholder participation and to employ ranges of data values and scenarios. SNL presents here a description of a basic risk assessment (RA) framework that will fulfill the initial steps of meeting that goal. SNL used the gas migration problem to set up example conceptual models, parameter sets and computer models and as a foundation for future development of RA to support BLM resource development.

  20. Improving wellbore position accuracy of horizontal wells by using a continuous inclination measurement from a near bit inclination MWD sensor

    Energy Technology Data Exchange (ETDEWEB)

    Berger, P. E.; Sele, R. [Baker Hughes INTEQ (United States)

    1998-12-31

    Wellbore position calculations are typically performed by measuring azimuth and inclination at 10 to 30 meter intervals and using interpolation techniques to determine the borehole position between survey stations. The input parameters are measured depth (MD), azimuth and inclination, where the latter two are measured with an MWD tool. Output parameters are the geometric coordinates: true vertical depth (TVD), north and east. Improving the accuracy of the inclination measurement reduces the uncertainty of the calculated TVD value, resulting in increased confidence in wellbore position. Significant improvements in quality control can be achieved by using multiple sensors. This paper describes a set of quality control parameters that can be used to verify individual sensor performance and a method for calculating TVD uncertainty in horizontal wells, using a single sensor or a combination of sensors. 6 refs., 5 figs.
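The station-to-station interpolation described above is conventionally done with the minimum-curvature method; the sketch below uses the standard formulas (dogleg angle and ratio factor), not Baker Hughes INTEQ's specific implementation:

```python
import math

def min_curvature_step(md1, inc1, azi1, md2, inc2, azi2):
    """Position change (north, east, TVD, in meters) between two survey
    stations via the standard minimum-curvature method.
    Inclination/azimuth in degrees, measured depth in meters."""
    i1, a1 = math.radians(inc1), math.radians(azi1)
    i2, a2 = math.radians(inc2), math.radians(azi2)
    # Dogleg angle between the station tangent vectors
    cos_dl = (math.cos(i2 - i1)
              - math.sin(i1) * math.sin(i2) * (1.0 - math.cos(a2 - a1)))
    dl = math.acos(max(-1.0, min(1.0, cos_dl)))
    # Ratio factor corrects the balanced-tangential step for curvature
    rf = 1.0 if dl < 1e-9 else (2.0 / dl) * math.tan(dl / 2.0)
    half = (md2 - md1) / 2.0 * rf
    dnorth = half * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2))
    deast = half * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2))
    dtvd = half * (math.cos(i1) + math.cos(i2))
    return dnorth, deast, dtvd
```

A purely vertical 30 m interval yields 30 m of TVD and no lateral offset, while a horizontal interval yields lateral displacement only; the TVD term's dependence on cos(inclination) is why inclination accuracy dominates TVD uncertainty in horizontal wells.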

  1. Method for measurement of flowing water salinity within or behind wellbore casing

    International Nuclear Information System (INIS)

    Arnold, D.M.

    1986-01-01

    Water flowing within or behind a wellbore casing is irradiated with 14 MeV neutrons from a source in a downhole sonde. Gamma radiation from the isotope nitrogen-16, induced by the 16O(n,p)16N reaction, and from the products of either the 23Na(n,α)20F or the 37Cl(n,α)34P reaction, is measured in intensity and energy with detectors in the sonde. From the gamma radiation measurements, the relative abundance of oxygen to at least one of sodium or chlorine in the water is measured, and from this measurement the salinity of the water is determined.

  2. A mechanistic model of heat transfer for gas-liquid flow in vertical wellbore annuli.

    Science.gov (United States)

    Yin, Bang-Tang; Li, Xiang-Fang; Liu, Gang

    2018-01-01

    The most prominent aspect of multiphase flow is the variation in the physical distribution of the phases in the flow conduit, known as the flow pattern. Several different flow patterns can exist under different flow conditions, with significant effects on liquid holdup, pressure gradient and heat transfer. Gas-liquid two-phase flow in an annulus can be found in a variety of practical situations. In high-rate oil and gas production, it may be beneficial to flow fluids vertically through the annular configuration between well tubing and casing. The flow patterns in annuli differ from those in pipe flow: both casing and tubing liquid films are present in slug flow and annular flow in the annulus. Multiphase heat transfer depends on the hydrodynamic behavior of the flow, yet very few research results on multiphase heat transfer in wellbore annuli can be found in the open literature. A mechanistic model of multiphase heat transfer is developed for different flow patterns of upward gas-liquid flow in vertical annuli. The required local flow parameters are predicted by use of the hydraulic model of steady-state multiphase flow in wellbore annuli recently developed by Yin et al. The modified heat-transfer model for single gas or liquid flow is verified by comparison with Manabe's experimental results. For different flow patterns, it is compared with a modified version of the unified Zhang et al. model based on representative diameters.

  3. Design of Fit-for-Purpose Cement to Restore Cement-Caprock Seal Integrity

    Science.gov (United States)

    Provost, R.

    2015-12-01

    This project aims to study critical research needs in the area of rock-cement interfaces, with a special focus on crosscutting applications in the Wellbore Integrity Pillar of the SubTER initiative. This study focuses on the design and testing of fit-for-purpose cement formulations. The goals of this project are as follows: 1) perform a preliminary study of dispersing nanomaterial admixtures in Ordinary Portland Cement (OPC) mixes, 2) characterize the cement-rock interface, and 3) identify potential high-performance cement additives that can improve sorption behavior, chemical durability, bond strength, and interfacial fracture toughness, as appropriate to specific subsurface operational needs. The work presented here focuses on a study of cement-shale interfaces to better understand failure mechanisms, with particular attention to measuring bond strength at the cement-shale interface. Both experimental testing and computational modeling were conducted to determine the mechanical behavior at the interface, representing the interaction of cement and shale in a typical wellbore environment. Cohesive zone elements are used in the finite element method to computationally simulate the interface of the cement and rock materials with varying properties. Understanding the bond strength and mechanical performance of the cement-formation interface is critical to wellbore applications such as sequestration, oil and gas production and exploration, and nuclear waste disposal. Improved shear bond strength is an indication of the capability of the interface to ensure zonal isolation and prevent zonal communication, two crucial goals in preserving wellbore integrity. Understanding shear bond strength development and interface mechanics will provide an idea as to how the cement-formation interface can be altered under environmental changes (temperature, pressure, chemical degradation, etc.) so that the previously described objectives can be achieved. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  4. Near-wellbore modeling of a horizontal well with Computational Fluid Dynamics

    DEFF Research Database (Denmark)

    Szanyi, Márton L.; Hemmingsen, Casper Schytte; Yan, Wei

    2018-01-01

    Computational Fluid Dynamics (CFD) is capable of modeling the complex interaction between the creeping reservoir flow and turbulent well flow for single phases, while capturing both the completion geometry and formation damage. A series of single-phase steady-state simulations are undertaken, using such fully coupled three-dimensional numerical models, to predict the inflow to the well. The present study considers the applicability of CFD for near-wellbore modeling through benchmark cases with available analytical solutions. Moreover, single-phase steady-state numerical investigations are performed on a specific perforated horizontal well producing from the Siri field, offshore Denmark. The performance of the well is investigated with an emphasis on the inflow profile and the productivity index for different formation damage scenarios. A considerable redistribution of the inflow profile was found when the filtrate invasion

  5. Stress estimation in reservoirs using an integrated inverse method

    Science.gov (United States)

    Mazuyer, Antoine; Cupillard, Paul; Giot, Richard; Conin, Marianne; Leroy, Yves; Thore, Pierre

    2018-05-01

    Estimating the stress in reservoirs and their surroundings prior to production is a key issue for reservoir management planning. In this study, we propose an integrated inverse method to estimate such an initial stress state. The 3D stress state is constructed with the displacement-based finite element method, assuming linear isotropic elasticity and small perturbations in the current geometry of the geological structures. The Neumann boundary conditions are defined as piecewise linear functions of depth. The discontinuous functions are determined with the CMA-ES (Covariance Matrix Adaptation Evolution Strategy) optimization algorithm to fit wellbore stress data deduced from leak-off tests and breakouts. Because the geological history is disregarded and the rheological assumptions are simplified, only a stress field that is statically admissible and matches the wellbore data should be exploited. The spatial domain of validity of this statement is assessed by comparing the stress estimations for a synthetic folded structure of finite amplitude with a history constructed assuming a viscous response.
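
    CMA-ES itself maintains a full covariance model of the search distribution; as a rough illustration of the evolutionary fitting idea only, here is a bare-bones (1+1) evolution strategy matching a toy linear stress-gradient forward model to synthetic wellbore data (the paper couples the optimizer to a 3D finite element model, not to this toy):

```python
import random

def misfit(params, depths_km, observed):
    """Toy forward model: stress = a + b * depth. A stand-in for the FEM."""
    a, b = params
    return sum((a + b * z - s) ** 2 for z, s in zip(depths_km, observed))

def es_minimize(f, x0, sigma=1.0, iters=3000, seed=1):
    """Bare-bones (1+1) evolution strategy with 1/5th-style step adaptation;
    a much-simplified stand-in for CMA-ES."""
    rng = random.Random(seed)
    best = list(x0)
    fbest = f(best)
    for _ in range(iters):
        cand = [x + rng.gauss(0.0, sigma) for x in best]
        fcand = f(cand)
        if fcand < fbest:
            best, fbest = cand, fcand
            sigma *= 1.1   # success: widen the search
        else:
            sigma *= 0.98  # failure: tighten the search
    return best, fbest
```

Fed synthetic data generated with a stress of 10 + 20 z (z in km), the strategy recovers both intercept and gradient closely; CMA-ES adds covariance adaptation, which matters for the ill-conditioned, correlated parameters of a real boundary-condition fit.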

  6. Equivalent Circulation Density Analysis of Geothermal Well by Coupling Temperature

    Directory of Open Access Journals (Sweden)

    Xiuhua Zheng

    2017-02-01

    Accurate control of the wellbore pressure not only prevents lost circulation/blowout and formation fracturing by managing the density of the drilling fluid, but also improves productivity by mitigating reservoir damage. Calculating the wellbore pressure of a geothermal well with constant fluid parameters easily introduces large errors, because the changes of the physical, rheological and thermal properties of drilling fluids with temperature are neglected. This paper studies wellbore pressure coupling by calculating the temperature distribution with an existing model, fitting the variation of drilling-fluid density with temperature, and establishing mathematical models to simulate the wellbore pressures, which are expressed as the variation of Equivalent Circulating Density (ECD) under different conditions. With this method, the temperature and ECDs in the wellbore of the first medium-deep geothermal well, ZK212 in the Yangyi Geothermal Field in Tibet, were determined, and a sensitivity analysis was carried out for assumed parameters, i.e., the circulating time, flow rate, geothermal gradient, wellbore diameter, and rheological models and regimes. The results indicated that the geothermal gradient and flow rate were the most influential parameters on the temperature and ECD distributions, and that additives should be added to the drilling fluid carefully, as they change its properties and induce a redistribution of temperature. To ensure safe drilling and an appropriate velocity for pipes tripping into the hole, the depth and diameter of the wellbore are considered to control the surge pressure.
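
    Although the coupled temperature model is involved, the bookkeeping behind an ECD value is simple: the static density plus the annular friction loss expressed as an extra density head. A minimal SI-unit sketch, with an illustrative linear density-temperature fit (the coefficients are assumptions, not the paper's):

```python
G = 9.81  # gravitational acceleration, m/s^2

def mud_density(rho0, alpha, t, t0=20.0):
    """Illustrative linear thermal-expansion fit for drilling-fluid density.
    rho0 [kg/m^3] at reference temperature t0 [deg C]; alpha [1/K]."""
    return rho0 * (1.0 - alpha * (t - t0))

def ecd(static_density, annular_dp, tvd):
    """Equivalent circulating density [kg/m^3]: static density plus the
    annular friction pressure loss divided by the hydrostatic head g*TVD."""
    return static_density + annular_dp / (G * tvd)
```

With a 0.2 MPa annular loss over 2000 m of true vertical depth, the ECD sits roughly 10 kg/m^3 above the static density, which is the kind of margin the abstract's sensitivity analysis explores.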

  7. Incorporating electrokinetic effects in the porochemoelastic inclined wellbore formulation and solution

    Directory of Open Access Journals (Sweden)

    Vinh X. Nguyen

    2010-03-01

    Porochemoelectroelastic analytical models and solutions have been used to describe the response of chemically active and electrically charged saturated porous media such as clays, shales, and biological tissues. However, these attempts have been restricted to one-dimensional consolidation problems, which are very limited in practice and not general enough to serve as benchmark solutions for numerical validation. This work summarizes the general linear porochemoelectroelastic formulation and presents the solution for an inclined wellbore drilled in a fluid-saturated, chemically active and ionized formation, such as shale, subjected to a three-dimensional in-situ state of stress. The analytical solution for this geometry incorporates the coupled solid deformation and simultaneous fluid/ion flows induced by the combined influences of pore pressure, chemical potential, and electrical potential gradients under isothermal conditions. The formation pore fluid is modeled as an electrolyte solution comprised of a solvent and one type of dissolved cation and anion. The analytical approach also integrates into the solution the quantitative use of the cation exchange capacity (CEC) commonly obtained from laboratory measurements on shale samples. The results for stresses and pore pressure distributions due to the coupled electrochemical effects are illustrated and plotted in the vicinity of the inclined wellbore and compared with the classical porochemoelastic and poroelastic solutions.

  8. Effect of Matrix-Wellbore Flow and Porosity on Pressure Transient Response in Shale Formation Modeling by Dual Porosity and Dual Permeability System

    Directory of Open Access Journals (Sweden)

    Daolun Li

    2015-01-01

    A mathematical dual-porosity and dual-permeability numerical model based on a perpendicular bisection (PEBI) grid is developed to describe gas flow behavior in shale-gas reservoirs, incorporating slippage-corrected permeability and the adsorbed gas effect. Parametric studies are conducted for a horizontal well with multiple infinite-conductivity hydraulic fractures in a shale-gas reservoir to investigate the effects of matrix-wellbore flow, natural fracture porosity, and matrix porosity. We find that the ratio of fracture permeability to matrix permeability approximately determines the bottom-hole pressure (BHP) error caused by omitting the flow between matrix and wellbore, and that the effect of matrix porosity on BHP is related to the adsorbed gas content. When adsorbed gas accounts for a large proportion of the total gas storage in the shale formation, matrix porosity has only a very small effect on BHP; otherwise, it has an obvious influence. This paper can help us understand the complex pressure transient response due to the existence of adsorbed gas and help petroleum engineers interpret field data better.

  9. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics.

  10. Defining the Brittle Failure Envelopes of Individual Reaction Zones Observed in CO2-Exposed Wellbore Cement.

    Science.gov (United States)

    Hangx, Suzanne J T; van der Linden, Arjan; Marcelis, Fons; Liteanu, Emilia

    2016-01-19

    To predict the behavior of the cement sheath after CO2 injection and the potential for leakage pathways, it is key to understand how the mechanical properties of the cement evolve with CO2 exposure time. We performed scratch-hardness tests on hardened samples of class G cement before and after CO2 exposure. The cement was exposed to CO2-rich fluid for one to six months at 65 °C and 8 MPa total pressure. Detailed SEM-EDX analyses showed reaction zones similar to those previously reported in the literature: (1) an outer, reacted, porous silica-rich zone; (2) a dense, carbonated zone; and (3) a more porous, Ca-depleted inner zone. The quantitative mechanical data (brittle compressive strength and friction coefficient) obtained for each of the zones suggest that the heterogeneity of reacted cement leads to a wide range of brittle strength values in any of the reaction zones, with only a rough dependence on exposure time. However, the data can be used to guide the numerical modeling efforts needed to assess the impact of reaction-induced mechanical failure of wellbore cement by coupling sensitivity analysis and mechanical predictions.

  11. High-Resolution Wellbore Temperature Logging Combined with a Borehole-Scale Heat Budget: Conceptual and Analytical Approaches to Characterize Hydraulically Active Fractures and Groundwater Origin

    Directory of Open Access Journals (Sweden)

    Guillaume Meyzonnat

    2018-01-01

    This work aims to provide an overview of the thermal processes that shape wellbore temperature profiles under static and dynamic conditions. Understanding of the respective influences of advective and conductive heat fluxes is improved through the use of a new heat budget at the borehole scale. Keeping these thermal processes in mind, a qualitative interpretation of the temperature profiles allows the occurrence, position, and origin of groundwater flowing into wellbores from hydraulically active fractures to be constrained. With the use of a heat budget developed at the borehole scale, temperature logging efficiency has been quantitatively enhanced, and inflow temperatures can be calculated through the simultaneous use of a flowmeter. Under certain hydraulic or pumping conditions, both inflow intensities and associated temperatures can also be directly modelled from temperature data and the heat budget. Theoretical and applied examples of the heat budget application are provided. Applied examples are shown using high-resolution temperature logging, spinner flow metering, and televiewing for three wells installed in fractured bedrock aquifers in the St-Lawrence Lowlands, Quebec, Canada. Through relatively rapid manipulations, thermal measurements in such cases can be used to detect the intervals or discrete positions of hydraulically active fractures in wellbores, as well as the existence of ambient flows, with a high degree of sensitivity, even at very low flows. Heat budget calculations at the borehole scale during pumping indicate that heat advection fluxes rapidly dominate over heat conduction fluxes at the borehole wall. The full characterization of inflow intensities provides information about the distribution of hydraulic properties with depth. The full knowledge of inflow temperatures indicates horizons that are drained from within the aquifer, providing advantageous information on the depth from which...
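
    When advection dominates, the borehole-scale heat budget for several inflows reduces to flow-weighted temperature averaging; a minimal sketch (the inflow rates and temperatures below are hypothetical):

```python
def mixed_temperature(inflows):
    """Flow-weighted mixing temperature of wellbore inflows.

    inflows : list of (rate, temperature) pairs, one per hydraulically
              active fracture; rates in any consistent unit.
    """
    q_total = sum(q for q, _ in inflows)
    if q_total == 0.0:
        raise ValueError("no flow: mixing temperature is undefined")
    return sum(q * t for q, t in inflows) / q_total
```

Conversely, measuring the mixed temperature above and below a fracture together with flowmeter rates lets the inflow temperature of that fracture be back-calculated, which is the quantitative use of the heat budget described in the abstract.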

  12. Subsurface fracture mapping from geothermal wellbores. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Hartenbaum, B.A.; Rawson, G.

    1983-08-01

    To advance the state of the art in Hot Dry Rock technology, an evaluation is made of (1) the use of both electromagnetic and acoustic radar to map far-field fractures, (2) the use of more than twenty different conventional well logging tools to map borehole-fracture intercepts, (3) the use of magnetic dipole ranging to determine the relative positions of the injection well and the production well within the fractured zone, (4) the use of passive microseismic methods to determine the orientation and extent of hydraulic fractures, and (5) the application of signal processing techniques to fracture mapping, including tomography, holography, synthetic aperture, image reconstruction, and the relative importance of phase and amplitude information. It is found that, according to calculations, VHF backscatter radar has the potential for mapping fractures within a distance of 50 ± 20 meters from the wellbore. A new technique for improving fracture identification is presented. The range of acoustic radar is five to seven times greater than that of VHF radar when compared on the basis of equal resolution, i.e., equal wavelengths. Analyses of extant data indicate that, when used synergistically, the (1) caliper, (2) resistivity dipmeter, (3) televiewer, (4) television, (5) impression packer, and (6) acoustic transmission tools are useful for mapping borehole-fracture intercepts. A new model of hydraulic fracturing is presented which indicates that a hydraulic fracture is dynamically unstable; consequently, improvements in locating the crack tip may be possible. The importance of phase in signal processing is stressed, and those techniques which employ phase data are emphasized for field use.

  13. Numerical analysis of temperature and flow effects in a dry, two-dimensional, porous-media reservoir used for compressed air energy storage

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, L.E.

    1979-10-01

    The purpose of the work is to define the hydrodynamic and thermodynamic response of a CAES dry porous media reservoir subjected to simulated air mass cycling. The knowledge gained will provide, or will assist in providing, design guidelines for the efficient and stable operation of the air storage reservoir. The analysis and results obtained by two-dimensional modeling of dry reservoirs are presented. While the fluid/thermal response of the underground system is dependent on many parameters, the two-dimensional model was applied only to those parameters that entered the analysis by virtue of inclusion of the vertical dimension. In particular, the parameters or responses that were quantified or characterized include wellbore heat transfer, heat losses to the vertical boundaries of the porous zone, gravitationally induced flows, producing length of the wellbore, and the effects of nonuniform permeability. The analysis of the wellbore heat transfer included consideration of insulation, preheating (bubble development with heated air), and air mass flow rate.

  14. Improving the accuracy and reliability of MWD/magnetic-Wellbore-Directional surveying in the barents sea

    DEFF Research Database (Denmark)

    Edvardsen, I.; Nyrnes, E.; Johnsen, M. G.

    2014-01-01

    ...of nonmagnetic steel in the bottomhole assembly (BHA). To maintain azimuth uncertainty at an acceptable level in northern areas, it is crucial that wellbore-directional-surveying requirements are given high priority and considered early during well planning. During the development phase of an oil and gas field... magnetic-reference stations. The different land and sea configuration, distant offshore oil and gas fields, higher geomagnetic latitude, and different behavior of the magnetic field require the procedures to be reassessed before being applied to the Barents Sea. To reduce drilling delays, procedures must be implemented to enable efficient management of magnetic disturbances. In some areas of the Barents Sea, this management requires new equipment to be developed and tested before drilling, such as seabed magnetometer stations. One simple way to reduce drillstring interference is to increase the amount...

  15. Wellbore cement fracture evolution at the cement–basalt caprock interface during geologic carbon sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hun Bok; Kabilan, Senthil; Carson, James P.; Kuprat, Andrew P.; Um, Wooyong; Martin, Paul F.; Dahl, Michael E.; Kafentzis, Tyler A.; Varga, Tamas; Stephens, Sean A.; Arey, Bruce W.; Carroll, KC; Bonneville, Alain; Fernandez, Carlos A.

    2014-08-07

    Composite Portland cement-basalt caprock cores with fractures, as well as neat Portland cement columns, were prepared to understand the geochemical and geomechanical effects on the integrity of wellbores with defects during geologic carbon sequestration. The samples were reacted with CO2-saturated groundwater at 50 °C and 10 MPa for 3 months under static conditions, while one cement-basalt core was subjected to mechanical stress at 2.7 MPa before the CO2 reaction. Micro-XRD and SEM-EDS data collected along the cement-basalt interface after the 3-month reaction with CO2-saturated groundwater indicate that carbonation of the cement matrix was extensive, with the precipitation of calcite, aragonite, and vaterite, whereas the alteration of the basalt caprock was minor. X-ray microtomography (XMT) provided three-dimensional (3-D) visualization of the opening and interconnection of cement fractures due to mechanical stress. Computational fluid dynamics (CFD) modeling further revealed that this stress led to an increase in fluid flow and hence permeability. After the CO2 reaction, XMT images showed that calcium carbonate precipitation occurred extensively within the fractures in the cement matrix, but only partially along the fracture located at the cement-basalt interface. The 3-D visualization and CFD modeling also showed that the precipitation of calcium carbonate within the cement fractures after the CO2 reaction resulted in the disconnection of cement fractures and a permeability decrease. The permeability calculated based on CFD modeling was in agreement with the experimentally determined permeability. This study demonstrates that XMT imaging coupled with CFD modeling represents a powerful tool to visualize and quantify fracture evolution and permeability change in geologic materials and to predict their behavior during geologic carbon sequestration or hydraulic fracturing for shale gas production and enhanced geothermal systems.
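
    The experimentally determined permeability in such core tests typically comes from Darcy's law for steady flow through the sample. A minimal sketch in SI units (the sample dimensions and rates below are hypothetical, not the paper's measurements):

```python
def darcy_permeability(q, mu, length, area, dp):
    """Darcy's law rearranged for permeability:
    k = Q * mu * L / (A * dP)
    with q [m^3/s], mu [Pa.s], length [m], area [m^2], dp [Pa] -> k [m^2]."""
    return q * mu * length / (area * dp)

MILLIDARCY = 9.869e-16  # approximate conversion: 1 mD in m^2
```

Comparing this experimental value against the permeability extracted from CFD flow simulations on the XMT-derived fracture geometry is exactly the cross-check the abstract reports.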

  16. Thermal transient analysis applied to horizontal wells

    Energy Technology Data Exchange (ETDEWEB)

    Duong, A.N. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[ConocoPhillips Canada Resources Corp., Calgary, AB (Canada)

    2008-10-15

    Steam assisted gravity drainage (SAGD) is a thermal recovery process used to recover bitumen and heavy oil. This paper presented a newly developed model to estimate cooling time and formation thermal diffusivity by using a thermal transient analysis along the horizontal wellbore under a steam heating process. This radial conduction heating model provides information on the heat influx distribution along a horizontal wellbore or elongated steam chamber, and is therefore important for determining the effectiveness of the heating process in the start-up phase in SAGD. Net heat flux estimation in the target formation during start-up can be difficult to measure because of uncertainties regarding heat loss in the vertical section; steam quality along the horizontal segment; distribution of steam along the wellbore; operational conditions; and additional effects of convection heating. The newly presented model can be considered analogous to pressure transient analysis of a buildup after a constant pressure drawdown. The model is based on an assumption of an infinite-acting system. This paper also proposed a new concept of a heating ring to measure the heat storage in the heated bitumen at the time of testing. Field observations were used to demonstrate how the model can be used to save heat energy, conserve steam and enhance bitumen recovery. 18 refs., 14 figs., 2 appendices.
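
    The analogy with pressure-transient analysis can be made concrete with the infinite line-source solution in its late-time logarithmic form (the thermal counterpart of the Cooper-Jacob approximation). The sketch below generates synthetic falloff temperatures and inverts two samples for thermal conductivity and diffusivity; it illustrates the idea only, and the paper's radial conduction model differs in detail:

```python
import math

EULER_GAMMA = 0.5772156649015329

def line_source_dT(q, lam, alpha, r, t):
    """Late-time log approximation of the infinite line-source solution:
    dT = q/(4*pi*lam) * ln(4*alpha*t / (r^2 * e^gamma))
    with q [W/m], lam [W/m/K], alpha [m^2/s], r [m], t [s]."""
    return q / (4.0 * math.pi * lam) * math.log(
        4.0 * alpha * t / (r * r * math.exp(EULER_GAMMA)))

def invert_two_points(q, r, t1, dT1, t2, dT2):
    """Thermal conductivity from the semilog slope, then diffusivity
    from the intercept (two-point version of a straight-line fit)."""
    slope = (dT2 - dT1) / math.log(t2 / t1)
    lam = q / (4.0 * math.pi * slope)
    alpha = r * r * math.exp(EULER_GAMMA) * math.exp(dT1 / slope) / (4.0 * t1)
    return lam, alpha
```

The slope of temperature versus ln(t) plays the role of the semilog slope in a pressure buildup, which is why the abstract can treat the heated horizontal wellbore like a well test.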

  17. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature of the proposed approach, distinguishing it from conventional methods, is the introduction of an additional parameter: an effective pre-test pumping rate. The additional parameter is derived from a rigorous asymptotic analysis of the flow model. Thus, we account for the non-uniform pressure distribution at the beginning of the testing time interval caused by pre-test operations at the well. Using synthetic and field examples, we demonstrate that deviation of the matching curve from the data, which is usually attributed to skin and wellbore storage effects, can also be interpreted through this new parameter. Moreover, with our method, the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching for the best-fitting parameters. We enhanced our analysis with a procedure for estimating ambient reservoir pressure and dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties.
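
    The effect of a pre-test rate can be illustrated with Cooper-Jacob superposition: the well pumps at an effective rate q_pre for a duration t_pre before the analyzed window, then at the test rate. This is a sketch of the idea only; the paper derives its parameter from a rigorous asymptotic analysis rather than from this textbook superposition:

```python
import math

def cooper_jacob(q, T, S, r, dt):
    """Cooper-Jacob late-time drawdown for a constant rate q active for dt.
    T transmissivity [m^2/s], S storativity [-], r radius [m], dt [s]."""
    return q / (4.0 * math.pi * T) * math.log(2.25 * T * dt / (r * r * S))

def drawdown_with_pretest(t, T, S, r, q_test, q_pre, t_pre):
    """Superpose an effective pre-test rate q_pre (active for t_pre before
    the analyzed interval) with the test rate q_test starting at t = 0."""
    return (cooper_jacob(q_pre, T, S, r, t_pre + t)
            + cooper_jacob(q_test - q_pre, T, S, r, t))
```

With q_pre = 0 the model collapses to the ordinary single-rate solution; a nonzero q_pre adds a slowly decaying extra drawdown that, as the abstract notes, can be mistaken for skin or wellbore-storage effects.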

  18. Selected hydraulic test analysis techniques for constant-rate discharge tests

    International Nuclear Information System (INIS)

    Spane, F.A. Jr.

    1993-03-01

    The constant-rate discharge test is the principal field method used in hydrogeologic investigations for characterizing the hydraulic properties of aquifers. To implement this test, the aquifer is stressed by withdrawing ground water from a well using a downhole pump. Discharge during the withdrawal period is regulated and maintained at a constant rate. Water-level response within the well is monitored during the active pumping phase (i.e., drawdown) and during the subsequent recovery phase following termination of pumping. The analysis of drawdown and recovery response within the stressed well (and any monitored, nearby observation wells) provides a means for estimating the hydraulic properties of the tested aquifer, as well as discerning formational and nonformational flow conditions (e.g., wellbore storage, wellbore damage, presence of boundaries, etc.). Standard analytical methods used for constant-rate pumping tests include both log-log type-curve matching and semi-log straight-line methods. This report presents a current "state of the art" review of selected transient analysis procedures for constant-rate discharge tests. Specific topics examined include: analytical methods for constant-rate discharge tests conducted within confined and unconfined aquifers; effects of various nonideal formation factors (e.g., anisotropy, hydrologic boundaries) and well construction conditions (e.g., partial penetration, wellbore storage) on constant-rate test response; and the use of pressure derivatives in diagnostic analysis for the identification of specific formation, well construction, and boundary conditions.
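
    The pressure derivative mentioned last is the drawdown differentiated with respect to ln(t). A minimal central-difference sketch follows; practical Bourdet derivatives use a smoothing window, which is omitted here for clarity:

```python
import math

def log_derivative(times, drawdowns):
    """ds/d(ln t) by central differences on the interior points.

    A flat derivative indicates infinite-acting radial flow; a unit early
    slope indicates wellbore storage; late deviations indicate boundaries.
    """
    out = []
    for i in range(1, len(times) - 1):
        dlnt = math.log(times[i + 1] / times[i - 1])
        out.append((drawdowns[i + 1] - drawdowns[i - 1]) / dlnt)
    return out
```

For purely radial flow the drawdown is linear in ln(t), so the derivative is constant and equal to the semilog slope, which is why the derivative plot is such a sharp diagnostic.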

  19. Monitoring of well integrity by magnetic imaging defectoscopy (MID) at the Ketzin pilot site, Germany

    Science.gov (United States)

    Zemke, Kornelia; Liebscher, Axel; Möller, Fabian

    2017-04-01

    One of the key requirements for safe CO2 storage operation is to ensure wellbore integrity. The acidic in-well environment triggered by CO2 may lead to pitting and/or surface corrosion and eventually to fatigue of well casings and cementation, thereby giving rise to wellbore leakage. Corrosion effects are conventionally monitored by measurement of the inner casing surface, internal diameter and wall thickness. Caliper logging provides inner-surface and internal-diameter data, while ultrasonic tools measure both the internal diameter and casing thickness as well as the bonding between casing and cement. However, both tools can only monitor and characterize the innermost casing, and ultrasonic tools in addition can only be applied in fluid-filled wells. At the Ketzin CO2 storage test site, Germany, about 67 kt of CO2 were injected between June 2008 and August 2013, and an interdisciplinary monitoring concept was developed with focus on the storage complex, the overburden, the surface and the wellbores. Four deep wells penetrate the reservoir, and their integrity has been monitored by a combination of video inspection, pulsed neutron gamma (PNG) logging and magnetic imaging defectoscopy (MID). MID is an advanced logging method for non-destructive testing and has the great advantages that it can be operated in gas-filled boreholes and that it provides information also for outer casings. The MID tool generates electromagnetic pulsed transient eddy currents and records the response of the surrounding media. The distribution and strength of the eddy currents is then converted into averaged, depth-resolved thicknesses of the individual casings. Run in time-lapse mode, MID provides a measure to detect changes in casing thickness and therefore hints at corrosion. At Ketzin, the four deep wells have been monitored by repeat MID logging on a roughly annual basis in cooperation with VNG Gasspeicher GmbH (VGS) and GAZPROMENERGODIAGNOSTIKA, applying their in-house MID tool. The MID-based depth...

  20. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    Diverse and huge omics datasets are taking biology and biomedicine research and application into a big-data era, much as happened across human society a decade ago. They are opening a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of people with matched information), which requires integrative analysis in biology and biomedicine and also calls for the development of data integration to address the shift from population-guided to individual-guided investigations. Data integration is an effective concept for solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches for biological data have two modes: one is a "bottom-up integration" mode with follow-up manual integration, and the other is a "top-down integration" mode with follow-up in silico integration. This paper first summarizes the combinatory analysis approaches, to give a candidate protocol for biological experiment design for effective integrative studies on genomics, and then surveys the data fusion approaches, to give helpful instruction on computational model development for detecting biological significance; these approaches have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, the problems and future directions are highlighted for integrative analysis of omics big data.

  1. The Effects of Boundary Conditions and Friction on the Helical Buckling of Coiled Tubing in an Inclined Wellbore.

    Science.gov (United States)

    Gong, Yinchun; Ai, Zhijiu; Sun, Xu; Fu, Biwei

    2016-01-01

    Analytical buckling models are important for down-hole operations to ensure the structural integrity of the drill string. A literature survey shows that most published analytical buckling models do not address the effects of inclination angle, boundary conditions or friction. The objective of this paper is to study the effects of boundary conditions, friction and angular inclination on the helical buckling of coiled tubing in an inclined wellbore. In this paper, a new theoretical model is established to describe the buckling behavior of coiled tubing. The buckling equations are derived by applying the principles of virtual work and minimum potential energy. The proper solution for the post-buckling configuration is determined based on geometric and natural boundary conditions. The effects of angular inclination and boundary conditions on the helical buckling of coiled tubing are considered. Many significant conclusions are obtained from this study. When the dimensionless length of the coiled tubing is greater than 40, the effects of the boundary conditions can be ignored. The critical load required for helical buckling increases as the angle of inclination and the friction coefficient increase. The post-buckling behavior of coiled tubing in different configurations and for different axial loads is determined using the proposed analytical method. Practical examples are provided that illustrate the influence of the angular inclination on the axial force. The rate of change of the axial force decreases with increasing angular inclination. Moreover, the total axial friction also decreases with an increasing inclination angle. These results will help researchers to better understand helical buckling in coiled tubing. Using this knowledge, measures can be taken to prevent buckling in coiled tubing during down-hole operations.
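
    For orientation only, the commonly quoted classical expression for the helical-buckling load of a weighted pipe in an inclined wellbore can be sketched as below. The paper derives its own model including boundary conditions and friction; this classical frictionless form (whose numerical prefactor varies somewhat across the literature) is shown purely to illustrate the sin(theta) dependence discussed in the abstract:

```python
import math

def helical_buckling_load(E, I, w, r, inclination_rad):
    """Classical frictionless helical-buckling load (one common form):
        F_hel = 2*(2*sqrt(2) - 1) * sqrt(E*I*w*sin(theta)/r)
    E*I : bending stiffness [N*m^2]
    w   : buoyed weight per unit length [N/m]
    r   : radial clearance between pipe and wellbore [m]
    """
    return (2.0 * (2.0 * math.sqrt(2.0) - 1.0)
            * math.sqrt(E * I * w * math.sin(inclination_rad) / r))
```

Because the load grows with sin(theta), more inclined holes resist helical buckling better, which is consistent with the trend reported in the abstract (friction raises the critical load further, an effect this frictionless formula omits).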

  2. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  3. The Development and Test of a Sensor for Measurement of the Working Level of Gas–Liquid Two-Phase Flow in a Coalbed Methane Wellbore Annulus

    OpenAIRE

    Chuan Wu; Huafeng Ding; Lei Han

    2018-01-01

    Coalbed methane (CBM) is one kind of clean-burning gas and has been valued as a new form of energy that will be used widely in the near future. When producing CBM, the working level within a CBM wellbore annulus needs to be monitored to dynamically adjust the gas drainage and extraction processes. However, the existing method of measuring the working level does not meet the needs of accurate adjustment, so we designed a new sensor for this purpose. The principle of our sensor is a liquid pres...

  4. THE INFLUENCE OF CO2 ON WELL CEMENT

    Directory of Open Access Journals (Sweden)

    Nediljka Gaurina-Međimurec

    2010-12-01

    Carbon capture and storage is one way to reduce emissions of greenhouse gases to the atmosphere. Underground gas storage operations and CO2 sequestration in aquifers rely on both proper wellbore construction and the sealing properties of the cap rock. CO2 injection candidates may be new wells or old wells. In both cases, long-term wellbore integrity (up to 1,000 years) is one of the key performance criteria in the geological storage of CO2. The potential leakage paths are the migration of CO2 along the wellbore due to poor cementation and flow through the cap rock. The permeability and integrity of the set cement will determine how effective it is in preventing leakage. The integrity of the cap rock is assured by an adequate fracture gradient and by sufficient set cement around the casing across the cap rock, without a micro-annulus. CO2 storage in underground formations has revived research into the long-term influence of injected CO2 on Portland cements and into methods for improving the long-term efficiency of the wellbore sealant. Some researchers predicted that set cement will fail when exposed to CO2, leading to potential leakage to the atmosphere or into underground formations that may contain potable water. Other researchers show set cement samples from 30- to 50-year-old wells (CO2 EOR projects) that have maintained sealing integrity and prevented CO2 leakage, in spite of some degree of carbonation. One of the reasons for the discrepancy between certain research lab tests and actual field performance measurements is the absence of a standard protocol for CO2 resistance-testing devices, conditions, or procedures. This paper presents potential flow paths along the wellbore, CO2 behaviour under reservoir conditions, and the geochemical alteration of hydrated Portland cement due to supercritical CO2 injection.

  5. Rehabilitation of Mature Gas Fields in Romania: Success Through Integration of Management Processes and New Technology

    Directory of Open Access Journals (Sweden)

    Louboutin Michel

    2004-09-01

Full Text Available Mature oil and gas fields are difficult to rehabilitate effectively because of the economics of declining production. Many fields are abandoned prematurely when their life could be prolonged significantly through application of new technology. Romgaz (a national exploration and production company) and Schlumberger (an integrated oilfield services company) developed a new business model to overcome these obstacles. The key to the success of this model, which is being applied to gas fields in the Transylvanian basin of Romania, is the shared risk and shared reward for the two companies. Integrated management processes addressing the complete system from reservoir to wellbore to surface/transmission facilities and application of new technology (logging, perforation, etc.) have resulted in multifold increases in production.

  6. Nanostructural control of methane release in kerogen and its implications to wellbore production decline

    Science.gov (United States)

    Ho, Tuan Anh; Criscenti, Louise J.; Wang, Yifeng

    2016-06-01

Despite the massive success of shale gas production in the US in the last few decades, there are still major concerns with the steep decline in wellbore production and the large uncertainty in long-term projection of decline curves. A reliable projection must rely on a mechanistic understanding of methane release in the shale matrix, a limiting step in shale gas extraction. Using molecular simulations, we here show that methane release in a nanoporous kerogen matrix is characterized by fast release of pressurized free gas (accounting for ~30-47% recovery) followed by slow release of adsorbed gas as the gas pressure decreases. The first stage is driven by the gas pressure gradient, while the second stage is controlled by gas desorption and diffusion. We further show that diffusion of methane in nanoporous kerogen behaves differently from the bulk phase, with much smaller diffusion coefficients. The MD simulations also indicate that a significant fraction (3-35%) of methane deposited in kerogen can potentially become trapped in isolated nanopores and thus not be recoverable. Our results shed new light on the mechanistic understanding of gas release and production decline in unconventional reservoirs. The long-term production decline appears to be controlled by the second stage of gas release.
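The two-stage release described above can be caricatured with a double-exponential recovery model. The stage fractions and time constants below are hypothetical illustrations chosen to be consistent with the quoted ranges (free gas ~30-47%, trapped 3-35%), not values from the simulations:

```python
import math

def recovery(t, free=0.40, adsorbed=0.45, tau_free=1.0, tau_ads=50.0):
    """Cumulative fractional gas recovery at dimensionless time t:
    a fast pressure-driven free-gas stage plus a slow desorption/
    diffusion-limited stage. The remainder (here 1 - 0.40 - 0.45 = 0.15)
    is treated as trapped in isolated nanopores and never produced."""
    return (free * (1 - math.exp(-t / tau_free))
            + adsorbed * (1 - math.exp(-t / tau_ads)))

for t in (1, 10, 100, 500):
    print(t, round(recovery(t), 3))
```

Early production is dominated by the free-gas term, while the long tail of the decline curve is set entirely by the slow time constant, mirroring the paper's conclusion that the second stage controls long-term decline.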

  7. Seismic monitoring of hydraulic fracturing: techniques for determining fluid flow paths and state of stress away from a wellbore

    Energy Technology Data Exchange (ETDEWEB)

    Fehler, M.; House, L.; Kaieda, H.

    1986-01-01

Hydraulic fracturing has gained in popularity in recent years as a way to determine the orientations and magnitudes of tectonic stresses. By augmenting conventional hydraulic fracturing measurements with detection and mapping of the microearthquakes induced by fracturing, we can supplement and independently confirm information obtained from conventional analysis. Important information obtained from seismic monitoring includes: the state of stress of the rock, the orientation and spacing of the major joint sets, and measurements of rock elastic parameters at locations distant from the wellbore. While conventional well logging operations can provide information about several of these parameters, the zone of interrogation is usually limited to the immediate proximity of the borehole. The seismic waveforms of the microearthquakes contain a wealth of information about the rock in regions that are otherwise inaccessible for study. By reliably locating the hypocenters of many microearthquakes, we have inferred the joint patterns in the rock. We observed that microearthquake locations do not define a simple, thin, planar distribution, that the fault plane solutions are consistent with shear slippage, and that spectral analysis indicates that the source dimensions and slip along the faults are small. Hence we believe that the microearthquakes result from slip along preexisting joints, and not from tensile extension at the tip of the fracture. Orientations of the principal stresses can be estimated by using fault plane solutions of the larger microearthquakes. By using a joint earthquake location scheme, and/or calibrations with downhole detonators, rock velocities and their heterogeneities can be investigated in rock volumes that are far enough from the borehole to be representative of intrinsic rock properties.

  8. Rate transient analysis for homogeneous and heterogeneous gas reservoirs using the TDS technique

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Sanchez, Jairo Andres; Cantillo, Jose Humberto

    2008-01-01

In this study, pressure test analysis of wells flowing under constant wellbore pressure in homogeneous and naturally fractured gas reservoirs using the TDS technique is introduced. Although constant rate production is assumed in the development of conventional well test analysis methods, constant pressure production conditions are sometimes used in the oil and gas industry. The constant pressure technique, or rate transient analysis, is more popularly known as decline curve analysis, under which the rate is allowed to decline instead of the wellbore pressure. The TDS technique, increasingly used even in the most recognized software packages (although without its trade name), uses the log-log plot of pressure and pressure derivative test data to identify unique features from which exact analytical expressions are derived to easily estimate reservoir and well parameters. For this case, the fingerprint characteristics of the log-log plot of the reciprocal rate and the reciprocal rate derivative were employed to obtain the analytical expressions used for the interpretation analysis. Many simulation experiments demonstrate the accuracy of the new method. Synthetic examples are shown to verify the effectiveness of the proposed methodology.
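As a sketch of the diagnostic quantities involved, the reciprocal rate and its logarithmic derivative can be computed numerically from rate data. The synthetic response below, with 1/q linear in ln(t) as in late-time radial flow under constant pressure, is an assumed example, not data from the paper:

```python
import math

def reciprocal_rate_derivative(times, rates):
    """Diagnostic quantities for rate-transient (TDS-style) analysis:
    reciprocal rate 1/q and its logarithmic derivative t*d(1/q)/dt,
    computed with simple central differences in ln(t)."""
    inv = [1.0 / q for q in rates]
    deriv = []
    for i in range(1, len(times) - 1):
        dlnt = math.log(times[i + 1]) - math.log(times[i - 1])
        deriv.append((inv[i + 1] - inv[i - 1]) / dlnt)
    return inv, deriv

# synthetic constant-pressure radial-flow response: 1/q = a + b*ln(t)
a, b = 2.0, 0.5
times = [10 * 1.5**i for i in range(12)]
rates = [1.0 / (a + b * math.log(t)) for t in times]
inv, deriv = reciprocal_rate_derivative(times, rates)
print([round(d, 3) for d in deriv])  # derivative flattens at b = 0.5
```

On a log-log plot, the flat stretch of the reciprocal-rate derivative is exactly the kind of "fingerprint" the TDS technique reads off to back out reservoir parameters.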

  9. Modeling the key factors that could influence the diffusion of CO2 from a wellbore blowout in the Ordos Basin, China.

    Science.gov (United States)

    Li, Qi; Shi, Hui; Yang, Duoxing; Wei, Xiaochen

    2017-02-01

Carbon dioxide (CO2) blowout from a wellbore is regarded as a potential environmental risk of a CO2 capture and storage (CCS) project. In this paper, an assumed blowout of a wellbore was examined for China's Shenhua CCS demonstration project. The significant factors that influenced the diffusion of CO2 were identified by using a response surface method with the Box-Behnken experiment design. The numerical simulations showed that the mass emission rate of CO2 from the source and the ambient wind speed have significant influence on the area of interest (the area of high CO2 concentration, above 30,000 ppm). There is a strong positive correlation between the mass emission rate and the area of interest, but a strong negative correlation between the ambient wind speed and the area of interest. Several other variables have very little influence on the area of interest, e.g., the temperature of CO2, ambient temperature, relative humidity, and stability class values. Due to the weather conditions at the Shenhua CCS demonstration site at the time of the modeled CO2 blowout, the largest diffusion distance of CO2 in the downwind direction did not exceed 200 m along the centerline. When the ambient wind speed is in the range of 0.1-2.0 m/s and the mass emission rate is in the range of 60-120 kg/s, the diffusion of CO2 is at the most dangerous level (i.e., almost all Grade Four marks in the risk matrix). Therefore, if the injection of CO2 takes place in a region that has relatively low perennial wind speed, special attention should be paid to the formulation of pre-planned emergency measures in case there is a leakage accident. The proposed risk matrix that classifies and grades blowout risks can be used as a reference for the development of appropriate regulations. This work may offer some indicators for developing risk profiles and emergency responses for CO2 blowouts.
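A minimal illustration of why emission rate and wind speed dominate: in a steady Gaussian plume (a far simpler model than the simulations used in the paper), the centerline concentration scales as Q/u, so the distance to the 30,000 ppm contour grows with emission rate and shrinks with wind speed. The dispersion coefficients below are assumed placeholders, not fitted values from the study:

```python
import math

def centerline_ppm(x, Q, u, T=288.0, P=101325.0):
    """Ground-level centerline CO2 mole fraction (ppm) at downwind
    distance x (m) from a steady ground-level Gaussian plume with
    emission rate Q (kg/s) and wind speed u (m/s). The linear
    dispersion coefficients are hypothetical neutral-stability fits."""
    sy, sz = 0.08 * x, 0.06 * x              # assumed sigma_y, sigma_z (m)
    c = Q / (math.pi * u * sy * sz)          # mass concentration, kg/m^3
    mol_frac = c * 8.314 * T / (P * 0.044)   # ideal-gas conversion, M = 44 g/mol
    return 1e6 * mol_frac

def hazard_distance(Q, u, threshold=30000.0):
    """Approximate downwind extent (m) of the region above threshold."""
    x = 1.0
    while centerline_ppm(x, Q, u) > threshold:
        x += 1.0
    return x

print(hazard_distance(Q=90.0, u=2.0))   # moderate wind
print(hazard_distance(Q=90.0, u=0.5))   # low wind: larger hazard zone
```

Even this toy model reproduces the paper's qualitative finding: halving the wind speed or doubling the emission rate noticeably stretches the high-concentration zone along the centerline.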

  10. Application of Neutron imaging in pore structure of hydrated wellbore cement: comparison of hydration of H20 with D2O based Portland cements

    Science.gov (United States)

    Dussenova, D.; Bilheux, H.; Radonjic, M.

    2012-12-01

Wellbore cement studies have been ongoing for decades, varying from efforts to reduce permeability and improve resistance to corrosive environments to issues with gas migration, also known as Sustained Casing Pressure (SCP). These practical issues often lead to health and safety problems as well as huge economic losses in the oil and gas industry. Several techniques have been employed to reduce the impact of gas leakage. In this study we focus on expandable liners, which are introduced as part of oil well reconstruction, work-overs, and abandonment procedures that help prevent SCP. An expandable liner is a tube whose diameter can be increased downhole by application of a dedicated tool. The increase in diameter creates extra force on the hydrated cement, reducing the width of interface fractures and cement-tube de-bonding. Moreover, this also causes the cement to change its microstructure and other porous-medium properties, primarily hydraulic conductivity. In order to examine changes before and after such operations, the cement pore structure must be well characterized and correlated to the cement slurry design as well as the chemical and physical environmental conditions. As modern oil well pipes and tubes contain iron, it is difficult to perform X-ray tomography as a bulk measurement of the cement in its wellbore configuration (tube wall-cement-tube wall). Neutron imaging is a complementary technique to X-ray imaging and is well suited for detection of light elements embedded in metallic containers. Thus, Neutron Imaging (NI) is investigated as a tool for characterizing the pore structure of hydrated wellbore cement. Recent measurements were conducted at the Oak Ridge National Laboratory (ORNL) High Flux Isotope Reactor (HFIR) neutron imaging facility. NI is highly sensitive to light elements such as hydrogen (H). Oil well cements that have undergone full hydration contain on average 30%-40% free water in their pore structure. The unreacted water is the main

  11. Problems in mathematical analysis III integration

    CERN Document Server

    Kaczor, W J

    2003-01-01

    We learn by doing. We learn mathematics by doing problems. This is the third volume of Problems in Mathematical Analysis. The topic here is integration for real functions of one real variable. The first chapter is devoted to the Riemann and the Riemann-Stieltjes integrals. Chapter 2 deals with Lebesgue measure and integration. The authors include some famous, and some not so famous, integral inequalities related to Riemann integration. Many of the problems for Lebesgue integration concern convergence theorems and the interchange of limits and integrals. The book closes with a section on Fourier series, with a concentration on Fourier coefficients of functions from particular classes and on basic theorems for convergence of Fourier series. The book is primarily geared toward students in analysis, as a study aid, for problem-solving seminars, or for tutorials. It is also an excellent resource for instructors who wish to incorporate problems into their lectures. Solutions for the problems are provided in the boo...

  12. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  13. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    Science.gov (United States)

    McGhee, D. S.

    2006-01-01

Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  14. The wellbore simulator SIMU1999; El simulador de pozos SIMU1999

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Upton, Pedro [Comision Federal de Electricidad, Morelia, Michoacan (Mexico)

    1999-08-01

This work presents a brief description of the architecture and scope of the wellbore simulator SIMU1999. Its prime application is the representation of the different flow types and thermodynamic conditions found in geothermal wells. The simulator utilizes a homogeneous flow model which incorporates the fundamental theories of fluid mechanics and allows the handling of two-phase, three-component mixtures (H{sub 2}O-NaCl-CO{sub 2}), which represent the main constituents appearing in the production of geothermal fluids. SIMU1999 uses a two-phase friction factor developed on the basis of 64 production tests carried out on 45 different wells, from which more than 324 pressure drop data points and 628 temperature measurements were recovered. Mechanical log recorders (Kuster) were mainly used, but some electronic logs (Hot Hole and Pruett) were run as well. The friction factor is calculated using the Reynolds number, steam quality, and fluid pressure; therefore, it is independent of any previous flow pattern identification. Production data included specific enthalpies from 650 to 2 780 kJ/kg, fluid pressures between 0.4 and 14 MPa, and fluid temperatures from 110 to 340 degrees Celsius. The computer code of SIMU1999 is written in Fortran 90 and generates an executable file slightly larger than 1 Mb. The program is divided into four parts: the wellbore simulator; a graphical output to analyze the results on screen; a separate subroutine to evaluate the mass flow rate of three-component flows discharging to the atmosphere at the speed of sound; and an independent thermodynamic module which can be used for estimations in manual analysis. The code incorporates an efficient algorithm to solve the fluid transport problem, based on a numerical method of successive approximations. The simulator uses the International System of Units for data input and for generation of results. Everything is realized
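The homogeneous flow model's frictional pressure gradient can be sketched as follows. The Blasius smooth-pipe friction factor stands in for SIMU1999's field-calibrated two-phase correlation (which depends on Reynolds number, steam quality, and pressure), and all input numbers are illustrative:

```python
import math

def mixture_density(x, rho_l, rho_g):
    """Homogeneous-model two-phase density from steam quality x."""
    return 1.0 / (x / rho_g + (1.0 - x) / rho_l)

def friction_factor(re):
    """Blasius smooth-pipe friction factor, an illustrative stand-in
    for SIMU1999's field-calibrated two-phase correlation."""
    return 0.316 / re ** 0.25

def dp_dz_friction(w, d, x, rho_l, rho_g, mu):
    """Frictional pressure gradient (Pa/m) for mass rate w (kg/s) in a
    pipe of diameter d (m), under the homogeneous flow model."""
    area = math.pi * d * d / 4.0
    g_flux = w / area                       # mass flux, kg/(m^2 s)
    rho_m = mixture_density(x, rho_l, rho_g)
    re = g_flux * d / mu                    # mixture Reynolds number
    f = friction_factor(re)
    return f * g_flux ** 2 / (2.0 * d * rho_m)

print(round(dp_dz_friction(w=40.0, d=0.22, x=0.2,
                           rho_l=800.0, rho_g=10.0, mu=1e-4), 1))
```

The key design point of the homogeneous model is visible here: both phases share one velocity, so a single mixture density and Reynolds number suffice and no flow-pattern identification is needed.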

  15. Integral data analysis for resonance parameters determination

    International Nuclear Information System (INIS)

    Larson, N.M.; Leal, L.C.; Derrien, H.

    1997-09-01

Neutron time-of-flight experiments have long been used to determine resonance parameters. Those resonance parameters have then been used in calculations of integral quantities such as Maxwellian averages or resonance integrals, and the results of those calculations in turn have been used as a criterion for acceptability of the resonance analysis. However, the calculations were inadequate because covariances on the parameter values were not included. In this report an effort to correct that deficiency is documented: the R-matrix analysis code SAMMY has been modified (1) to include integral quantities of importance directly within the resonance parameter analysis, and (2) to determine the best fit to both differential (microscopic) and integral (macroscopic) data simultaneously. This modification was implemented because it is expected to have an impact on the intermediate-energy range that is important for criticality safety applications.

  16. Geochemical and Geomechanical Effects on Wellbore Cement Fractures: Data Information for Wellbore Reduced Order Model

    Energy Technology Data Exchange (ETDEWEB)

    Um, Wooyong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jung, Hun Bok [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kabilan, Senthil [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suh, Dong-Myung [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fernandez, Carlos A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-01-01

The primary objective of the National Risk Assessment Partnership (NRAP) program is to develop a defensible, generalized, and science-based methodology and platform for quantifying risk profiles at CO2 injection and storage sites. The methodology must incorporate and define the scientific basis for assessing residual risks associated with long-term stewardship and help guide site operational decision-making and risk management. Development of an integrated and risk-based protocol will help minimize uncertainty in the predicted long-term behavior of the CO2 storage site and thereby increase confidence in storage integrity. The risk profile concept has proven useful in conveying the qualitative evolution of risks for CO2 injection and storage sites. However, qualitative risk profiles are not sufficient for specifying long-term liability for CO2 storage sites. Because no science-based, defensible, and robust methodology has been developed for quantification of risk profiles for CO2 injection and storage, NRAP has focused on developing a science-based methodology for quantifying risk profiles for various risk proxies.

  17. Real analysis measure theory, integration, and Hilbert spaces

    CERN Document Server

    Stein, Elias M

    2005-01-01

    Real Analysis is the third volume in the Princeton Lectures in Analysis, a series of four textbooks that aim to present, in an integrated manner, the core areas of analysis. Here the focus is on the development of measure and integration theory, differentiation and integration, Hilbert spaces, and Hausdorff measure and fractals. This book reflects the objective of the series as a whole: to make plain the organic unity that exists between the various parts of the subject, and to illustrate the wide applicability of ideas of analysis to other fields of mathematics and science. After

  18. Semantic web for integrated network analysis in biomedicine.

    Science.gov (United States)

    Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y

    2009-03-01

    The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing the Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy analysis and herb-drug interactions analysis.
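As a toy illustration of coupling graph data with ontology reasoning, which is the core idea behind semantic graph mining, the sketch below computes the transitive closure of RDF-style subClassOf triples in pure Python. The entities are invented, and a real system would use an RDF store and an OWL/RDFS reasoner rather than this fixpoint loop:

```python
# RDF-style (subject, predicate, object) triples; names are illustrative.
triples = {
    ("GeneA", "associatedWith", "DiseaseX"),
    ("DiseaseX", "subClassOf", "Neurodegenerative"),
    ("Neurodegenerative", "subClassOf", "Disease"),
}

def infer_subclass_closure(facts):
    """Add all transitive subClassOf edges via a simple fixpoint loop,
    mimicking the RDFS entailment rule for rdfs:subClassOf."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        sub = [(s, o) for s, p, o in facts if p == "subClassOf"]
        for s1, o1 in sub:
            for s2, o2 in sub:
                if o1 == s2 and (s1, "subClassOf", o2) not in facts:
                    facts.add((s1, "subClassOf", o2))
                    changed = True
    return facts

closed = infer_subclass_closure(triples)
print(("DiseaseX", "subClassOf", "Disease") in closed)  # True
```

After inference, a graph-mining query for "genes associated with any Disease" finds GeneA, even though the raw data only linked it to the more specific DiseaseX; this is the added value of running reasoning before (or during) network analysis.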

  19. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Research report 20161001-20161030, submitted to the National Institute of Aerospace (NIA) under agreement W911NF-16-2-0229. Authors: Cedric Justin, Youngjun

  20. A new method in predicting productivity of multi-stage fractured horizontal well in tight gas reservoirs

    Directory of Open Access Journals (Sweden)

    Yunsheng Wei

    2016-10-01

Full Text Available The generally adopted completion technique for horizontal wells in tight gas reservoirs is multi-stage hydraulic fracturing, and the flow characteristics of a horizontal well with multiple transverse fractures are very intricate. Conventional methods, which treat the whole well as the evaluation unit, have difficulty accurately predicting the production capacity of each fracture and the productivity differences between wells with different numbers of fractures. Thus, a single fracture is set as the minimum evaluation unit, and matrix, fracture, and lateral wellbore models are then combined to represent a horizontal well with multiple transverse hydraulic fractures in tight gas reservoirs. This paper presents a new semi-analytical methodology for predicting the production capacity of such a well. Firstly, a mathematical flow model for a medium disturbed by finite-conductivity vertical fractures within rectangular boundaries is established and solved by the Fourier integral transform. Then the idea of single-stage fracture analysis is incorporated to establish a linear flow model within a single fracture with a variable rate, and a Fredholm integral numerical solution is applied for the fracture conductivity function. Finally, the pipe flow model along the lateral wellbore is adapted to couple the multi-stage fracture mathematical models, yielding the equation group for predicting the productivity of a multi-stage fractured horizontal well. The whole flow process from the matrix to the bottom-hole, including production interference between adjacent fractures, is thereby established, and the corresponding iterative algorithm for the equations is given. In the case analysis, the production of each well and fracture is calculated under different bottom-hole flowing pressures, and this method also contributes to obtaining the distribution of pressure drop and production for every

  1. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

Inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. Performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal hydraulic separate effect tests and integral effect tests was developed for thermal hydraulic model development and system code validation. Safety characteristics as well as safety issues of the integral reactor have been identified during the study, which will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back into the SMART conceptual design. The performance and safety analysis code systems as well as the preliminary safety analysis methodology developed in this study will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  2. Practical application of failure criteria in determining safe mud weight windows in drilling operations

    Directory of Open Access Journals (Sweden)

    R. Gholami

    2014-02-01

Full Text Available Wellbore instability is reported frequently as one of the most significant incidents during drilling operations. Analysis of wellbore instability includes estimation of formation mechanical properties and the state of in situ stresses. In this analysis, the only controllable parameter during the drilling operation is the mud weight. If the mud weight is larger than anticipated, the mud will invade the formation, causing tensile failure of the formation. On the other hand, a lower mud weight can result in shear failure of the rock, known as borehole breakout. To predict the potential for failures around the wellbore during drilling, one should use a failure criterion to compare the rock strength against the induced tangential stresses around the wellbore at a given mud pressure. The Mohr–Coulomb failure criterion is one of the commonly accepted criteria for estimation of rock strength at a given state of stress. However, the use of other criteria has been debated in the literature. In this paper, the Mohr–Coulomb, Hoek–Brown and Mogi–Coulomb failure criteria were used to estimate the potential for rock failure around a wellbore located in an onshore field in Iran. Log-based analysis was used to estimate the rock mechanical properties of formations and the state of stresses. The results indicated that among the different failure criteria, the Mohr–Coulomb criterion underestimates the highest mud pressure required to avoid breakouts around the wellbore. It also predicts a lower fracture gradient pressure. In addition, it was found that the results obtained from the Mogi–Coulomb criterion yield a better match with breakouts observed from the caliper logs than those of the Hoek–Brown criterion. It was concluded that the Mogi–Coulomb criterion is the better failure criterion, as it considers the effect of the intermediate principal stress component in the failure analysis.
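For a vertical well, the comparison described above can be sketched with the Kirsch tangential stresses at the borehole wall and the Mohr–Coulomb criterion. The stress values below are hypothetical, and the effective-stress treatment is deliberately simplified (no thermal or poroelastic corrections, tensile cutoff T0 = 0):

```python
import math

def mud_window(sH, sh, Pp, ucs, phi_deg, T0=0.0):
    """Safe mud-pressure window for a vertical well, in MPa.
    sH, sh: max/min horizontal stresses; Pp: pore pressure;
    ucs: unconfined compressive strength; phi_deg: friction angle.
    Lower bound: Mohr-Coulomb shear (breakout) limit, treating the
    maximum tangential effective stress as sigma_1 and the radial
    effective stress (Pw - Pp) as sigma_3. Upper bound: tensile
    fracture at the sigma_H azimuth."""
    q = (1 + math.sin(math.radians(phi_deg))) / (1 - math.sin(math.radians(phi_deg)))
    # breakout avoided when 3*sH - sh - Pw - Pp <= ucs + q*(Pw - Pp)
    pw_min = (3 * sH - sh - ucs + (q - 1) * Pp) / (1 + q)
    # tensile failure when 3*sh - sH - Pw - Pp <= -T0
    pw_max = 3 * sh - sH - Pp + T0
    return pw_min, pw_max

lo, hi = mud_window(sH=60.0, sh=45.0, Pp=25.0, ucs=50.0, phi_deg=30.0)
print(round(lo, 2), round(hi, 2))
```

Swapping the Mohr–Coulomb strength term for a Mogi–Coulomb or Hoek–Brown form changes only the pw_min expression, which is exactly the sensitivity the paper is examining.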

  3. Production (information sheets)

    NARCIS (Netherlands)

    2007-01-01

Documentation sheets: Geo energy; Integrated System Approach Petroleum Production (ISAPP) - The value of smartness; Reservoir permeability estimation from production data; Coupled modeling for reservoir application; Toward an integrated near-wellbore model; TNO conceptual framework for "E&P

  4. Dynamic analysis of a liquid droplet and optimization of helical angles for vortex drainage gas recovery

    Directory of Open Access Journals (Sweden)

    Xiaodong Wu

    2016-10-01

Full Text Available Downhole vortex drainage gas recovery is a new gas production technology. So far, however, the forces and motions of the liquid phase in the swirling flow field of wellbores during its field application have not been fully characterized. In this paper, the forces on liquid droplets in the swirling flow field of wellbores were analyzed on the basis of two-phase fluid dynamics theories. Then, the motion equations of fluid droplets along the axial and radial directions were established. Magnitude comparison was performed on several typical acting forces, including the Basset force, virtual mass force, Magnus force, Saffman force and Stokes force. Besides, the formula for calculating the optimal helical angle of vortex tools was established according to the principle that the vertical resultant force on fluid droplets should be at its maximum. Afterwards, each acting force was comprehensively analyzed in terms of its origin, characteristics and direction based on the established force analysis model. Magnitude comparison indicates that the forces with lesser effect, including the virtual mass force, Basset force and convection volume force, can be neglected. Moreover, a vertically upward centrifugal force component acts on the fluid droplets in the swirling flow field, unlike in the conventional flow field of wellbores, which is favorable for the fluid droplets to move upward. The reliability of the optimal helical angle calculation formula was verified by means of case analysis. It is demonstrated that with the decrease of well depth, the fluid-carrying capability of the gas and the optimal helical angle increase. The research results in this paper have guiding significance for the optimization design of downhole vortex tools and the field application of downhole vortex drainage gas recovery technology.
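The kind of magnitude comparison described above can be illustrated with simple order-of-magnitude estimates. The droplet size, velocities, and swirl radius below are assumed values, and the expressions are the standard textbook forms (Stokes drag, centrifugal force, gravity) rather than the paper's full force model:

```python
import math

def droplet_forces(d, rho_l, mu_g, v_slip, v_t, r):
    """Order-of-magnitude forces (N) on a liquid droplet of diameter d
    (m) and density rho_l in a swirling gas stream: weight, Stokes drag
    at slip velocity v_slip, and centrifugal force at tangential
    velocity v_t and swirl radius r. SI units throughout."""
    vol = math.pi * d**3 / 6
    m = rho_l * vol
    return {
        "gravity":     m * 9.81,
        "stokes":      3 * math.pi * mu_g * d * v_slip,
        "centrifugal": m * v_t**2 / r,
    }

f = droplet_forces(d=1e-4, rho_l=1000.0, mu_g=1.8e-5,
                   v_slip=2.0, v_t=10.0, r=0.03)
for name, val in f.items():
    print(f"{name}: {val:.2e} N")
```

With these illustrative numbers the centrifugal term dominates by orders of magnitude, which is consistent with the abstract's point that swirl introduces a force component absent in conventional wellbore flow.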

  5. Cost benefit analysis of power plant database integration

    International Nuclear Information System (INIS)

    Wilber, B.E.; Cimento, A.; Stuart, R.

    1988-01-01

A cost benefit analysis of plant-wide data integration allows utility management to evaluate integration and automation benefits from an economic perspective. With this evaluation, the utility can determine both the quantitative and qualitative savings that can be expected from data integration. The cost benefit analysis is then a planning tool which helps the utility develop a focused long-term implementation strategy that will yield significant near-term benefits. This paper presents a flexible cost benefit analysis methodology which is both simple to use and accurate, yielding verifiable results. Included in this paper are a list of parameters to consider, a procedure for performing the cost savings analysis, and samples of this procedure when applied to a utility. A case study is presented involving a specific utility where this procedure was applied; their uses of the cost-benefit analysis are also described.
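A minimal sketch of the quantitative side of such an analysis is a net-present-value calculation over estimated integration costs and yearly savings. All figures and the discount rate below are hypothetical, not from the paper's case study:

```python
def npv(cashflows, rate):
    """Net present value of yearly cashflows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical data-integration project: up-front cost, then yearly savings
integration_cost = -500_000
yearly_savings = [120_000] * 8
project = [integration_cost] + yearly_savings

print(round(npv(project, 0.08)))  # positive -> integration pays off
```

The same cashflow list also supports quick sensitivity checks, e.g. raising the discount rate until the NPV turns negative gives a rough internal-rate-of-return threshold for the project.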

  6. Integration of Design and Control through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2002-01-01

A systematic computer aided analysis of the process model is proposed as a pre-solution step for integration of design and control problems. The process model equations are classified in terms of balance equations, constitutive equations and conditional equations. Analysis of the phenomena models representing the constitutive equations identifies the relationships between the important process and design variables, which helps to understand, define and address some of the issues related to integration of design and control. Furthermore, the analysis is able to identify a set of process (control) variables, and (structure selection) issues for the integrated problems are considered. (C) 2002 Elsevier Science Ltd. All rights reserved.

  7. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations

  8. International Space Station Configuration Analysis and Integration

    Science.gov (United States)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  9. Overcoming barriers to integrating economic analysis into risk assessment.

    Science.gov (United States)

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, and social and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome. © 2011 Society for Risk Analysis.

  10. Integrating fire management analysis into land management planning

    Science.gov (United States)

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...

  11. Integrating neural network technology and noise analysis

    International Nuclear Information System (INIS)

    Uhrig, R.E.; Oak Ridge National Lab., TN

    1995-01-01

    The integrated use of neural network and noise analysis technologies offers advantages not available by the use of either technology alone. The application of neural network technology to noise analysis offers an opportunity to expand the scope of problems where noise analysis is useful and unique ways in which the integration of these technologies can be used productively. The two-sensor technique, in which the responses of two sensors to an unknown driving source are related, is used to demonstrate such integration. The relationship between power spectral densities (PSDs) of accelerometer signals is derived theoretically using noise analysis to demonstrate its uniqueness. This relationship is modeled from experimental data using a neural network when the system is working properly, and the actual PSD of one sensor is compared with the PSD of that sensor predicted by the neural network using the PSD of the other sensor as an input. A significant deviation between the actual and predicted PSDs indicates that the system is changing (i.e., failing). Experiments carried out on check valves and bearings illustrate the usefulness of the methodology developed. (Author)

  12. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated as to whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  13. Analysis of Low-Temperature Utilization of Geothermal Resources

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Brian

    2015-06-30

    Full realization of the potential of what might be considered “low-grade” geothermal resources will require that we examine many more uses for the heat than traditional electricity generation. To demonstrate that geothermal energy truly has the potential to be a national energy source we will be designing, assessing, and evaluating innovative uses for geothermal-produced water such as hybrid biomass-geothermal cogeneration of electricity and district heating and efficiency improvements to the use of cellulosic biomass in addition to utilization of geothermal in district heating for community redevelopment projects. The objectives of this project were: 1) To perform a techno-economic analysis of the integration and utilization potential of low-temperature geothermal sources. Innovative uses of low-enthalpy geothermal water were designed and examined for their ability to offset fossil fuels and decrease CO2 emissions. 2) To perform process optimizations and economic analyses of processes that can utilize low-temperature geothermal fluids. These processes included electricity generation using biomass and district heating systems. 3) To scale up and generalize the results of three case study locations to develop a regionalized model of the utilization of low-temperature geothermal resources. A national-level, GIS-based, low-temperature geothermal resource supply model was developed and used to develop a series of national supply curves. We performed an in-depth analysis of the low-temperature geothermal resources that dominate the eastern half of the United States. The final products of this study include 17 publications, an updated version of the cost estimation software GEOPHIRES, and direct-use supply curves for low-temperature utilization of geothermal resources. The supply curves for direct use geothermal include utilization from known hydrothermal, undiscovered hydrothermal, and near-hydrothermal EGS resources and presented these results at the Stanford

  14. Geochemical alteration of wellbore cement by CO2 or CO2 + H2S reaction during long-term carbon storage

    Energy Technology Data Exchange (ETDEWEB)

    Um, Wooyong [Pacific Northwest National Laboratory, Richland WA USA; Rod, Kenton A. [Pacific Northwest National Laboratory, Richland WA USA; Jung, Hun Bok [New Jersey City University, Jersey City NJ USA; Brown, Christopher F. [Pacific Northwest National Laboratory, Richland WA USA

    2016-03-22

    Cement samples were reacted with CO2-saturated groundwater, with or without added H2S (1 wt.%), at 50°C and 10 MPa for up to 13 months (CO2 only) or for up to 3.5 months (CO2 + H2S) under static conditions. After the reaction, X-ray computed tomography images revealed that calcium carbonate (CaCO3) precipitation occurred extensively within the fractures in the cement matrix, but only partially along fractures at the cement-basalt interface. Exposure of a fractured cement sample to CO2-saturated groundwater (50°C and 10 MPa) over a period of 13 months demonstrated progressive healing of cement fractures by CaCO3(s) precipitation. After reaction with CO2 + H2S-saturated groundwater, CaCO3(s) precipitation also occurred more extensively within the cement fracture than along the cement-basalt caprock interfaces. X-ray diffraction analysis showed that the major cement carbonation products of the CO2 + H2S-saturated groundwater were calcite, aragonite, and vaterite, all consistent with cement carbonation by CO2-saturated groundwater. Although pyrite is thermodynamically favored to form, it was not identified by XRD in this study due to the low H2S concentration. The cement alteration rate into neat Portland cement columns by CO2-saturated groundwater was similar at ~0.02 mm/d, regardless of the cement-curing pressure and temperature (P-T) conditions or the presence of H2S in the brine. The experimental results imply that wellbore cement with fractures is likely to be healed during exposure to CO2- or CO2 + H2S-saturated groundwater, whereas fractures along the cement-caprock interface are likely to remain open and vulnerable to the leakage of CO2.
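The ~0.02 mm/d alteration rate reported above lends itself to a quick order-of-magnitude estimate. The sketch below linearly extrapolates that rate to longer exposures; the linear assumption is an illustration only, since in practice CaCO3 precipitation can seal pores and slow the front, as the healing observations suggest.

```python
# Back-of-envelope extrapolation of the ~0.02 mm/day cement alteration rate
# reported above. Linear extrapolation is an illustrative assumption: in
# practice carbonate precipitation can seal pores and slow the front.

ALTERATION_RATE_MM_PER_DAY = 0.02  # neat Portland cement, CO2-saturated groundwater

def altered_depth_mm(days: float, rate: float = ALTERATION_RATE_MM_PER_DAY) -> float:
    """Thickness of the altered cement rind after `days` of exposure."""
    return rate * days

print(f"13 months (~395 d): {altered_depth_mm(395):.1f} mm")
print(f"30 years          : {altered_depth_mm(30 * 365.25):.0f} mm")
```

Even with the optimistic linear assumption, decades of exposure are needed before the front would cross a typical cement sheath, consistent with the slow degradation reported in field studies.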

  15. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents methods for integrating the parts that constitute a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristic features. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside each product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs; as a result, an improved product consistent with the derived relation function structure was actually created.

  16. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents methods for integrating the parts that constitute a product. A new relation function concept and its structure are introduced to analyze the relationships among component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect characteristic features. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside each product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs; as a result, an improved product consistent with the derived relation function structure was actually created.

  17. A new analytical model for conduction heating during the SAGD circulation phase

    Energy Technology Data Exchange (ETDEWEB)

    Duong, A.N. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[ConocoPhillips Canada Resources Corp., Calgary, AB (Canada); Tomberlin, T.A. [ConocoPhillips Canada Resources Corp., Calgary, AB (Canada); Cyrot, M. [Total E and P Canada Ltd., Calgary, AB (Canada)

    2008-10-15

    The steam assisted gravity drainage (SAGD) process has become the common procedure to recover bitumen from Alberta's oilsands. Inter-well communication must be initiated during the start-up phase of a SAGD process. The shape of the initial steam chamber that develops during the circulation phase influences the efficiency of bitumen recovery. As such, the heating conformance distributed along the horizontal wellbores must be well understood. The duration of the start-up phase varies with the characteristics of the oilsand formation and the distance between the wellbores, but it is typically a month to several months. This paper presented a newly developed analytical model that predicts the initial steam chamber. The model improves bitumen recovery efficiency by predicting the mid-point temperature front and heating efficiency of a wellpair during the SAGD circulation phase. The Excel-based model uses the exponential integral solution for radial heating in a long cylinder and superposition in space for multiple heating sources. It can predict the temperature profile if the steam temperatures or pressures are known during the circulation period. Wellbore modeling that includes any variation in distances between the wellbores is critical to both circulation time and heating conformance. This model has an advantage over numerical simulation in terms of reducing computational time and accurately modelling any variation in distance between wellbores. The results can be optimized under various operational conditions, wellbore profiles, tubing sizes and convection flow effects. This easy-to-use model is currently being used by ConocoPhillips Canada to optimize, predict and guide oilsands projects during the start-up phase of a SAGD process. 5 refs., 13 figs.
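The exponential integral solution for radial heating referenced above is the classical line-source conduction solution, ΔT(r, t) = (q / 4πk) · E1(r² / 4αt), and superposition simply sums the contributions of the two wellbores of a pair. The sketch below is a minimal stand-in for that calculation (not the Excel-based model itself); the heating rate q, conductivity k, diffusivity α and well spacing are assumed values for illustration.

```python
import math

def exp1(x: float) -> float:
    """Exponential integral E1(x) via its convergent series,
    E1(x) = -gamma - ln(x) + sum_{n>=1} (-1)**(n+1) x**n / (n * n!).
    Adequate for the moderate arguments (r^2 / 4*alpha*t) used here."""
    euler_gamma = 0.5772156649015329
    total, term = 0.0, 1.0
    for n in range(1, 80):
        term *= -x / n            # term = (-x)**n / n!
        total += -term / n        # adds (-1)**(n+1) * x**n / (n * n!)
    return -euler_gamma - math.log(x) + total

def delta_T(r: float, t: float, q=150.0, k=1.7, alpha=7e-7) -> float:
    """Line-source temperature rise [K] at radius r [m] and time t [s];
    q [W/m], k [W/m/K] and alpha [m^2/s] are illustrative values."""
    return q / (4 * math.pi * k) * exp1(r * r / (4 * alpha * t))

def wellpair_delta_T(x: float, y: float, t: float, spacing=5.0) -> float:
    """Superposition in space of two parallel heaters at y = +/- spacing/2."""
    return (delta_T(math.hypot(x, y - spacing / 2), t)
            + delta_T(math.hypot(x, y + spacing / 2), t))

one_month = 30 * 24 * 3600.0
# The midpoint between the wells heats more than a flank point 5 m away:
print(wellpair_delta_T(0.0, 0.0, one_month), wellpair_delta_T(5.0, 0.0, one_month))
```

Because the solution is analytical, sweeping circulation time, steam temperature or well spacing costs essentially nothing compared with a numerical simulation, which is the advantage the abstract highlights.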

  18. Downhole Temperature Modeling for Non-Newtonian Fluids in ERD Wells

    Directory of Open Access Journals (Sweden)

    Dan Sui

    2018-04-01

    Having precise information about fluid temperatures is critical when planning drilling operations, especially for extended reach drilling (ERD). The objective of this paper is to develop an accurate temperature model that can precisely calculate wellbore temperature distributions. An established semi-transient temperature model for vertical wellbores is extended and improved to include deviated wellbores and more realistic scenarios using non-Newtonian fluids. The temperature model is derived based on an energy balance between the formation and the wellbore. Heat transfer is considered steady-state in the wellbore and transient in the formation through the utilization of a formation cooling effect. In this paper, the energy balance is enhanced by implementing heat generation from drill bit friction and the contact friction force caused by drillpipe rotation. A non-linear geothermal gradient as a function of wellbore inclination is also introduced to extend the model to deviated wellbores. Additionally, the model is improved by considering temperature-dependent drilling fluid transport and thermal properties. Transport properties such as viscosity and density are obtained by lab measurements, which allows for investigation of the effect of non-Newtonian fluid behavior on the heat transfer. Furthermore, applying a non-Newtonian pressure loss model enables an opportunity to evaluate the impact of viscous forces on fluid properties and thus the overall heat transfer. Results from sensitivity analysis of both drilling fluid properties and other relevant parameters will be presented. The main application area of this model is related to optimization of drilling fluid, hydraulics, and wellbore design parameters, ultimately leading to safe and cost efficient operations.
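A common constitutive choice for the shear-thinning drilling fluids discussed above is the power-law (Ostwald-de Waele) model, whose apparent viscosity falls with shear rate. The sketch below shows the idea; the consistency index K and behavior index n are illustrative assumptions, not lab-fit values from the paper.

```python
# Power-law (Ostwald-de Waele) apparent viscosity for a shear-thinning
# drilling fluid: mu_app = K * gamma_dot**(n - 1). K and n below are
# illustrative assumptions, not measured values from the paper.

def apparent_viscosity(gamma_dot: float, K: float = 0.5, n: float = 0.6) -> float:
    """Apparent viscosity [Pa*s] at shear rate gamma_dot [1/s]; n < 1
    gives shear-thinning behavior (viscosity falls as shear rate rises)."""
    return K * gamma_dot ** (n - 1)

for g in (10.0, 100.0, 1000.0):
    print(f"shear rate {g:7.1f} 1/s -> {apparent_viscosity(g):.4f} Pa*s")
```

This shear-rate dependence is what couples the hydraulics to the heat transfer: the velocity profile sets the shear rate, which sets the viscosity, which in turn feeds back into the pressure loss and frictional heat generation terms of the energy balance.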

  19. An integrated acquisition, display, and analysis system

    International Nuclear Information System (INIS)

    Ahmad, T.; Huckins, R.J.

    1987-01-01

    The design goal of the ND9900/Genuie was to integrate a high performance data acquisition and display subsystem with a state-of-the-art 32-bit supermicrocomputer. This was achieved by integrating a Digital Equipment Corporation MicroVAX II CPU board with acquisition and display controllers via the Q-bus. The result is a tightly coupled processing and analysis system for Pulse Height Analysis and other applications. The system architecture supports distributed processing, so that acquisition and display functions are semi-autonomous, making the VAX concurrently available for application programs.

  20. PHIDIAS- Pathogen Host Interaction Data Integration and Analysis

    Indian Academy of Sciences (India)

    PHIDIAS- Pathogen Host Interaction Data Integration and Analysis- allows searching of integrated genome sequences, conserved domains and gene expressions data related to pathogen host interactions in high priority agents for public health and security ...

  1. Derivation and application of mathematical model for well test analysis with variable skin factor in hydrocarbon reservoirs

    Directory of Open Access Journals (Sweden)

    Pengcheng Liu

    2016-06-01

    Skin factor is often treated as a constant in most mathematical models for well test analysis in oilfields, but this is a simplification: the actual skin factor changes over time. This paper defined the average permeability of a damaged area as a function of time by using the definition of skin factor, thereby establishing a relationship between a variable skin factor and time. The variable skin factor so derived was introduced into existing traditional models in place of a constant skin factor, and the newly derived mathematical model for well test analysis considering a variable skin factor was solved by Laplace transform. The dimensionless wellbore pressure and its derivative were plotted against dimensionless time on double-logarithmic coordinates, and these plots can be used for type curve fitting. The effects of all the parameters in the expression of the variable skin factor were analyzed based on the dimensionless wellbore pressure and its derivative. Finally, actual well testing data from the Sheng-2 Block, Shengli Oilfield, China, were used to fit the type curves developed, which validates the applicability of the mathematical model.
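Because the model above is solved in Laplace space, producing the type curves requires a numerical inversion back to the time domain. A common choice for smooth pressure-transient solutions is the Gaver-Stehfest algorithm; the sketch below is a generic implementation (the abstract does not state which inversion scheme the authors used), verified against transforms with known originals.

```python
import math

def stehfest_invert(F, t: float, N: int = 12) -> float:
    """Gaver-Stehfest numerical inversion of a Laplace transform F(s) at
    time t. N must be even; N = 12 is a common compromise between accuracy
    and round-off for smooth, non-oscillatory f(t) such as pressure
    transients."""
    ln2 = math.log(2.0)
    half = N // 2
    total = 0.0
    for i in range(1, N + 1):
        v = 0.0
        for k in range((i + 1) // 2, min(i, half) + 1):
            v += (k ** half * math.factorial(2 * k)
                  / (math.factorial(half - k) * math.factorial(k)
                     * math.factorial(k - 1) * math.factorial(i - k)
                     * math.factorial(2 * k - i)))
        total += (-1) ** (half + i) * v * F(i * ln2 / t)
    return total * ln2 / t

# Sanity checks against transforms with known time-domain originals:
print(stehfest_invert(lambda s: 1.0 / s ** 2, t=2.0))       # f(t) = t, so ~2.0
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0))    # f(t) = e**-t, so ~0.3679
```

In practice the same routine would be called with the Laplace-space wellbore pressure solution in place of the toy transforms, evaluated over a grid of dimensionless times to trace out the type curves.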

  2. Abel integral equations analysis and applications

    CERN Document Server

    Gorenflo, Rudolf

    1991-01-01

    In many fields of application of mathematics, progress is crucially dependent on the good flow of information between (i) theoretical mathematicians looking for applications, (ii) mathematicians working in applications in need of theory, and (iii) scientists and engineers applying mathematical models and methods. The intention of this book is to stimulate this flow of information. In the first three chapters (accessible to third year students of mathematics and physics and to mathematically interested engineers) applications of Abel integral equations are surveyed broadly, including determination of potentials, stereology, seismic travel times, spectroscopy, and optical fibres. In subsequent chapters (requiring some background in functional analysis) mapping properties of Abel integral operators and their relation to other integral transforms in various function spaces are investigated, questions of existence and uniqueness of solutions of linear and nonlinear Abel integral equations are treated, and for equatio...

  3. Integrated analysis of genetic data with R

    Directory of Open Access Journals (Sweden)

    Zhao Jing

    2006-01-01

    Genetic data are now widely available. There is, however, an apparent lack of concerted effort to produce software systems for statistical analysis of genetic data compared with other fields of statistics. It is often a tremendous task for end-users to tailor them for particular data, especially when genetic data are analysed in conjunction with a large number of covariates. Here, R (http://www.r-project.org), a free, flexible and platform-independent environment for statistical modelling and graphics, is explored as an integrated system for genetic data analysis. An overview of some packages currently available for analysis of genetic data is given. This is followed by examples of package development and practical applications. With clear advantages in data management, graphics, statistical analysis, programming, internet capability and use of available codes, it is a feasible, although challenging, task to develop it into an integrated platform for genetic analysis; this will require the joint efforts of many researchers.

  4. Integrated care: a comprehensive bibliometric analysis and literature review

    Directory of Open Access Journals (Sweden)

    Xiaowei Sun

    2014-06-01

    Introduction: Integrated care could not only fix up fragmented health care but also improve the continuity of care and the quality of life. Despite the volume and variety of publications, little is known about how ‘integrated care’ has developed. There is a need for a systematic bibliometric analysis of the important features of the integrated care literature. Aim: To investigate the growth pattern, core journals and jurisdictions and identify the key research domains of integrated care. Methods: We searched Medline/PubMed using the search strategy ‘(delivery of health care, integrated [MeSH Terms]) OR (integrated care [Title/Abstract])’ without time and language limits. Second, we extracted the publishing year, journals, jurisdictions and keywords of the retrieved articles. Finally, descriptive statistical analysis by the Bibliographic Item Co-occurrence Matrix Builder and hierarchical clustering by SPSS were used. Results: As many as 9090 articles were retrieved. Results included: (1) the cumulative number of publications on integrated care rose steeply after 1993; (2) all documents were recorded by 1646 kinds of journals, of which 28 were core journals; (3) the USA is the predominant publishing country; and (4) there are six key domains, including: the definition/models of integrated care, interdisciplinary patient care teams, disease management for chronically ill patients, types of health care organizations and policy, information system integration, and legislation/jurisprudence. Discussion and conclusion: Integrated care literature has been most evident in developed countries. International Journal of Integrated Care is highly recommended in this research area. The bibliometric analysis and identification of publication hotspots provides researchers and practitioners with core target journals, as well as an overview of the field for further research in integrated care.

  5. Preliminary Integrated Safety Analysis Status Report

    International Nuclear Information System (INIS)

    Gwyn, D.

    2001-01-01

    This report provides the status of the potential Monitored Geologic Repository (MGR) Integrated Safety Analysis (ISA) by identifying the initial work scope scheduled for completion during the ISA development period, the schedules associated with the tasks identified, safety analysis issues encountered, and a summary of accomplishments during the reporting period. This status covers the period from October 1, 2000 through March 30, 2001.

  6. An integrated system for genetic analysis

    Directory of Open Access Journals (Sweden)

    Duan Xiao

    2006-04-01

    Background Large-scale genetic mapping projects require data management systems that can handle complex phenotypes and detect and correct high-throughput genotyping errors, yet are easy to use. Description We have developed an Integrated Genotyping System (IGS) to meet this need. IGS securely stores, edits and analyses genotype and phenotype data. It stores information about DNA samples, plates, primers, markers and genotypes generated by a genotyping laboratory. Data are structured so that statistical genetic analysis of both case-control and pedigree data is straightforward. Conclusion IGS can model complex phenotypes and contain genotypes from whole genome association studies. The database makes it possible to integrate genetic analysis with data curation. The IGS web site http://bioinformatics.well.ox.ac.uk/project-igs.shtml contains further information.

  7. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (incorporated into the model via the Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain the critical crack length and depth, the failure probability for the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. Model uncertainty analysis and uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanic analysis of the pipe with crack. • Stress evaluation of the pipe with critical crack. • Deterministic-probabilistic structural integrity analysis of gas pipeline. • Integrated estimation of pipeline failure probability by Bayesian method.
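The Bayesian combination of prior reliability information with observed failure data can be illustrated with the standard Gamma-Poisson conjugate update. This is a generic sketch, not the authors' actual model, and the prior parameters and exposure figures are invented for illustration.

```python
# Minimal Gamma-Poisson (conjugate) Bayesian update of a pipeline failure
# rate, in the spirit of combining a prior (e.g. informed by structural
# integrity analysis) with observed failure data. All numbers below are
# invented for illustration; they are not from the Lithuanian case study.

def update_failure_rate(prior_alpha: float, prior_beta: float,
                        failures: int, exposure_km_years: float):
    """Gamma(alpha, beta) prior on the failure rate [per km-year] with
    Poisson-distributed failure counts yields a Gamma posterior."""
    return prior_alpha + failures, prior_beta + exposure_km_years

# Prior: mean rate alpha/beta = 1e-4 failures per km-year, weakly informative.
a0, b0 = 0.5, 5000.0
# Observed: 2 failures over 40,000 km-years of operating experience.
a1, b1 = update_failure_rate(a0, b0, failures=2, exposure_km_years=40000.0)
print(f"posterior mean rate: {a1 / b1:.2e} per km-year")
```

The conjugate form makes the trade-off explicit: the posterior mean is pulled between the prior mean and the observed rate, with weights set by the prior pseudo-exposure (b0) versus the actual exposure.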

  8. An analysis of 3D particle path integration algorithms

    International Nuclear Information System (INIS)

    Darmofal, D.L.; Haimes, R.

    1996-01-01

    Several techniques for the numerical integration of particle paths in steady and unsteady vector (velocity) fields are analyzed. Most of the analysis applies to unsteady vector fields; however, some results apply to steady vector field integration. Multistep, multistage, and some hybrid schemes are considered. It is shown that due to initialization errors, many unsteady particle path integration schemes are limited to third-order accuracy in time. Multistage schemes require at least three times more internal data storage than multistep schemes of equal order. However, for timesteps within the stability bounds, multistage schemes are generally more accurate. A linearized analysis shows that the stability of these integration algorithms is determined by the eigenvalues of the local velocity tensor. Thus, the accuracy and stability of the methods are interpreted with concepts typically used in critical point theory. This paper shows how integration schemes can lead to erroneous classification of critical points when the timestep is finite and fixed. For steady velocity fields, we demonstrate that timesteps outside of the relative stability region can lead to similar integration errors. From this analysis, guidelines for accurate timestep sizing are suggested for both steady and unsteady flows. In particular, using simulation data for the unsteady flow around a tapered cylinder, we show that accurate particle path integration requires timesteps which are at most on the order of the physical timescale of the flow.
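The accuracy gap between low- and high-order schemes discussed above is easy to demonstrate on a steady velocity field whose exact particle paths are known circles. The sketch below compares forward Euler with classical RK4 over one revolution; the field and timestep are chosen for illustration and are not taken from the paper's tapered-cylinder test case.

```python
import math

# Euler vs. classical RK4 particle-path integration in the steady velocity
# field u = (-y, x) (solid-body rotation), whose exact paths are circles.

def velocity(p):
    x, y = p
    return (-y, x)

def euler_step(p, h):
    u, v = velocity(p)
    return (p[0] + h * u, p[1] + h * v)

def rk4_step(p, h):
    def shift(p, k, c):
        return (p[0] + c * k[0], p[1] + c * k[1])
    k1 = velocity(p)
    k2 = velocity(shift(p, k1, h / 2))
    k3 = velocity(shift(p, k2, h / 2))
    k4 = velocity(shift(p, k3, h))
    return (p[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            p[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def integrate(step, p0, h, n):
    p = p0
    for _ in range(n):
        p = step(p, h)
    return p

# One full revolution (period 2*pi) starting at (1, 0); the exact endpoint
# is the starting point itself.
n = 1000
h = 2 * math.pi / n
for name, step in (("euler", euler_step), ("rk4", rk4_step)):
    x, y = integrate(step, (1.0, 0.0), h, n)
    print(name, math.hypot(x - 1.0, y))   # distance from the exact endpoint
```

Forward Euler spirals outward (its amplification factor per step exceeds one for a pure rotation), while RK4 returns to the start to within round-off-scale error at the same timestep, which is exactly the stability-eigenvalue behavior the linearized analysis above predicts.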

  9. Integrity Analysis of Damaged Steam Generator Tubes

    International Nuclear Information System (INIS)

    Stanic, D.

    1998-01-01

    A variety of degradation mechanisms affecting steam generator tubes makes steam generators one of the critical components in nuclear power plants. Depending on their nature, degradation mechanisms cause different types of damage. This requires extensive integrity analysis in order to assess various conditions of crack behavior under operating and accident conditions. Development and application of advanced eddy current techniques for steam generator examination provide good characterization of the damage found. Damage characteristics (shape, orientation and dimensions) may be defined and used for further evaluation of the damage's influence on tube integrity. In comparison with experimental and analytical methods, numerical methods are also efficient tools for integrity assessment. Application of finite element methods provides relatively simple modeling of different types of damage and simulation of various operating conditions. The stress and strain analysis may be performed for elastic and elasto-plastic states, with good ability for visual presentation of results. Furthermore, the fracture mechanics parameters may be calculated. Results obtained by numerical analysis, supplemented with experimental results, are the basis for definition of alternative plugging criteria, which may significantly reduce the number of plugged tubes. (author)

  10. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In the conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among process, safety systems, and operator actions. As independent TH codes do not have models of operator actions and the full set of safety systems, they cannot directly simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulations will be a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with the stochastic models in the dynamic event tree (DET) framework, which provides flexibility to model the integrated response due to better communication, as all the accident elements are in the same model. The advantages of this framework also include: realistic modeling in dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)

  11. Integrative sparse principal component analysis of gene expression data.

    Science.gov (United States)

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multi-dataset techniques, and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
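The sparse-loading idea behind SPCA can be illustrated with a soft-thresholded power iteration on the sample covariance. Note this is a generic single-dataset SPCA heuristic for illustration, not the iSPCA group-penalty estimator with contrasted penalties described above.

```python
import numpy as np

def sparse_pc(X: np.ndarray, lam: float = 0.1, iters: int = 200) -> np.ndarray:
    """First sparse principal loading vector via soft-thresholded power
    iteration on the sample covariance: a generic SPCA heuristic for a
    single dataset, not the iSPCA group-penalty estimator."""
    Xc = X - X.mean(axis=0)              # center each feature
    S = Xc.T @ Xc / len(Xc)              # sample covariance
    # Warm start from the ordinary leading eigenvector (plain power iteration).
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(100):
        v = S @ v
        v /= np.linalg.norm(v)
    # Sparse refinement: threshold small loadings toward zero each step.
    for _ in range(iters):
        w = S @ v
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)   # soft-threshold
        norm = np.linalg.norm(w)
        if norm == 0.0:                  # lam too aggressive: all loadings zeroed
            break
        v = w / norm
    return v

# Toy data: features 0-1 share a strong common factor; features 2-4 are noise.
rng = np.random.default_rng(1)
factor = rng.normal(size=(500, 1))
X = np.hstack([factor + 0.1 * rng.normal(size=(500, 2)),
               rng.normal(size=(500, 3))])

v = sparse_pc(X, lam=0.3)
print("loadings:", np.round(v, 2))   # mass concentrates on features 0 and 1
```

Unlike ordinary PCA, whose leading eigenvector spreads small nonzero weight over every feature, the thresholded iterate zeroes out the noise features entirely, which is what makes the loadings interpretable in the gene expression setting.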

  12. Integrability of dynamical systems algebra and analysis

    CERN Document Server

    Zhang, Xiang

    2017-01-01

    This is the first book to systematically state the fundamental theory of integrability and its development for ordinary differential equations, with emphasis on the Darboux theory of integrability and local integrability together with their applications. It summarizes the classical results of Darboux integrability and its modern development, together with the related Darboux polynomials and their applications in the reduction of Liouville and elementary integrability and in the center-focus problem, the weakened Hilbert 16th problem on algebraic limit cycles, and the global dynamical analysis of some realistic models in fields such as physics, mechanics and biology. Although it can be used as a textbook for graduate students in dynamical systems, it is intended as supplementary reading for graduate students from mathematics, physics, mechanics and engineering in courses related to the qualitative theory, bifurcation theory and the theory of integrability of dynamical systems.

  13. Lectures on functional analysis and the Lebesgue integral

    CERN Document Server

    Komornik, Vilmos

    2016-01-01

    This textbook, based on three series of lectures held by the author at the University of Strasbourg, presents functional analysis in a non-traditional way by generalizing elementary theorems of plane geometry to spaces of arbitrary dimension. This approach leads naturally to the basic notions and theorems. Most results are illustrated by the small ℓp spaces. The Lebesgue integral, meanwhile, is treated via the direct approach of Frigyes Riesz, whose constructive definition of measurable functions leads to optimal, clear-cut versions of the classical theorems of Fubini-Tonelli and Radon-Nikodým. Lectures on Functional Analysis and the Lebesgue Integral presents the most important topics for students, with short, elegant proofs. The exposition style follows the Hungarian mathematical tradition of Paul Erdős and others. The order of the first two parts, functional analysis and the Lebesgue integral, may be reversed. In the third and final part they are combined to study various spaces of continuous and integ...

  14. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

    INTEGRATING PAVEMENT CRACK DETECTION AND ANALYSIS USING AUTONOMOUS UNMANNED AERIAL VEHICLE IMAGERY (AFIT-ENV-MS-15-M-195). APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED.

  15. Strategic Analysis of Technology Integration at Allstream

    OpenAIRE

    Brown, Jeff

    2011-01-01

    Innovation has been defined as the combination of invention and commercialization. Invention without commercialization is rarely, if ever, profitable. For the purposes of this paper the definition of innovation will be further expanded into the concept of technology integration. Successful technology integration not only includes new technology introduction, but also the operationalization of the new technology within each business unit of the enterprise. This paper conducts an analysis of Al...

  16. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of the piping systems for nuclear power plants has grown in both scale and quantity. In addition, higher-quality analysis is nowadays regarded as of major importance from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a data base system in which all information is concentrated. 2. To minimize the manual processes in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of this philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information in the data base. Piping structural analysis can be performed using the analysis subsystem. Isometric piping drawings, mode shapes, etc. can be plotted using the plotting subsystem. A complete analysis report can be produced without manual processing through the report subsystem. (author)

  17. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the scale-economy analysis results for integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new-generation design solutions and building on the large experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect-cycle reactor with some distinctive and characteristic features that greatly simplify the reactor and also contribute to a high level of safety: an integrated primary cooling system, self-pressurization, primary cooling by natural circulation, and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors performed with the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive, at powers larger than 200 MWe, with the cheapest Argentinean electricity option. Due to the reactor pressure vessel construction limit, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation. With forced circulation, 300 MWe can be achieved. (author)

  18. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

    End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named, CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of the end-user Cloud storage for the distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storages in the Grid, which is implemented and commissioned over the world’s largest computing Grid infrastructure, Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  19. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  20. Characteristic Value Method of Well Test Analysis for Horizontal Gas Well

    Directory of Open Access Journals (Sweden)

    Xiao-Ping Li

    2014-01-01

    Full Text Available This paper presents a study of characteristic value method of well test analysis for horizontal gas well. Owing to the complicated seepage flow mechanism in horizontal gas well and the difficulty in the analysis of transient pressure test data, this paper establishes the mathematical models of well test analysis for horizontal gas well with different inner and outer boundary conditions. On the basis of obtaining the solutions of the mathematical models, several type curves are plotted with Stehfest inversion algorithm. For gas reservoir with closed outer boundary in vertical direction and infinite outer boundary in horizontal direction, while considering the effect of wellbore storage and skin effect, the pseudopressure behavior of the horizontal gas well can manifest four characteristic periods: pure wellbore storage period, early vertical radial flow period, early linear flow period, and late horizontal pseudoradial flow period. For gas reservoir with closed outer boundary both in vertical and horizontal directions, the pseudopressure behavior of the horizontal gas well adds the pseudosteady state flow period which appears after the boundary response. For gas reservoir with closed outer boundary in vertical direction and constant pressure outer boundary in horizontal direction, the pseudopressure behavior of the horizontal gas well adds the steady state flow period which appears after the boundary response. According to the characteristic lines which are manifested by pseudopressure derivative curve of each flow period, formulas are developed to obtain horizontal permeability, vertical permeability, skin factor, reservoir pressure, and pore volume of the gas reservoir, and thus the characteristic value method of well test analysis for horizontal gas well is established. Finally, the example study verifies that the new method is reliable. Characteristic value method of well test analysis for horizontal gas well makes the well test analysis
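The Stehfest inversion algorithm mentioned above is a standard numerical Laplace-transform inversion used to generate such type curves. A minimal implementation, checked against the known transform pair F(s) = 1/(s+1) ↔ f(t) = e^(-t), might look like:

```python
from math import factorial, log, exp

def stehfest_coefficients(n):
    """Stehfest weights V_i for an even number of terms n."""
    half = n // 2
    v = []
    for i in range(1, n + 1):
        s = 0.0
        for k in range((i + 1) // 2, min(i, half) + 1):
            s += (k ** half * factorial(2 * k) /
                  (factorial(half - k) * factorial(k) * factorial(k - 1) *
                   factorial(i - k) * factorial(2 * k - i)))
        v.append((-1) ** (half + i) * s)
    return v

def stehfest_invert(F, t, n=12):
    """Approximate f(t) from the Laplace-space solution F(s) by Stehfest's algorithm."""
    ln2 = log(2.0)
    v = stehfest_coefficients(n)
    return ln2 / t * sum(v[i - 1] * F(i * ln2 / t) for i in range(1, n + 1))

# sanity check against a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
```

In well-test practice the function F would be the Laplace-space pseudopressure solution of the mathematical model for the chosen boundary conditions; the exponential pair here is only a correctness check.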

  1. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE) FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available Geographic Information Systems brought greater possibilities to the representation and interpretation of the landscape, as well as integrated analysis. However, this approach does not dispense with technical and methodological substantiation for reaching the computational universe. This work is grounded in ecodynamics and the empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of Multi-criteria Analysis and a Fuzzy Logic Model of Environmental Fragility, taking as a case study the basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed for a reduction in the subjectivism that influences decision criteria, whose factors might have their cartographic expression, respecting the complex integrated landscape.

  2. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  3. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    Science.gov (United States)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady-state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general-purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional, general-purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective model and prepare it for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from some example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.

  4. Bayesian Integrated Data Analysis of Fast-Ion Measurements by Velocity-Space Tomography

    DEFF Research Database (Denmark)

    Salewski, M.; Nocente, M.; Jacobsen, A.S.

    2018-01-01

    Bayesian integrated data analysis combines measurements from different diagnostics to jointly measure plasma parameters of interest such as temperatures, densities, and drift velocities. Integrated data analysis of fast-ion measurements has long been hampered by the complexity of the strongly non...... framework. The implementation for different types of diagnostics as well as the uncertainties are discussed, and we highlight the importance of integrated data analysis of all available detectors....

  5. Building-integrated renewable energy policy analysis in China

    Institute of Scientific and Technical Information of China (English)

    姚春妮; 郝斌

    2009-01-01

    With the dramatic development of renewable energy all over the world, and for the purpose of adjusting the energy structure, the Ministry of Construction of China plans to promote the large-scale application of renewable energy in buildings. In order to ensure the validity of policy-making, this work first applies a method to perform cost-benefit analysis for three kinds of technologies: the building-integrated solar hot water (BISHW) system, building-integrated photovoltaic (BIPV) technology, and the ground water heat pump (GWHP). By selecting a representative city of every climate region, the analysis yields different results for the different climate regions of China and correspondingly different suggestions for policy-making. On the basis of this analysis, the Ministry of Construction (MOC) and the Ministry of Finance of China (MOF) jointly launched the Building-integrated Renewable Energy Demonstration Projects (BIREDP) in 2006. In the demonstration projects, renewable energy takes the place of traditional energy to supply domestic hot water, electricity, air-conditioning and heating. Through carrying out the demonstration projects, the renewable-energy-related market has been expanded. More and more companies and local governments take the opportunity to promote the large-scale application of renewable energy in buildings.

  6. Plug cementing: Horizontal to vertical conditions

    Energy Technology Data Exchange (ETDEWEB)

    Calvert, D.G.; Heathman, J.F.; Griffith, J.E.

    1995-12-31

    This paper presents an in-depth study of cement plug placement that was conducted with large-scale models for the improvement of plug cementing practices and plug integrity. Common hole and workstring geometries were examined with various rheology and density ratios between the drilling fluid and cement. The critical conditions dictating the difference between success and failure for various wellbore angles and conditions were explored, and the mechanisms controlling slurry movement before and after placement are now better understood. An understanding of these mechanisms allows the engineer to better tailor a design to specific hole conditions. Controversial concepts regarding plug-setting practices have been examined and resolved. The cumulative effects of density, rheology, and hole angle are major factors affecting plug success. While the Boycott effect and an extrusion effect were observed to be predominant in inclined wellbores, a spiraling or "roping" effect controls slurry movement in vertical wellbores. Ultimate success of a cement plug can be obtained if allowances are made for these effects in the job design, provided all other previously published recommended placement practices are followed. Results of this work can be applied to many sidetracking and plug-to-abandon operations. Additionally, the understanding of the fluid movement (creep) mechanisms holds potential for use in primary and remedial cementing work, and in controlling the placement of noncementitious fluids in the wellbore.

  7. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  8. The practical implementation of integrated safety management for nuclear safety analysis and fire hazards analysis documentation

    International Nuclear Information System (INIS)

    COLLOPY, M.T.

    1999-01-01

    In 1995 Mr. Joseph DiNunno of the Defense Nuclear Facilities Safety Board issued an approach to describe the concept of an integrated safety management program which incorporates hazard and safety analysis to address a multitude of hazards affecting the public, workers, property, and the environment. Since then the U.S. Department of Energy (DOE) has adopted a policy to systematically integrate safety into management and work practices at all levels so that missions can be completed while protecting the public, workers, and the environment. While the DOE and its contractors possessed a variety of processes for analyzing fire hazards at a facility, activity, and job, the outcomes and assumptions of these processes have not always been consistent for similar types of hazards within the safety analysis and the fire hazard analysis. Although the safety analysis and the fire hazard analysis are driven by different DOE Orders and requirements, these analyses should not be entirely independent, and their preparation should be integrated to ensure consistency of assumptions, consequences, design considerations, and other controls. Under the DOE policy to implement an integrated safety management system, identified hazards must be evaluated and agreed upon to ensure that the public, the workers, and the environment are protected from adverse consequences. The DOE program and contractor management need a uniform, up-to-date reference with which to plan, budget, and manage nuclear programs. It is crucial that DOE understand the hazards and risks necessary to authorize the work to be performed. If integrated safety management is not incorporated into the preparation of the safety analysis and the fire hazard analysis, inconsistencies between assumptions, consequences, design considerations, and controls may occur that affect safety. Furthermore, confusion created by inconsistencies may occur in the DOE process to grant authorization of the work.
In accordance with

  9. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  10. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... designated ``Structure (n, m)'' where (n, m) are the force and displacement degrees of ... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  11. PHIDIAS: a pathogen-host interaction data integration and analysis system

    OpenAIRE

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated ...

  12. The Development and Test of a Sensor for Measurement of the Working Level of Gas-Liquid Two-Phase Flow in a Coalbed Methane Wellbore Annulus.

    Science.gov (United States)

    Wu, Chuan; Ding, Huafeng; Han, Lei

    2018-02-14

    Coalbed methane (CBM) is one kind of clean-burning gas and has been valued as a new form of energy that will be used widely in the near future. When producing CBM, the working level within a CBM wellbore annulus needs to be monitored to dynamically adjust the gas drainage and extraction processes. However, the existing method of measuring the working level does not meet the needs of accurate adjustment, so we designed a new sensor for this purpose. The principle of our sensor is a liquid pressure formula, i.e., the sensor monitors the two-phase flow patterns and obtains the mean density of the two-phase flow according to the pattern recognition result in the first step, and then combines the pressure data of the working level to calculate the working level using the liquid pressure formula. The sensor was tested in both the lab and on site, and the tests showed that the sensor's error was ±8% and that the sensor could function well in practical conditions and remain stable in the long term.
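The liquid pressure formula the sensor relies on reduces to h = p / (ρ_mix · g) once the mean density of the two-phase flow is known. A minimal sketch of that final calculation, with illustrative (assumed) densities, void fraction, and gauge pressure rather than values from the paper:

```python
G = 9.81  # gravitational acceleration, m/s^2

def mixture_density(rho_liquid, rho_gas, gas_void_fraction):
    """Homogeneous mean density of a gas-liquid mixture (assumed no-slip model)."""
    return gas_void_fraction * rho_gas + (1.0 - gas_void_fraction) * rho_liquid

def working_level_height(p_pa, rho_mix):
    """Height of the fluid column above the pressure gauge: h = p / (rho * g)."""
    return p_pa / (rho_mix * G)

# assumed inputs: water + methane at a 25% void fraction, 1 MPa gauge pressure
rho = mixture_density(1000.0, 1.2, 0.25)   # -> about 750.3 kg/m^3
h = working_level_height(1.0e6, rho)       # -> about 135.9 m of mixture column
```

In the sensor described above, the void fraction (and hence the mean density) comes from the flow-pattern recognition step rather than being assumed, which is what distinguishes it from a single-phase level gauge.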

  13. The Development and Test of a Sensor for Measurement of the Working Level of Gas–Liquid Two-Phase Flow in a Coalbed Methane Wellbore Annulus

    Directory of Open Access Journals (Sweden)

    Chuan Wu

    2018-02-01

    Full Text Available Coalbed methane (CBM) is one kind of clean-burning gas and has been valued as a new form of energy that will be used widely in the near future. When producing CBM, the working level within a CBM wellbore annulus needs to be monitored to dynamically adjust the gas drainage and extraction processes. However, the existing method of measuring the working level does not meet the needs of accurate adjustment, so we designed a new sensor for this purpose. The principle of our sensor is a liquid pressure formula, i.e., the sensor monitors the two-phase flow patterns and obtains the mean density of the two-phase flow according to the pattern recognition result in the first step, and then combines the pressure data of the working level to calculate the working level using the liquid pressure formula. The sensor was tested in both the lab and on site, and the tests showed that the sensor’s error was ±8% and that the sensor could function well in practical conditions and remain stable in the long term.

  14. The Development and Test of a Sensor for Measurement of the Working Level of Gas–Liquid Two-Phase Flow in a Coalbed Methane Wellbore Annulus

    Science.gov (United States)

    Wu, Chuan; Ding, Huafeng; Han, Lei

    2018-01-01

    Coalbed methane (CBM) is one kind of clean-burning gas and has been valued as a new form of energy that will be used widely in the near future. When producing CBM, the working level within a CBM wellbore annulus needs to be monitored to dynamically adjust the gas drainage and extraction processes. However, the existing method of measuring the working level does not meet the needs of accurate adjustment, so we designed a new sensor for this purpose. The principle of our sensor is a liquid pressure formula, i.e., the sensor monitors the two-phase flow patterns and obtains the mean density of the two-phase flow according to the pattern recognition result in the first step, and then combines the pressure data of the working level to calculate the working level using the liquid pressure formula. The sensor was tested in both the lab and on site, and the tests showed that the sensor’s error was ±8% and that the sensor could function well in practical conditions and remain stable in the long term. PMID:29443871

  15. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
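The closed-loop eigenvalue evaluation listed above can be illustrated with a small assumed state-space example (the plant matrices, feedback gain, and numbers below are hypothetical, not taken from MATRIXx):

```python
import numpy as np

# Assumed second-order plant x' = Ax + Bu with state feedback u = -Kx
A = np.array([[0.0, 1.0],
              [2.0, -1.0]])   # open-loop plant; one eigenvalue is positive (unstable)
B = np.array([[0.0],
              [1.0]])
K = np.array([[12.0, 4.0]])   # feedback gain, chosen by hand for this sketch

open_loop_eigs = np.linalg.eigvals(A)          # eigenvalues 1 and -2
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
stable = bool(np.all(closed_loop_eigs.real < 0))
```

Moving all closed-loop eigenvalues into the left half-plane is the stability criterion the evaluation checks; the same A - BK matrix feeds the covariance and frequency-response analyses.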

  16. Research on Integrated Analysis Method for Equipment and Tactics Based on Intervention Strategy Discussion

    Institute of Scientific and Technical Information of China (English)

    陈超; 张迎新; 毛赤龙

    2012-01-01

    As the complexity of information warfare increases, its intervention strategy needs to be designed in an integrated environment. However, current research tends to break the internal relation between equipment and tactics, making it difficult to meet the requirements of their integrated analysis. In this paper, the research status quo of the integrated analysis of equipment and tactics is discussed first, some shortcomings of the current methods are then summarized, and an evolution mechanism of the integrated analysis of equipment and tactics is finally given. Based on these, a framework of integrated analysis is proposed. The method's effectiveness is validated by an example.

  17. Locating new uranium occurrence by integrated weighted analysis in Kaladgi basin, Karnataka

    International Nuclear Information System (INIS)

    Sridhar, M.; Chaturvedi, A.K.; Rai, A.K.

    2014-01-01

    This study aims at identifying uranium potential zones by integrated analysis of thematic layers interpreted and derived from airborne radiometric and magnetic data and satellite data, along with available ground geochemical data, in the western part of the Kaladgi basin. Integrated weighted analysis was attempted on spatial datasets that included airborne radiometric data (eU, eTh and %K conc.), litho-structural maps, hydrogeochemical U conc., and geomorphological data pertaining to the study area. The weighted analysis was done in a GIS environment, where the different spatial datasets were brought onto a single platform and analyzed by integration
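The integrated weighted analysis described above amounts to a cell-wise weighted overlay of normalized thematic layers. A minimal sketch with invented layer values and expert weights (none of the numbers below come from the study):

```python
def normalize(layer):
    """Min-max rescale a layer's cell values to the 0-1 range."""
    lo, hi = min(layer), max(layer)
    return [(x - lo) / (hi - lo) for x in layer]

def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of normalized layers -> favourability score."""
    norm = [normalize(layer) for layer in layers]
    return [sum(w * n[i] for w, n in zip(weights, norm))
            for i in range(len(layers[0]))]

# four hypothetical cells, three hypothetical thematic layers
eU      = [2.0, 8.0, 4.0, 10.0]    # ppm, airborne radiometrics
struct  = [0.2, 0.9, 0.5, 0.8]     # structural favourability index
hydro_u = [1.0, 6.0, 2.0, 9.0]     # ppb U in groundwater

score = weighted_overlay([eU, struct, hydro_u], [0.4, 0.2, 0.4])
best_cell = score.index(max(score))
```

In a real GIS workflow each list would be a co-registered raster and the weights would come from expert judgment, but the combination rule is the same weighted sum per cell.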

  18. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

    Full Text Available Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies The purpose of the article is to provide a research tool for an initial assessment of whether a company's integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, and to verify it empirically. In particular, the research addresses whether the reports meet the goal of improving the quality of available information and of covering all factors that influence the organization's ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined content elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out on reports prepared by selected companies operating in the utilities sector. Answering the questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. The analysis of the selected integrated reports showed that the various elements of a report are presented with different levels of accuracy. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without links to other parts of the report – e.g., risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  19. Groundwater monitoring of hydraulic fracturing in California: Recommendations for permit-required monitoring

    Science.gov (United States)

    Esser, B. K.; Beller, H. R.; Carroll, S.; Cherry, J. A.; Jackson, R. B.; Jordan, P. D.; Madrid, V.; Morris, J.; Parker, B. L.; Stringfellow, W. T.; Varadharajan, C.; Vengosh, A.

    2015-12-01

    California recently passed legislation mandating dedicated groundwater quality monitoring for new well stimulation operations. The authors provided the State with expert advice on the design of such monitoring networks. Factors that must be considered in designing a new and unique groundwater monitoring program include: Program design: The design of a monitoring program is contingent on its purpose, which can range from detection of individual well leakage to demonstration of regional impact. The regulatory goals of permit-required monitoring conducted by operators on a well-by-well basis will differ from the scientific goals of a regional monitoring program conducted by the State. Vulnerability assessment: Identifying factors that increase the probability of transport of fluids from the hydrocarbon target zone to a protected groundwater zone enables the intensity of permit-required monitoring to be tiered by risk, and also enables prioritization of regional monitoring of groundwater basins based on vulnerability. Risk factors include well integrity; proximity to existing wellbores and geologic features; wastewater disposal; vertical separation between the hydrocarbon and groundwater zones; and site-specific hydrogeology. Analyte choice: The choice of chemical analytes in a regulatory monitoring program is guided by the goals of detecting impact, assuring public safety, preventing resource degradation, and minimizing cost. Balancing these goals may be best served by a tiered approach in which targeted analysis of specific chemical additives is triggered by significant changes in relevant but more easily analyzed constituents. Such an approach requires characterization of baseline conditions, especially in areas with long histories of oil and gas development. Monitoring technology: Monitoring a deep subsurface process or a long wellbore is more challenging than monitoring a surface industrial source.
The requirement for monitoring multiple groundwater aquifers across

  20. Analysis of Optimal Operation of an Energy Integrated Distillation Plant

    DEFF Research Database (Denmark)

    Li, Hong Wen; Hansen, C.A.; Gani, Rafiqul

    2003-01-01

    The efficiency of manufacturing systems can be significantly increased through diligent application of control based on mathematical models, thereby enabling tighter integration of decision making with systems operation. In the present paper analysis of optimal operation of an energy integrated...

  1. Integrated program of using of Probabilistic Safety Analysis in Spain

    International Nuclear Information System (INIS)

    1998-01-01

    Since 25 June 1986, when the CSN (Nuclear Safety Council) approved the Integrated Program of Probabilistic Safety Analysis, this program has articulated the main activities of the CSN. This document summarizes the activities developed during these years and reviews the Integrated Programme

  2. Containment integrity analysis with SAMPSON/DCRA module

    International Nuclear Information System (INIS)

    Hosoda, Seigo; Shirakawa, Noriyuki; Naitoh, Masanori

    2006-01-01

    The integrity of PWR containment under a severe accident is analyzed using the debris-concrete reaction analysis code. If the core fuel melts through the pressure vessel and the debris accumulates in the reactor cavity at the lower part of the containment, its temperature continues to rise due to decay heat and the debris ablates the concrete floor. In the case that cooling water is supplied to the containment cavity and the amount of debris is limited to 30% of the core fuel, our analyses showed that the debris could be cooled and frozen, so that the integrity of the containment could be maintained. (author)

  3. Measure and integral an introduction to real analysis

    CERN Document Server

    Wheeden, Richard L

    2015-01-01

    Now considered a classic text on the topic, Measure and Integral: An Introduction to Real Analysis provides an introduction to real analysis by first developing the theory of measure and integration in the simple setting of Euclidean space, and then presenting a more general treatment based on abstract notions characterized by axioms and with less geometric content. Published nearly forty years after the first edition, this long-awaited Second Edition also: studies the Fourier transform of functions in the spaces L1, L2, and Lp, 1 < p < 2; shows the Hilbert transform to be a bounded operator on L2, as an application of the L2 theory of the Fourier transform in the one-dimensional case; covers fractional integration and some topics related to mean oscillation properties of functions, such as the classes of Hölder continuous functions and the space of functions of bounded mean oscillation; and derives a subrepresentation formula, which in higher dimensions plays a role roughly similar to the one played by the fundamental theor...

  4. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
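
    As a loose illustration of the connector idea described above: a connector adapts records emitted by one tool to the schema another tool expects by applying declarative transformation rules. All field names, rules and values here are invented for the sketch, not taken from the paper.

```python
# Invented illustration of a software connector: it adapts records from one
# gene expression tool's output format to another tool's input format by
# applying declarative transformation rules. Field names are assumptions.
RULES = [
    # (source field, target field, conversion applied on exchange)
    ("probe_id", "gene", str.upper),
    ("log2_ratio", "expression", float),
]

def connect(record):
    """Transform one source record into the target schema."""
    return {target: convert(record[source]) for source, target, convert in RULES}

sample = {"probe_id": "brca1", "log2_ratio": "1.5"}
```

In the paper's terms, the rule table stands in for the connector's transformation rules, which would in practice be derived from the reference ontology rather than hard-coded.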

  5. Analysis of the efficiency-integration nexus of Japanese stock market

    Science.gov (United States)

    Rizvi, Syed Aun R.; Arshad, Shaista

    2017-03-01

    This paper attempts a novel approach to analysing the Japanese economy through a dual-dimension analysis of its stock market, examining efficiency and market integration. Taking a period of 24 years, this study employs MFDFA and MGARCH to understand how the efficiency and integration of the stock market fared during different business cycle phases of the Japanese economy. The results showed improving efficiency over the time period. For the case of market integration, our findings conform to recent literature on business cycles and stock market integration: every succeeding recession creates a break in integration levels, resulting in a decrease.

  6. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  7. TRAC-CFD code integration and its application to containment analysis

    International Nuclear Information System (INIS)

    Tahara, M.; Arai, K.; Oikawa, H.

    2004-01-01

    Several safety systems utilizing natural driving force have been recently adopted for operating reactors, or applied to next-generation reactor design. Examples of these safety systems are the Passive Containment Cooling System (PCCS) and the Drywell Cooler (DWC) for removing decay heat, and the Passive Auto-catalytic Recombiner (PAR) for removing flammable gas in reactor containment during an accident. DWC is used in almost all Boiling Water Reactors (BWR) in service. PAR has been introduced for some reactors in Europe and will be introduced for Japanese reactors. PCCS is a safety device of next-generation BWR. The functional mechanism of these safety systems is closely related to the transient of the thermal-hydraulic condition of the containment atmosphere. The performance depends on the containment atmospheric condition, which is eventually affected by the mass and energy changes caused by the safety system. Therefore, the thermal fluid dynamics in the containment vessel should be appropriately considered in detail to properly estimate the performance of these systems. A computational fluid dynamics (CFD) code is useful for evaluating detailed thermal hydraulic behavior related to this equipment. However, it also requires a considerable amount of computational resources when it is applied to whole containment system transient analysis. The paper describes the method and structure of the integrated analysis tool, and discusses the results of its application to the start-up behavior analysis of a containment cooling system, a drywell local cooler. The integrated analysis code was developed and applied to estimate the DWC performance during a severe accident. The integrated analysis tool is composed of three codes, TRAC-PCV, CFD-DW and TRAC-CC, and analyzes the interaction of the natural convection and steam condensation of the DWC as well as analyzing the thermal hydraulic transient behavior of the containment vessel during a severe accident in detail. The

  8. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance
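
    Fault-tree cut set generation of the kind IRRAS automates at scale can be illustrated with a toy MOCUS-style expansion of AND/OR gates. This is a sketch of the general technique only, not IRRAS's actual algorithm; the tree below is invented.

```python
# Toy MOCUS-style expansion of a small fault tree into minimal cut sets
# (a sketch of the general technique, not IRRAS's actual algorithm).
TREE = {
    "TOP": ("OR", ["G1", "B3"]),   # gate name -> (gate type, children)
    "G1": ("AND", ["B1", "B2"]),   # names not in TREE are basic events
}

def cut_sets(node, tree):
    """Return the (not yet minimized) cut sets for a gate or basic event."""
    if node not in tree:                       # basic event
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(child, tree) for child in children]
    if gate == "OR":                           # OR: union of children's cut sets
        return [cs for sets in child_sets for cs in sets]
    combos = [frozenset()]                     # AND: cross-product of cut sets
    for sets in child_sets:
        combos = [a | b for a in combos for b in sets]
    return combos

def minimal(sets):
    """Drop any cut set that strictly contains another (minimality)."""
    return [s for s in sets if not any(other < s for other in sets)]
```

For the toy tree, the top event fails if both B1 and B2 fail, or if B3 fails alone; quantification would then sum the cut set probabilities.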

  9. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package comprises the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package.
This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and
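
    Steps 6-8 of the procedure above amount to Monte Carlo propagation with an order-statistics (Wilks) tolerance bound. A minimal sketch, with an invented stand-in formula and assumed parameter PDFs in place of an actual RELAP/SCDAPSIM/MOD4.0 run:

```python
import random
import statistics

# Illustrative stand-in for one RELAP/SCDAPSIM/MOD4.0 run: it maps sampled
# parameter values to a computed safety quantity (e.g. peak clad temperature).
# The formula and parameter names are invented for the sketch.
def run_model(params):
    return 900.0 + 50.0 * params["gap_conductance"] + 30.0 * params["decay_heat"]

# Step 6: associate a PDF with each selected parameter (assumed distributions).
PARAMETER_PDFS = {
    "gap_conductance": lambda: random.gauss(1.0, 0.1),
    "decay_heat": lambda: random.uniform(0.9, 1.1),
}

def uncertainty_bands(n_runs=59):
    """Steps 7-8: random sampling, multiple runs, and an upper uncertainty bound.

    With 59 runs, the sample maximum is a one-sided 95%/95% tolerance limit
    by Wilks' formula (0.95**59 < 0.05).
    """
    results = []
    for _ in range(n_runs):
        sample = {name: draw() for name, draw in PARAMETER_PDFS.items()}
        results.append(run_model(sample))
    return statistics.mean(results), max(results)
```

The 59-run figure is the classical first-order Wilks sample size; higher-order statistics (e.g. the second-highest result from 93 runs) give tighter bounds at the same coverage and confidence.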

  10. Plant-wide integrated equipment monitoring and analysis system

    International Nuclear Information System (INIS)

    Morimoto, C.N.; Hunter, T.A.; Chiang, S.C.

    2004-01-01

    A nuclear power plant equipment monitoring system monitors plant equipment and reports deteriorating equipment conditions. The more advanced equipment monitoring systems can also provide information for understanding the symptoms and diagnosing the root cause of a problem. Maximizing the equipment availability and minimizing or eliminating consequential damages are the ultimate goals of equipment monitoring systems. GE Integrated Equipment Monitoring System (GEIEMS) is designed as an integrated intelligent monitoring and analysis system for plant-wide application for BWR plants. This approach reduces system maintenance efforts and equipment monitoring costs and provides information for integrated planning. This paper describes GEIEMS and how the current system is being upgraded to meet General Electric's vision for plant-wide decision support. (author)

  11. SIGMA: A System for Integrative Genomic Microarray Analysis of Cancer Genomes

    Directory of Open Access Journals (Sweden)

    Davies Jonathan J

    2006-12-01

    Full Text Available Abstract Background The prevalence of high resolution profiling of genomes has created a need for the integrative analysis of information generated from multiple methodologies and platforms. Although the majority of data in the public domain are gene expression profiles, and expression analysis software is available, the increase in array CGH studies has enabled integration of high throughput genomic and gene expression datasets. However, tools for direct mining and analysis of array CGH data are limited. Hence, there is a great need for analytical and display software tailored to cross platform integrative analysis of cancer genomes. Results We have created a user-friendly Java application to facilitate sophisticated visualization and analysis such as cross-tumor and cross-platform comparisons. To demonstrate the utility of this software, we assembled array CGH data representing Affymetrix SNP chip, Stanford cDNA arrays and whole genome tiling path array platforms for cross comparison. This cancer genome database contains 267 profiles from commonly used cancer cell lines representing 14 different tissue types. Conclusion In this study we have developed an application for the visualization and analysis of data from high resolution array CGH platforms that can be adapted for analysis of multiple types of high throughput genomic datasets. Furthermore, we invite researchers using array CGH technology to deposit both their raw and processed data, as this will be a continually expanding database of cancer genomes. This publicly available resource, the System for Integrative Genomic Microarray Analysis (SIGMA) of cancer genomes, can be accessed at http://sigma.bccrc.ca.

  12. Integrating health and environmental impact analysis

    DEFF Research Database (Denmark)

    Reis, S; Morris, G.; Fleming, L. E.

    2015-01-01

    which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive, with the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose...... while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding...... the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession...

  13. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. After the surveys, specific event sequences were selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by comparing the different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, the project is discussed and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  14. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. After the surveys, specific event sequences were selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence-specific and more general findings. The sequence-specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by comparing the different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, the project is discussed and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  15. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate......, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect......) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts....

  16. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from the colored Petri Net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Thereafter, based on the statistical results of simulating the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logical structures of the CPN models, which is reliable and can serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are trustworthy.
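
    A toy sketch of the path-enumeration step: enumerate simple (cycle-free) event paths through a directed graph and pick the one with the highest transition-frequency product. The graph, event names and frequencies below are invented for illustration; the paper's PESG construction and sensitivity analysis are considerably richer.

```python
# Invented toy graph: events and transition frequencies are placeholders;
# the paper derives its PESGs from CPN simulation statistics.
GRAPH = {
    "detect": {"track": 0.9, "drop": 0.1},
    "track": {"identify": 0.7, "drop": 0.3},
    "identify": {"engage": 1.0},
}

def event_paths(graph, start, goal, path=None):
    """Yield all simple (cycle-free) event paths from start to goal."""
    path = (path or []) + [start]
    if start == goal:
        yield path
        return
    for successor in graph.get(start, {}):
        if successor not in path:              # exclude cycle event paths
            yield from event_paths(graph, successor, goal, path)

def key_event_path(graph, start, goal):
    """Pick the path with the highest product of transition frequencies."""
    def score(p):
        result = 1.0
        for a, b in zip(p, p[1:]):
            result *= graph[a][b]
        return result
    return max(event_paths(graph, start, goal), key=score)
```

On the toy graph only one path reaches "engage", so it is trivially the key event path; real models would rank many competing sequence paths.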

  17. INS integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bazakos, Mike

    1991-01-01

    The use of inertial navigation system (INS) measurements to enhance the quality and robustness of motion analysis techniques used for obstacle detection is discussed with particular reference to autonomous vehicle navigation. The approach to obstacle detection used here employs motion analysis of imagery generated by a passive sensor. Motion analysis of imagery obtained during vehicle travel is used to generate range measurements to points within the field of view of the sensor, which can then be used to provide obstacle detection. Results obtained with an INS integrated motion analysis approach are reviewed.

  18. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  19. Double path-integral migration velocity analysis: a real data example

    International Nuclear Information System (INIS)

    Costa, Jessé C; Schleicher, Jörg

    2011-01-01

    Path-integral imaging forms an image with no knowledge of the velocity model by summing over the migrated images obtained for a set of migration velocity models. Double path-integral imaging migration extracts the stationary velocities, i.e. those velocities at which common-image gathers align horizontally, as a byproduct. An application of the technique to a real data set demonstrates that quantitative information about the time migration velocity model can be determined by double path-integral migration velocity analysis. Migrated images using interpolations with different regularizations of the extracted velocities prove the high quality of the resulting time-migration velocity information. The so-obtained velocity model can then be used as a starting model for subsequent velocity analysis tools like migration tomography or other tomographic methods
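
    The core idea, that the ratio of the velocity-weighted stack to the unweighted stack recovers the stationary velocity, can be illustrated with a one-dimensional toy. The Gaussian amplitude below stands in for the coherence of real migrated images near the stationary velocity; no actual migration is performed, and all numbers are invented.

```python
import math

# Toy 1-D illustration (not a migration code): for each trial velocity v,
# pretend the migrated image has amplitude A(v) peaking where common-image
# gathers flatten (the stationary velocity).
velocities = [1500.0 + 10.0 * i for i in range(301)]   # trial velocities (m/s)
v_true = 2800.0                                        # assumed stationary velocity
amplitude = [math.exp(-((v - v_true) / 150.0) ** 2) for v in velocities]

# Path-integral image: stack of migrated amplitudes over all trial velocities.
image = sum(amplitude)
# Double path integral: the same stack, weighted by velocity.
weighted = sum(v * a for v, a in zip(velocities, amplitude))
# Their ratio recovers the stationary velocity as a byproduct of imaging.
v_stationary = weighted / image
```

In the real method the stacks carry oscillatory weights so that stationary-phase contributions dominate, but the extraction of velocity from a ratio of two path integrals is the same.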

  20. Cross-Border Trade: An Analysis of Trade and Market Integration ...

    African Journals Online (AJOL)

    An assessment of cross-border trade and market integration reveals that inhabitants of the border areas have become economically, socially and politically integrated in spite of the conflict over the Bakassi Peninsula. Based on empirical analysis, bilateral agreements between Nigeria and Cameroon have made negligible ...

  1. An integrated 3D design, modeling and analysis resource for SSC detector systems

    International Nuclear Information System (INIS)

    DiGiacomo, N.J.; Adams, T.; Anderson, M.K.; Davis, M.; Easom, B.; Gliozzi, J.; Hale, W.M.; Hupp, J.; Killian, K.; Krohn, M.; Leitch, R.; Lajczok, M.; Mason, L.; Mitchell, J.; Pohlen, J.; Wright, T.

    1989-01-01

    Integrated computer aided engineering and design (CAE/CAD) is having a significant impact on the way design, modeling and analysis is performed, from system concept exploration and definition through final design and integration. Experience with integrated CAE/CAD in high technology projects of scale and scope similar to SSC detectors leads them to propose an integrated computer-based design, modeling and analysis resource aimed specifically at SSC detector system development. The resource architecture emphasizes value-added contact with data and efficient design, modeling and analysis of components, sub-systems or systems with fidelity appropriate to the task. They begin with a general examination of the design, modeling and analysis cycle in high technology projects, emphasizing the transition from the classical islands of automation to the integrated CAE/CAD-based approach. They follow this with a discussion of lessons learned from various attempts to design and implement integrated CAE/CAD systems in scientific and engineering organizations. They then consider the requirements for design, modeling and analysis during SSC detector development, and describe an appropriate resource architecture. They close with a report on the status of the resource and present some results that are indicative of its performance. 10 refs., 7 figs

  2. PHIDIAS: a pathogen-host interaction data integration and analysis system.

    Science.gov (United States)

    Xiang, Zuoshuang; Tian, Yuying; He, Yongqun

    2007-01-01

    The Pathogen-Host Interaction Data Integration and Analysis System (PHIDIAS) is a web-based database system that serves as a centralized source to search, compare, and analyze integrated genome sequences, conserved domains, and gene expression data related to pathogen-host interactions (PHIs) for pathogen species designated as high priority agents for public health and biological security. In addition, PHIDIAS allows submission, search and analysis of PHI genes and molecular networks curated from peer-reviewed literature. PHIDIAS is publicly available at http://www.phidias.us.

  3. An Integrated Gait and Balance Analysis System to Define Human Locomotor Control

    Science.gov (United States)

    2016-04-29

    An Integrated Gait and Balance Analysis System to Define Human Locomotor Control (W911NF-14-R-0009). The system was built to test hypotheses the investigators developed about how people walk. Walking is a complicated task that requires the motor coordination across

  4. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements into perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  5. A multilayered integrated sensor for three-dimensional, micro total analysis systems

    International Nuclear Information System (INIS)

    Xiao, Jing; Song, Fuchuan; Seo, Sang-Woo

    2013-01-01

    This paper presents a layer-by-layer integration approach of different functional devices and demonstrates a heterogeneously integrated optical sensor featuring a micro-ring resonator and a high-speed thin-film InGaAs-based photodetector co-integrated with a microfluidic droplet generation device. A thin optical device structure allows a seamless integration with other polymer-based devices on a silicon platform. The integrated sensor successfully demonstrates its transient measurement capability of two-phase liquid flow in a microfluidic droplet generation device. The proposed approach represents an important step toward fully integrated micro total analysis systems. (paper)

  6. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor heat exchangers. These features defined the main trends in the experimental investigations and in the verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations performed on the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. The assessment of RELAP5/mod3 applicability for accident analysis in integral reactors is presented.

  7. [Integrated health care organizations: guideline for analysis].

    Science.gov (United States)

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHO) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing the IHO's performance in relation to their final objectives of improving the efficiency and continuity of health care by an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of services' payment mechanisms); and internal elements (model of government, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  8. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    As a new formulation in structural analysis, the Integrated Force Method (IFM) has been successfully applied to many structures in civil, mechanical, and aerospace engineering owing to its accurate estimation of forces in computation, and it is now being extended to the probabilistic domain. To assess the effect of uncertainty in system optimization and identification, the probabilistic sensitivity analysis of IFM was investigated in this study. A set of stochastic sensitivity analysis formulations of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application, and its efficiency and accuracy were substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite element analysis and stochastic design sensitivity analysis are almost identical.
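The comparison described above (a perturbation-based sensitivity estimate checked against direct Monte Carlo simulation) can be sketched in miniature. The single-degree-of-freedom example, its numbers, and the normal stiffness distribution below are illustrative assumptions, not taken from the paper:

```python
import random

# Hypothetical one-DOF example: displacement u = F / k with uncertain
# stiffness k ~ Normal(mu_k, sigma_k).  First-order perturbation estimate
# of Var(u), checked against a direct Monte Carlo estimate.

F = 10.0                      # applied force (assumed)
mu_k, sigma_k = 100.0, 2.0    # stiffness mean and std (assumed)

# First-order perturbation: Var(u) ~ (du/dk)^2 * Var(k), with du/dk = -F/k^2
du_dk = -F / mu_k**2
var_perturbation = du_dk**2 * sigma_k**2

# Direct Monte Carlo check
random.seed(0)
samples = [F / random.gauss(mu_k, sigma_k) for _ in range(200_000)]
mean_u = sum(samples) / len(samples)
var_mc = sum((u - mean_u)**2 for u in samples) / (len(samples) - 1)

print(var_perturbation, var_mc)  # the two estimates should agree closely
```

For small coefficients of variation (here 2%), the first-order estimate tracks the sampled variance well; the Monte Carlo run is the independent check the abstract refers to.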

  9. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    Science.gov (United States)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  10. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)
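The abstract does not reproduce the expansion itself; as generic background, the power series of an operator exponential and the Lie-Trotter product formula, which underlies the time-slicing used in path-integral constructions, read:

```latex
% Power series of an operator exponential, and the Lie-Trotter product
% formula (generic identities, not Tsvetkov's specific procedure):
e^{\hat{A}} = \sum_{n=0}^{\infty} \frac{\hat{A}^{n}}{n!},
\qquad
e^{\hat{A}+\hat{B}} = \lim_{N\to\infty}\left(e^{\hat{A}/N}\, e^{\hat{B}/N}\right)^{N}.
```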

  11. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using mill...

  12. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method for integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behaviour of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes
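For reference, the standard definitions behind integro-differentiation of non-integral order (the abstract does not state which convention the authors adopt; the Riemann-Liouville integral and Caputo derivative below are the common choices):

```latex
% Riemann-Liouville fractional integral of order \alpha > 0:
(I^{\alpha} f)(t) = \frac{1}{\Gamma(\alpha)} \int_{0}^{t} (t-\tau)^{\alpha-1} f(\tau)\, d\tau .
% Caputo fractional derivative of order \alpha, with n-1 < \alpha < n:
(D^{\alpha} f)(t) = \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-\tau)^{\,n-\alpha-1} f^{(n)}(\tau)\, d\tau .
```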

  13. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  14. 3-D fracture analysis using a partial-reduced integration scheme

    International Nuclear Information System (INIS)

    Leitch, B.W.

    1987-01-01

    This paper presents details of 3-D elastic-plastic analyses of an axially orientated external surface flaw in an internally pressurized thin-walled cylinder and discusses the variation of the J-integral values around the crack tip. A partial-reduced-integration-penalty method is introduced to minimize this variation of the J-integral near the crack tip. Utilizing 3-D symmetry, an eighth segment of a tube containing an elliptically shaped external surface flaw is modelled using 20-noded isoparametric elements. The crack-tip elements are collapsed to form a 1/r stress singularity about the curved crack front. The finite element model is subjected to internal pressure and axial pressure-generated loads. The virtual crack extension method is used to determine linear elastic stress intensity factors from the J-integral results at various points around the crack front. Despite the different material constants and the thinner wall thickness in this analysis, the elastic results compare favourably with those obtained by other researchers. The nonlinear stress-strain behaviour of the tube material is modelled using an incremental theory of plasticity. Variations of the J-integral values around the curved crack front of the 3-D flaw were seen. These variations could not be resolved by neglecting the J-integral results of the immediate crack-tip elements in favour of the more remote contour paths, nor smoothed out when all the path results were averaged. Numerical incompatibilities in the 20-noded 3-D finite elements used to model the surface flaw were found. A partial-reduced integration scheme, using a combination of full and reduced integration elements, is proposed to determine J-integral results for 3-D fracture analyses. This procedure is applied to the analysis of an external semicircular surface flaw projecting halfway into the tube wall thickness. Examples of the J-integral values, before and after the partial-reduced integration method is employed, are given around the
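For readers outside fracture mechanics, the quantity being evaluated around the crack front is Rice's path-independent J-integral, which in its two-dimensional contour form reads:

```latex
% Rice's J-integral over a contour \Gamma surrounding the crack tip:
% W   = strain energy density,  T_i = traction vector on \Gamma,
% u_i = displacement vector,    x   = direction of crack extension,
% s   = arc length along \Gamma.
J = \int_{\Gamma} \left( W \, dy \;-\; T_i \,\frac{\partial u_i}{\partial x}\, ds \right)
```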

  15. A taxonomy of integral reaction path analysis

    Energy Technology Data Exchange (ETDEWEB)

    Grcar, Joseph F.; Day, Marcus S.; Bell, John B.

    2004-12-23

    W. C. Gardiner observed that achieving understanding through combustion modeling is limited by the ability to recognize the implications of what has been computed and to draw conclusions about the elementary steps underlying the reaction mechanism. This difficulty can be overcome in part by making better use of reaction path analysis in the context of multidimensional flame simulations. Following a survey of current practice, an integral reaction flux is formulated in terms of conserved scalars that can be calculated in a fully automated way. Conditional analyses are then introduced, and a taxonomy for bidirectional path analysis is explored. Many examples illustrate the resulting path analysis and uncover some new results about nonpremixed methane-air laminar jets.

  16. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
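As a minimal illustration of the enrichment analysis such a database supports (not IPAD's actual statistic, which the abstract does not specify), over-representation of a pathway in a gene list is commonly scored with a hypergeometric tail probability:

```python
from math import comb

# Over-representation test: given N genes in the background, K of which
# belong to a pathway, and a study set of n genes containing k pathway
# members, the enrichment p-value is the hypergeometric tail P(X >= k).

def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Toy numbers (assumed): 20,000 background genes, a 100-gene pathway,
# a 200-gene study list with 8 pathway hits (expected ~1 by chance).
p = enrichment_pvalue(20_000, 100, 200, 8)
print(p)  # a small p-value indicates the pathway is enriched
```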

  17. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative

  18. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment-average pressure evaluated from the pressures at both ends. Nodal mass and energy conservation determine the nodal flows and enthalpies, accounting for fluid compression and thermal expansion.
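The segment-integration idea attributed to Meyer can be written schematically; this is a generic integral momentum balance, not the exact MINET formulation:

```latex
% Schematic momentum integral over a segment of k = 1..K linked nodes:
% W = segment mass flow rate, L_k/A_k = node length-to-area ratio,
% P_in, P_out = segment end pressures, \Delta P = distributed losses.
\left( \sum_{k=1}^{K} \frac{L_k}{A_k} \right) \frac{dW}{dt}
  \;=\; P_{\mathrm{in}} - P_{\mathrm{out}}
        - \Delta P_{\mathrm{friction}} - \Delta P_{\mathrm{gravity}}
```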

  19. Integration, warehousing, and analysis strategies of Omics data.

    Science.gov (United States)

    Gedela, Srinubabu

    2011-01-01

    "-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next-generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts in addition to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need a multitude of results at the various levels of cellular organization, obtained by utilizing different experimental designs, data analysis strategies, and methodologies. Hence the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies for merging Omics data via semantic data carriers, discusses controlled vocabularies expressed as eXtensible Markup Language (XML) documents, and provides practical guidance, databases, and software links supporting the integration of Omics data.
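A toy example of the kind of XML-carried annotation the chapter discusses (the schema below is invented for illustration, not a real Omics standard):

```python
import xml.etree.ElementTree as ET

# A minimal XML record carrying a controlled-vocabulary annotation (here a
# Gene Ontology term) alongside a measurement, of the kind used to merge
# Omics datasets across experiments.  Schema and values are illustrative.
record = """
<sample id="S1">
  <annotation ontology="GO" term="GO:0006096" label="glycolysis"/>
  <measurement gene="PGK1" value="12.4" unit="TPM"/>
</sample>
"""
root = ET.fromstring(record)
ann = root.find('annotation')
print(root.get('id'), ann.get('term'), root.find('measurement').get('gene'))
```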

  20. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
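The minimax concave penalty used as the outer penalty above has a simple closed form; the sketch below shows plain MCP and its univariate thresholding update (the composite MCP-plus-ridge/Lasso construction of the paper is not reproduced):

```python
def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty (MCP) for a coefficient of magnitude |t|."""
    t = abs(t)
    if t <= gamma * lam:
        return lam * t - t * t / (2.0 * gamma)   # concave region
    return 0.5 * gamma * lam * lam               # flat beyond gamma*lam

def mcp_threshold(z, lam, gamma):
    """Univariate MCP update for least squares with unit design (gamma > 1),
    the building block of a coordinate descent sweep."""
    if abs(z) <= gamma * lam:
        soft = max(abs(z) - lam, 0.0) * (1 if z >= 0 else -1)
        return soft / (1.0 - 1.0 / gamma)
    return z   # large coefficients are left unpenalized (unbiasedness)
```

Small inputs are shrunk to zero, moderate ones are soft-thresholded and rescaled, and large ones pass through unchanged, which is what distinguishes MCP from the Lasso.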

  1. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    This report summarizes the conclusions, analytical processes, and analytical results of the analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, in support of the WIPP Technical Assessment Team (TAT) activities to determine, to the extent feasible, the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and the release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, described below, and draws conclusions based on those results: (1) Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003, Pacific Northwest National Laboratory, December 2014; (2) Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617, Savannah River National Laboratory, December 2014; (3) Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015, Lawrence Livermore National Laboratory, January 2015. This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  2. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  3. Integrated dynamic modeling and management system mission analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, A.K.

    1994-12-28

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied.

  4. Integrated dynamic modeling and management system mission analysis

    International Nuclear Information System (INIS)

    Lee, A.K.

    1994-01-01

    This document summarizes the mission analysis performed on the Integrated Dynamic Modeling and Management System (IDMMS). The IDMMS will be developed to provide the modeling and analysis capability required to understand the TWRS system behavior in terms of the identified TWRS performance measures. The IDMMS will be used to demonstrate in a verified and validated manner the satisfactory performance of the TWRS system configuration and assurance that the requirements have been satisfied

  5. J-integral evaluation and stability analysis in the unstable ductile fracture

    International Nuclear Information System (INIS)

    Miyoshi, Toshiro; Yoshida, Yuichiro; Shiratori, Masaki.

    1984-01-01

    Unstable ductile fracture is an important problem for the structural stability of line pipes, nuclear reactor piping and so on, and research on the fracture mechanics parameters which control the onset of stable and unstable crack growth has attracted interest. At present, the proposed parameters include the T-modulus based on the J-integral, the crack tip opening angle (CTOA), the crack opening angle (COA) averaged over the crack growth region, the plastic work coefficient, and so on. Research on the effectiveness and inter-relation of these parameters is divided into a generation phase and an application phase, and these studies reported that the T-modulus, CTOA and COA all took almost constant values during crack growth, except for an initial transition period. In order to decide which parameter is most appropriate, a detailed analysis is required. In this study, the analysis of unstable ductile fracture of a centre-cracked test piece and a small tensile test piece was carried out by the finite element method, and the evaluation of the J-integral during crack growth, the J-integral resistance value when COA is assumed constant, the form of the unstable fracture initiation point, and the compliance dependence were examined. The method of analysis, the evaluation of the J-integral, the J-integral resistance value, the unstable fracture initiation point and the stability diagram are described. (Kako, I.)

  6. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of In formation and Intelligent Systems Engineering, Hiroshima (Japan)

    2004-07-01

    This paper focuses on the optimal analysis of worldwide recycling activities associated with managing the logistics and production activities of global manufacturing, whose activities stretch across national boundaries. The globally integrated logistics and recycling strategies consist of the home country and two free-trading economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly cost, tax rates, local content rules and regulations. Moreover, an optimal analysis of the globally integrated value chain was developed by applying a simulation optimization technique as a decision-making tool. The simulation model was developed and analyzed using the ProModel package, and the results help to identify some of the conditions required to make well-performing logistics and recycling plans in a worldwide collaborative manufacturing environment. (orig.)

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions, adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree editors used for event tree rules, recovery rules, and end state partitioning
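As a minimal illustration of the cut set generation such tools perform (a toy algorithm, not IRRAS's implementation), the minimal cut sets of a small fault tree can be computed by boolean expansion:

```python
from itertools import product

# Toy fault tree (assumed structure): TOP = OR(AND(A, B), C).
# Gates are ('AND'|'OR', children); leaves are basic-event names.

def cut_sets(node):
    """All cut sets of the (sub)tree rooted at node, as frozensets of events."""
    if isinstance(node, str):
        return [frozenset([node])]
    op, children = node
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':                       # any child's cut set fails the gate
        return [cs for sets in child_sets for cs in sets]
    # AND: every child must fail, so take unions over the cross product
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    """Keep only minimal cut sets (drop proper supersets)."""
    return sorted({s for s in sets if not any(o < s for o in sets)},
                  key=lambda s: (len(s), sorted(s)))

tree = ('OR', [('AND', ['A', 'B']), 'C'])
print(minimize(cut_sets(tree)))  # minimal cut sets: {C} and {A, B}
```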

  8. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    Science.gov (United States)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with limited increase in budget. The Columbia report has criticized NASA for its lack of insight and technical integration impacting its ability to provide safety. The Aldridge report is advocating NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program a small team of engineers at MSFC were asked to propose a process for performing a system level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process which focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases, are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system level technical integration is discussed in detail. Proposed definitions are offered to clarify this discussion and the general systems integration dialog. VIPA

  9. Study on integrated design and analysis platform of NPP

    International Nuclear Information System (INIS)

    Lu Dongsen; Gao Zuying; Zhou Zhiwei

    2001-01-01

    Many calculation software packages have been developed for nuclear system design and safety analysis, such as structural design software, fuel design and management software, thermal-hydraulic analysis software, and severe accident simulation software. This study integrates these packages into a single platform and develops a visual modeling tool for Retran and NGFM90. The platform also provides a distributed calculation method for coupled calculations between different software packages. The study will improve the design and analysis of NPPs

  10. The integrated microbial genome resource of analysis.

    Science.gov (United States)

    Checcucci, Alice; Mengoni, Alessio

    2015-01-01

    Integrated Microbial Genomes and Metagenomes (IMG) is a biocomputational system that provides information and support for the annotation and comparative analysis of microbial genomes and metagenomes. IMG has been developed by the US Department of Energy (DOE) Joint Genome Institute (JGI). The IMG platform contains both draft and complete genomes, sequenced by the Joint Genome Institute, together with other publicly available genomes. Genomes of strains belonging to the Archaea, Bacteria, and Eukarya domains are present, as well as those of viruses and plasmids. Here, we describe some essential features of the IMG system and a case study of pangenome analysis.

  11. Brain network analysis: separating cost from topology using cost-integration.

    Directory of Open Access Journals (Sweden)

    Cedric E Ginestet

    Full Text Available A statistically principled way of conducting brain network analysis is still lacking. Comparison of different populations of brain networks is hard because topology is inherently dependent on wiring cost, where cost is defined as the number of edges in an unweighted graph. In this paper, we evaluate the benefits and limitations associated with using cost-integrated topological metrics. Our focus is on comparing populations of weighted undirected graphs that differ in mean association weight, using global efficiency. Our key result shows that integrating over cost is equivalent to controlling for any monotonic transformation of the weight set of a weighted graph. That is, when integrating over cost, we eliminate the differences in topology that may be due to a monotonic transformation of the weight set. Our result holds for any unweighted topological measure, and for any choice of distribution over cost levels. Cost-integration is therefore helpful in disentangling differences in cost from differences in topology. By contrast, we show that the use of the weighted version of a topological metric is generally not a valid approach to this problem. Indeed, we prove that, under weak conditions, the use of the weighted version of global efficiency is equivalent to simply comparing weighted costs. Thus, we recommend the reporting of (i) differences in weighted costs and (ii) differences in cost-integrated topological measures with respect to different distributions over the cost domain. We demonstrate the application of these techniques in a re-analysis of an fMRI working memory task. We also provide a Monte Carlo method for approximating cost-integrated topological measures. Finally, we discuss the limitations of integrating topology over cost, which may pose problems when some weights are zero, when multiplicities exist in the ranks of the weights, and when one expects subtle cost-dependent topological differences, which could be masked by cost-integration.

  12. Brain Network Analysis: Separating Cost from Topology Using Cost-Integration

    Science.gov (United States)

    Ginestet, Cedric E.; Nichols, Thomas E.; Bullmore, Ed T.; Simmons, Andrew

    2011-01-01

    A statistically principled way of conducting brain network analysis is still lacking. Comparison of different populations of brain networks is hard because topology is inherently dependent on wiring cost, where cost is defined as the number of edges in an unweighted graph. In this paper, we evaluate the benefits and limitations associated with using cost-integrated topological metrics. Our focus is on comparing populations of weighted undirected graphs that differ in mean association weight, using global efficiency. Our key result shows that integrating over cost is equivalent to controlling for any monotonic transformation of the weight set of a weighted graph. That is, when integrating over cost, we eliminate the differences in topology that may be due to a monotonic transformation of the weight set. Our result holds for any unweighted topological measure, and for any choice of distribution over cost levels. Cost-integration is therefore helpful in disentangling differences in cost from differences in topology. By contrast, we show that the use of the weighted version of a topological metric is generally not a valid approach to this problem. Indeed, we prove that, under weak conditions, the use of the weighted version of global efficiency is equivalent to simply comparing weighted costs. Thus, we recommend the reporting of (i) differences in weighted costs and (ii) differences in cost-integrated topological measures with respect to different distributions over the cost domain. We demonstrate the application of these techniques in a re-analysis of an fMRI working memory task. We also provide a Monte Carlo method for approximating cost-integrated topological measures. Finally, we discuss the limitations of integrating topology over cost, which may pose problems when some weights are zero, when multiplicities exist in the ranks of the weights, and when one expects subtle cost-dependent topological differences, which could be masked by cost-integration. 
PMID:21829437
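
    The cost-integration idea described above can be sketched in a few lines: threshold the weighted graph at each cost level (edge count), keeping the strongest edges, compute the unweighted global efficiency at each level, and average over a uniform distribution of costs. The code below is a minimal stdlib-Python illustration, not the authors' implementation; note how a monotonic transformation of the weights (here, squaring) leaves the result unchanged, which is the paper's key result.

```python
from collections import deque

def global_efficiency(n, edges):
    """Unweighted global efficiency: mean of 1/d(i,j) over all ordered node pairs."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    total = 0.0
    for s in range(n):
        # BFS shortest-path lengths from s
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    pairs = n * (n - 1)
    return total / pairs if pairs else 0.0

def cost_integrated_efficiency(weights, n, cost_levels=None):
    """Threshold the weighted graph at each cost (edge count), keeping the
    strongest edges, and average unweighted efficiency over cost levels
    (uniform distribution over costs here)."""
    ranked = sorted(weights, key=weights.get, reverse=True)  # edges by weight, desc
    if cost_levels is None:
        cost_levels = range(1, len(ranked) + 1)
    effs = [global_efficiency(n, ranked[:k]) for k in cost_levels]
    return sum(effs) / len(effs)
```

    Because only the weight *ranking* enters the thresholding, any monotonic reweighting yields exactly the same cost-integrated value.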

  13. The ASDEX integrated data analysis system AIDA

    International Nuclear Information System (INIS)

    Grassie, K.; Gruber, O.; Kardaun, O.; Kaufmann, M.; Lackner, K.; Martin, P.; Mast, K.F.; McCarthy, P.J.; Mertens, V.; Pohl, D.; Rang, U.; Wunderlich, R.

    1989-11-01

    For about two years, the ASDEX integrated data analysis system (AIDA), which combines the database (DABA) and the statistical analysis system (SAS), has been successfully in operation. Besides a considerable, but meaningful, reduction of the 'raw' shot data, it offers the advantage of carefully selected and precisely defined datasets, which are easily accessible for informative tabular data overviews (DABA) and multi-shot analysis (SAS). Even rather complicated statistical analyses can be performed efficiently within this system. In this report, we summarise AIDA's main features and give some details on its set-up and on the physical models which have been used for the derivation of the processed data. We also give a short introduction on how to use DABA and SAS. (orig.)

  14. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  15. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Garcia, Humberto; Burr, Tom; Coles, Garill A.; Edmunds, Thomas A.; Garrett, Alfred; Gorensek, Maximilian; Hamm, Luther; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Tzanos, Constantine P.; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  16. Integration Of Facility Modeling Capabilities For Nuclear Nonproliferation Analysis

    International Nuclear Information System (INIS)

    Gorensek, M.; Hamm, L.; Garcia, H.; Burr, T.; Coles, G.; Edmunds, T.; Garrett, A.; Krebs, J.; Kress, R.; Lamberti, V.; Schoenwald, D.; Tzanos, C.; Ward, R.

    2011-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
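
    The inverse problem described in these records, inferring facility conditions from collected observations, can be illustrated with a discrete Bayesian update. The sketch below is purely illustrative: the operating modes, observation symbols, and probabilities are invented for this example and are not drawn from any of the facility modeling tools discussed.

```python
def bayes_update(prior, likelihoods, observation):
    """prior: {mode: P(mode)}; likelihoods: {mode: {obs: P(obs | mode)}}.
    Returns the normalised posterior over modes after one observation."""
    post = {m: prior[m] * likelihoods[m].get(observation, 0.0) for m in prior}
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

def infer_mode(prior, likelihoods, observations):
    """Fold a sequence of observations into a belief over facility modes."""
    belief = dict(prior)
    for obs in observations:
        belief = bayes_update(belief, likelihoods, obs)
    return belief
```

    Repeated observations that are more likely under one mode steadily concentrate the posterior on that mode, which is the essence of inferring functional details from monitoring data.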

  17. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  18. Integration of electrochemistry in micro-total analysis systems for biochemical assays: recent developments.

    Science.gov (United States)

    Xu, Xiaoli; Zhang, Song; Chen, Hui; Kong, Jilie

    2009-11-15

    Micro-total analysis systems (microTAS) integrate different analytical operations like sample preparation, separation and detection into a single microfabricated device. With the outstanding advantages of low cost, satisfactory analytical efficiency and flexibility in design, highly integrated and miniaturized devices based on the microTAS concept have gained widespread application, especially in biochemical assays. Electrochemistry is shown to be quite compatible with microanalytical systems for biochemical assays, because of its attractive merits such as simplicity, rapidity, high sensitivity, reduced power consumption, and sample/reagent economy. This review presents recent developments in the integration of electrochemistry in microdevices for biochemical assays. Ingenious microelectrode design and fabrication methods, and the versatility of electrochemical techniques, are covered. Practical applications of such integrated microsystems in biochemical assays focus on in situ analysis, point-of-care testing and portable devices. Electrochemical techniques are well suited to microsystems, since easy microfabrication of electrochemical elements and a high degree of integration with multiple analytical functions can be achieved at low cost. Such integrated microsystems will play an increasingly important role in the analysis of small-volume biochemical samples. Work is in progress toward new microdevice designs and applications.

  19. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    Science.gov (United States)

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
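
    The thermogram-integration step can be pictured with a toy calculation: subtract a baseline from the differential-power trace and integrate each injection window to obtain per-injection heats. The sketch below uses a crude flat baseline and the trapezoidal rule; it is not NITPIC's automated peak-shape analysis, and the function names and window parameter are illustrative only.

```python
def trapz(y, x):
    """Trapezoidal-rule integral of sampled y(x)."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0 for i in range(len(y) - 1))

def integrate_thermogram(t, power, injection_times, window, baseline=None):
    """Return the heat (area above baseline) of each injection peak.
    t, power: thermogram samples; injection_times: start of each injection;
    window: duration over which each peak is integrated."""
    if baseline is None:
        baseline = min(power)  # crude flat-baseline estimate
    heats = []
    for t0 in injection_times:
        idx = [i for i, ti in enumerate(t) if t0 <= ti <= t0 + window]
        ys = [power[i] - baseline for i in idx]
        xs = [t[i] for i in idx]
        heats.append(trapz(ys, xs))
    return heats
```

    The per-injection heats produced this way form the isotherm that is then fitted, globally across experiments in the protocol above, to a binding model.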

  20. STINGRAY: system for integrated genomic resources and analysis.

    Science.gov (United States)

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

    The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster with Sanger data, since large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  1. Comparative Studies of Traditional (Non-Energy Integration) and Energy Integration of Catalytic Reforming Unit using Pinch Analysis

    Directory of Open Access Journals (Sweden)

    M. Alta

    2012-12-01

    Full Text Available Energy Integration of the Catalytic Reforming Unit (CRU) of Kaduna Refinery and Petrochemicals Company, Kaduna, Nigeria, was carried out using Pinch Technology. The pinch analysis was carried out using Maple. An optimum minimum approach temperature of 20 °C was used to determine the energy target. The pinch point temperature was found to be 278 °C. The utility targets for this minimum approach temperature were found to be 72711839.47 kJ/hr and 87105834.43 kJ/hr for hot and cold utilities respectively. Pinch analysis as an energy integration technique was found to save more energy and utility cost than the traditional (non-integrated) approach. Key words: Pinch point, CRU, Energy Target, Maple
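
    The energy-targeting calculation behind numbers like these is the classic problem table (cascade) algorithm: shift hot and cold stream temperatures by half the minimum approach temperature, balance heat in each temperature interval, and cascade the surpluses downward. The sketch below is a minimal stdlib-Python version with a hypothetical textbook-style four-stream example (CP in MW/K, so targets come out in MW); it is not the CRU stream data from the study.

```python
def problem_table(streams, dt_min):
    """streams: list of (Ts, Tt, CP); hot streams have Ts > Tt.
    Returns (hot_utility, cold_utility, pinch_T) with pinch_T in shifted
    temperature."""
    half = dt_min / 2.0
    shifted = []
    for ts, tt, cp in streams:
        if ts > tt:   # hot stream: shift down by dt_min/2
            shifted.append((ts - half, tt - half, cp, 'hot'))
        else:         # cold stream: shift up by dt_min/2
            shifted.append((ts + half, tt + half, cp, 'cold'))
    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    # Net heat surplus (+) or deficit (-) cascaded through each interval
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if bot <= lo and hi <= top:  # stream spans this interval
                net += cp * (hi - lo) if kind == 'hot' else -cp * (hi - lo)
        heat += net
        cascade.append(heat)
    hot_utility = max(0.0, -min(cascade))          # lift cascade to >= 0
    feasible = [c + hot_utility for c in cascade]
    cold_utility = feasible[-1]
    pinch_T = bounds[feasible.index(min(feasible))]  # zero-flow point
    return hot_utility, cold_utility, pinch_T
```

    The pinch is the shifted temperature where the feasible cascade reaches zero; the minimum hot and cold utility targets fall out of the same cascade.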

  2. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome, together with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  3. Numerical modelling of cuttings transport in horizontal wells using conventional drilling fluids

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y.; Bjorndalen, E.; Kuru, E. [Alberta Univ., Edmonton, AB (Canada)

    2004-07-01

    Some of the problems associated with poor wellbore cleaning include high drag or torque, slower rate of penetration, formation fractures and difficulty in wellbore steering. Some of the factors that affect cuttings transport include drilling fluid velocity, inclination angle, drilling fluid viscosity and drilling rate. The general practice is to stop drilling when necessary to clean boreholes with viscous pills, pipe rotation or drilling fluid circulation. It is important to predict when drilling should be stopped for remedial wellbore cleaning. This can be accomplished with a transient cuttings transport model which can improve drilling hydraulics, particularly in long horizontal well sections and extended reach (ERD) wells. This paper presents a newly developed 1-dimensional transient mechanistic model of cuttings transport with conventional (incompressible) drilling fluids in horizontal wells. The numerically solved model predicts the height of cutting beds as a function of different drilling operational parameters such as fluid flow rate and rheological characteristics, drilling rates, wellbore geometry and drillpipe eccentricity. Sensitivity analysis has demonstrated the effects of these parameters on the efficiency of solids transport. The proposed model can be used in the creation of computer programs designed to optimize drilling fluid rheology and flow rates for horizontal well drilling. 29 refs., 3 tabs., 12 figs.
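
    The behaviour such a transient model predicts can be caricatured with a single mass balance: the cuttings bed grows by settling of suspended cuttings and shrinks by flow erosion. The sketch below is a deliberately simplified explicit-Euler march, not the authors' 1-D mechanistic model, and every coefficient in it is a hypothetical illustration value.

```python
def simulate_bed_height(flow_rate, drilling_rate, hours, dt=0.01,
                        k_dep=0.05, k_ero=0.002, h_max=0.8):
    """Explicit-Euler march of a dimensionless bed height h in [0, h_max].
    Deposition scales with drilling rate (cuttings generation) and available
    headroom; erosion scales with annular flow rate and current bed height."""
    h, t = 0.0, 0.0
    while t < hours:
        deposition = k_dep * drilling_rate * (1.0 - h / h_max)
        erosion = k_ero * flow_rate * h
        h = max(0.0, min(h_max, h + dt * (deposition - erosion)))
        t += dt
    return h
```

    Consistent with the sensitivity analysis described above, a higher annular flow rate settles to a thinner equilibrium bed than a lower one, all else equal.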

  4. Polymer-Cement Composites with Self-Healing Ability for Geothermal and Fossil Energy Applications

    Energy Technology Data Exchange (ETDEWEB)

    Childers, M. Ian; Nguyen, Manh-Thuong; Rod, Kenton A.; Koech, Phillip K.; Um, Wooyong; Chun, Jaehun; Glezakou, Vassiliki-Alexandra; Linn, Diana; Roosendaal, Timothy J.; Wietsma, Thomas W.; Huerta, Nicolas John; Kutchko, Barbara G.; Fernandez, Carlos A.

    2017-05-18

    Sealing of wellbores in geothermal and tight oil/gas reservoirs by filling the annulus with cement is a well-established practice. Failure of the cement as a result of physical and/or chemical stress is a common problem with serious environmental and financial consequences. Numerous alternative cement blends have been proposed for the oil and gas industry. Most of these possess poor mechanical properties, or are not designed to work in high temperature environments. This work reports on a novel polymer-cement composite with remarkable self-healing ability that maintains the required properties of typical wellbore cements and may be stable at most geothermal temperatures. We combine for the first time experimental analysis of physical and chemical properties with density functional theory simulations to evaluate cement performance. The thermal stability and mechanical strength are attributed to the formation of a number of chemical interactions between the polymer and cement matrix including covalent bonds, hydrogen bonding, and van der Waals interactions. Self-healing was demonstrated by sealing fractures with 0.3–0.5 mm apertures, 2 orders of magnitude larger than typical wellbore fractures. This polymer-cement composite represents a major advance in wellbore cementing that could improve the environmental safety and economics of enhanced geothermal energy and tight oil/gas production.

  5. Multi-criteria decision analysis integrated with GIS for radio ...

    African Journals Online (AJOL)

    Multi-criteria decision analysis integrated with GIS for radio astronomical observatory site selection in peninsular of Malaysia. R Umar, Z.Z. Abidin, Z.A. Ibrahim, M.K.A. Kamarudin, S.N. Hazmin, A Endut, H Juahir ...
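
    The core scoring step in GIS-based multi-criteria site selection is a weighted overlay: each candidate grid cell gets a suitability score as the weighted sum of normalised criterion layers. The sketch below is generic; the criterion names, weights, and grid values are hypothetical and not taken from the study above.

```python
def suitability(grid_criteria, weights):
    """grid_criteria: {criterion: 2-D list of normalised scores in [0, 1]}.
    Returns a 2-D suitability map as the weight-normalised sum of criteria."""
    names = list(weights)
    rows = len(grid_criteria[names[0]])
    cols = len(grid_criteria[names[0]][0])
    total_w = sum(weights.values())
    return [[sum(weights[c] * grid_criteria[c][r][k] for c in names) / total_w
             for k in range(cols)]
            for r in range(rows)]
```

    In an AHP-style workflow the weights themselves would come from pairwise comparison of criteria; here they are simply given.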

  6. K West integrated water treatment system subproject safety analysis document

    International Nuclear Information System (INIS)

    SEMMENS, L.S.

    1999-01-01

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System

  7. K West integrated water treatment system subproject safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    SEMMENS, L.S.

    1999-02-24

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  8. Intentional salt clogging: A novel concept for long-term CO2 sealing

    NARCIS (Netherlands)

    Wasch, L.J.; Wollenweber, J.; Tambach, T.J.

    2013-01-01

    Well abandonment in the context of CO2 storage operations demands a mitigation strategy for CO2 leakage along the wellbore. To prevent possible CO2 transport toward the surface and to protect the wellbore material from contact with acid brine, we propose forming a salt seal around the wellbore at

  9. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  10. Vertically integrated analysis of human DNA. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, M.

    1997-10-01

    This project has been oriented toward improving the vertical integration of the sequential steps associated with the large-scale analysis of human DNA. The central focus has been on an approach to the preparation of "sequence-ready" maps, referred to as multiple-complete-digest (MCD) mapping, primarily directed at cosmid clones. MCD mapping relies on simple experimental steps, supported by advanced image-analysis and map-assembly software, to produce extremely accurate restriction-site and clone-overlap maps. We believe that MCD mapping is one of the few high-resolution mapping systems that has the potential for high-level automation. Successful automation of this process would be a landmark event in genome analysis, and the approach could then be extended to other higher organisms, paving the way for cost-effective sequencing of these genomes. Critically, MCD mapping has the potential to provide built-in quality control for sequencing accuracy and to make possible a highly integrated end product even if there are large numbers of discontinuities in the actual sequence.

  11. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using a developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-block function compatible with the MATLAB/SIMULINK tool for dynamic operational characteristics. This allows the integration of existing drive systems with new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  12. Integration of risk analysis, land use planning, and cost analysis

    International Nuclear Information System (INIS)

    Rajen, G.; Sanchez, G.

    1994-01-01

    The Department of Energy (DOE) and the Pueblo of San Ildefonso (Pueblo), which is a sovereign Indian tribe, have often been involved in adversarial situations regarding the Los Alamos National Laboratory (LANL). The Pueblo shares a common boundary with the LANL. This paper describes an on-going project that could alter the DOE and the Pueblo's relationship to one of cooperation; and unite the DOE and the Pueblo in a Pollution Prevention/Waste Minimization, and Integrated Risk Analysis and Land Use Planning effort

  13. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T- test result showed that Prosopis africana seed trade is profitable and ...

  14. Requirement analysis and architecture of data communication system for integral reactor

    International Nuclear Information System (INIS)

    Jeong, K. I.; Kwon, H. J.; Park, J. H.; Park, H. Y.; Koo, I. S.

    2005-05-01

    When digitalizing the Instrumentation and Control (I and C) systems in Nuclear Power Plants (NPPs), a communication network is required for exchanging the digitalized data between I and C equipment. A requirements analysis and an analysis of design elements and techniques are required for the design of a communication network. Through the requirements analysis of code and regulation documents such as NUREG/CR-6082, Section 7.9 of NUREG-0800, IEEE Standard 7-4.3.2 and IEEE Standard 603, the extracted requirements can be used as a design basis and design concept for the detailed design of a communication network in the I and C system of an integral reactor. Design elements and techniques such as physical topology, protocol, transmission media and interconnection devices should be considered when designing a communication network. Each design element and technique should be analyzed and evaluated as a portion of the integrated communication network design. In this report, the basic design requirements related to the design of the communication network are investigated using the code and regulation documents, and an analysis of the design elements and techniques is performed. Based on this investigation and analysis, an overall architecture including the safety communication network and the non-safety communication network is proposed for an integral reactor.

  15. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in-depth analysis of the vast amount of transcription data requires integration with data from several heterogeneous data sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis of Saccharomyces cerevisiae whole-genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis.

  16. Delight2 Daylighting Analysis in Energy Plus: Integration and Preliminary User Results

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, William L.; Hitchcock, Robert J.

    2005-04-26

    DElight is a simulation engine for daylight and electric lighting system analysis in buildings. DElight calculates interior illuminance levels from daylight, and the subsequent contribution required from electric lighting to meet a desired interior illuminance. DElight has been specifically designed to integrate with building thermal simulation tools. This paper updates the DElight capability set, the status of integration into the simulation tool EnergyPlus, and describes a sample analysis of a simple model from the user perspective.
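
    The daylight/electric split that DElight computes can be illustrated with a toy dimming calculation: the electric system supplies whatever fraction of the setpoint illuminance daylight leaves short. The linear control model below is a simplifying assumption for illustration, not DElight's actual lighting-control algorithm.

```python
def electric_light_fraction(daylight_lux, setpoint_lux):
    """Fraction of full electric lighting output needed to reach the setpoint,
    assuming a linear dimming response and additive illuminance."""
    if daylight_lux >= setpoint_lux:
        return 0.0  # daylight alone meets the target
    return (setpoint_lux - daylight_lux) / setpoint_lux
```

    A thermal simulation tool integrating DElight would feed the resulting electric lighting power back in as an internal heat gain, which is why the coupling matters.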

  17. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Science.gov (United States)

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsules. The results demonstrated that our approach provides a faster and more sensitive TGA compared with commercial systems.
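
    One common way such cantilever mass sensors are read out, assumed here since the abstract does not specify the transduction scheme, is via the resonance-frequency shift of the lever treated as a harmonic oscillator with known spring constant k. The numbers in the example are illustrative only.

```python
import math

def mass_from_frequency_shift(k, f0, f_loaded):
    """Effective added mass from the resonance shift of a harmonic oscillator:
    m_eff(f) = k / (2*pi*f)^2, so the added mass is the difference between the
    loaded and unloaded effective masses."""
    m = lambda f: k / (2.0 * math.pi * f) ** 2
    return m(f_loaded) - m(f0)
```

    During a TGA ramp, tracking the resonance while heating gives mass loss as a function of temperature, which is the quantity a thermogravimetric analysis reports.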

  18. Analyzing Developing Country Market Integration using Incomplete Price Data and Cluster Analysis

    NARCIS (Netherlands)

    Ansah, I.G.; Gardebroek, Koos; Ihle, R.; Jaletac, M.

    2015-01-01

    Recent global food price developments have spurred renewed interest in analyzing integration of local markets to global markets. A popular approach to quantify market integration is cointegration analysis. However, local market price data often has missing values, outliers, or short and incomplete

  19. A network analysis of leadership theory : the infancy of integration.

    OpenAIRE

    Meuser, J. D.; Gardner, W. L.; Dinh, J. E.; Hu, J.; Liden, R. C.; Lord, R. G.

    2016-01-01

    We investigated the status of leadership theory integration by reviewing 14 years of published research (2000 through 2013) in 10 top journals (864 articles). The authors of these articles examined 49 leadership approaches/theories, and in 293 articles, 3 or more of these leadership approaches were included in their investigations. Focusing on these articles that reflected relatively extensive integration, we applied an inductive approach and used graphic network analysis as a guide for drawi...

  20. Integrated design and performance analysis of the KO HCCR TBM for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Integrated analysis is performed with the conventional CFD code (ANSYS-CFX). • Overall pressure drop and coolant flow scheme are investigated. • Manifold design is being performed considering flow distribution. - Abstract: To develop tritium breeding technology for a fusion reactor, Korea has participated in the Test Blanket Module (TBM) program in ITER. The He Cooled Ceramic Reflector (HCCR) TBM consists of functional components such as the First Wall (FW), Breeding Zone (BZ), Side Wall (SW), and Back Manifold (BM), and it was designed based on separate analyses for each component in 2012. Building on the individual component analysis models, an integrated model is prepared and a thermal-hydraulic analysis of the HCCR TBM is performed in the present study. The coolant flow distribution from the BM and SW to the FW and BZ, and the resulting structure temperatures, are obtained with the integrated model. It is found that a non-uniform flow rate occurs at the FW and BZ, causing the design limit (550 °C) to be exceeded in some regions. Based on this integrated model, we will perform design optimization to obtain a uniform flow distribution that satisfies the design requirements.
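The flow maldistribution the abstract reports has a simple hydraulic origin that can be sketched numerically. This is not the paper's CFD model; it is a toy parallel-channel network with invented loss coefficients, where all channels share one pressure drop dp = K·Q², so each channel draws flow proportional to 1/sqrt(K).

```python
# Toy flow split among parallel coolant channels (loss coefficients invented).
import math

def split_flow(total_q, loss_coeffs):
    """Split total flow among parallel channels sharing one pressure drop,
    assuming turbulent losses dp = K_i * Q_i**2 in each channel."""
    inv = [1.0 / math.sqrt(k) for k in loss_coeffs]
    s = sum(inv)
    return [total_q * w / s for w in inv]

# a channel with 4x the hydraulic resistance draws half the flow of its peers,
# which is how a hot spot above the temperature limit can develop
flows = split_flow(10.0, [1.0, 1.0, 4.0])
print(flows)
```

Manifold design optimization, as described in the abstract, amounts to adjusting the K values until the split is uniform.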

  1. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

    Full Text Available This paper investigates a framework for real-time formation of autonomous vehicles using a potential field and a variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by a potential field can meet the former requirement, but result in a nonlinear system whose stability analysis is difficult. Our methodology of stability analysis is discussed in terms of the error dynamic system. Transforming coordinates from the inertial frame to the body frame lets the stability analysis focus on the structure instead of particular coordinates. Then, the Jacobian of the reduced system can be calculated. It can be proved that the formation is stable at the equilibrium point of the error dynamic system under the effect of a damping force. For efficient computation, a variational integrator is introduced, which reduces each step to solving algebraic equations. The forced Euler-Lagrange equation in discrete form is used to construct a forced variational integrator for vehicles in a potential field and obstacle environment. By applying the forced variational integrator to the computation of vehicle motion, real-time formation of vehicles in an obstacle environment can be implemented. An algorithm based on the forced variational integrator is designed for a leader-follower formation.
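The variational-integrator idea can be illustrated on the simplest case. This is a generic sketch, not the paper's algorithm: for a Lagrangian L = v²/2 − V(q) with no forcing, the discrete Euler-Lagrange equations reduce to the Störmer-Verlet update, which is exactly the "solving algebraic equations" step the abstract describes and which tracks the true dynamics well over long runs.

```python
# Unforced variational (Stormer-Verlet) integration of q'' = -grad V(q).
import math

def verlet(q0, v0, grad_v, h, steps):
    """Discrete Euler-Lagrange update for L = 0.5*v**2 - V(q)."""
    q_prev = q0
    q = q0 + h * v0 - 0.5 * h * h * grad_v(q0)  # first step from initial velocity
    traj = [q0, q]
    for _ in range(steps - 1):
        q_next = 2 * q - q_prev - h * h * grad_v(q)  # discrete Euler-Lagrange
        q_prev, q = q, q_next
        traj.append(q)
    return traj

# harmonic potential V(q) = 0.5*q**2; exact solution is q(t) = cos(t)
traj = verlet(1.0, 0.0, lambda q: q, h=0.01, steps=1000)
print(abs(traj[-1] - math.cos(10.0)))  # small global error at t = 10
```

The paper's forced version adds a discrete force term to the same update; damping forces enter the same way.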

  2. Technology integrated teaching in Malaysian schools: GIS, a SWOT analysis

    Directory of Open Access Journals (Sweden)

    Habibah Lateh, Vasugiammai Muniandy

    2011-08-01

    , articles and proceeding papers. Research has been continuously done on integrating GIS into the Geography syllabus. Thus, this article describes and discusses the barriers and opportunities of implementing GIS in schools, with a particular focus on how GIS could enhance the process of teaching and learning geography. The purpose of the study is to determine the effectiveness of GIS in enhancing students' interest towards the subject. Barriers that might limit the implementation of GIS in schools are also briefly discussed, as are the capabilities of GIS in schools and teaching with GIS. A SWOT analysis is used to identify the strengths, weaknesses, opportunities and threats of integrating GIS into Malaysian schools. A content analysis was performed using articles from local and foreign publications regarding technology integration and GIS; conference proceedings were also analyzed. This content analysis included 35 articles selected from ICT and GIS publications in Malaysia and abroad, and was done in order to identify the barriers to introducing GIS in schools in Malaysia. The future of GIS in Malaysian schools is addressed in the conclusion.

  3. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees, and (2) provides an all-in-one package for analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool is responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA allows users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
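The abstract mentions Brent's method of root finding and a link budget tool; a minimal sketch of how the two fit together follows. This is not SCENIC code: the margin function and all numbers are hypothetical, and SciPy's `brentq` stands in for whatever implementation ITACA uses.

```python
# Finding the maximum range at which a (toy) link margin hits 0 dB.
import math
from scipy.optimize import brentq

def link_margin_db(range_km):
    """Hypothetical link budget: fixed margin minus free-space loss growth."""
    return 40.0 - 20.0 * math.log10(range_km / 100.0)

# Brent's method needs a bracket where the function changes sign:
# margin is +40 dB at 100 km and -40 dB at 1e6 km.
max_range = brentq(link_margin_db, 100.0, 1e6)
print(round(max_range))  # range (km) at which the margin crosses zero
```

Brent's method combines bisection with inverse quadratic interpolation, so it is robust on brackets like this while converging much faster than bisection alone.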

  4. The ICVSIE: A General Purpose Integral Equation Method for Bio-Electromagnetic Analysis.

    Science.gov (United States)

    Gomez, Luis J; Yucel, Abdulkadir C; Michielssen, Eric

    2018-03-01

    An internally combined volume surface integral equation (ICVSIE) for analyzing electromagnetic (EM) interactions with biological tissue, with wide-ranging diagnostic, therapeutic, and research applications, is proposed. The ICVSIE is a system of integral equations in terms of volume and surface equivalent currents in biological tissue subject to fields produced by externally or internally positioned devices. The system is created by using equivalence principles and solved numerically; the resulting current values are used to evaluate scattered and total electric fields, specific absorption rates, and related quantities. The validity, applicability, and efficiency of the ICVSIE are demonstrated by EM analysis of transcranial magnetic stimulation, magnetic resonance imaging, and neuromuscular electrical stimulation. Unlike previous integral equations, the ICVSIE is stable regardless of the electric permittivities of the tissue or frequency of operation, providing an application-agnostic computational framework for EM-biomedical analysis. Use of the general purpose and robust ICVSIE permits streamlining the development, deployment, and safety analysis of EM-biomedical technologies.

  5. Real analysis an introduction to the theory of real functions and integration

    CERN Document Server

    Dshalalow, Jewgeni H

    2000-01-01

    Designed for use in a two-semester course on abstract analysis, REAL ANALYSIS: An Introduction to the Theory of Real Functions and Integration illuminates the principal topics that constitute real analysis. Self-contained, with coverage of topology, measure theory, and integration, it offers a thorough elaboration of major theorems, notions, and constructions needed not only by mathematics students but also by students of statistics and probability, operations research, physics, and engineering. Structured logically and flexibly through the author's many years of teaching experience, the material is presented in three main sections: Part I, chapters 1 through 3, covers the preliminaries of set theory and the fundamentals of metric spaces and topology; this section can also serve as a text for first courses in topology. Part II, chapters 4 through 7, details the basics of measure and integration and stands independently for use in a separate measure theory course. Part III addresses more advanced topics, includin...

  6. Gas deliverability forecasting - why bother?

    International Nuclear Information System (INIS)

    Trick, M.

    1996-01-01

    According to the author the answer to the question is an unequivocal 'yes' because gas production forecasting is extremely useful for the management and development of a gas field. To model a gas field, one must take into account reservoir performance, sandface inflow performance, wellbore pressure losses, gathering system pressure losses, and field facility performance. The integration of all these factors in a single computer-based model that incorporates proven technology will facilitate the evaluation of various development strategies. A good computer model can help to predict the most cost effective improvement methods, determine economic viability, estimate how much gas is available, evaluate whether drilling wells or adding compression will produce the most reserves, determine optimum placement of compression, evaluate changes to the gathering system, and determine if production from existing wells can be increased by wellbore modifications
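One piece such an integrated field model must include is the sandface inflow performance the abstract lists. A minimal sketch follows, using the classic back-pressure deliverability equation q = C·(pr² − pwf²)ⁿ; the coefficient C and exponent n below are hypothetical, not from the article.

```python
# Toy sandface inflow (back-pressure) deliverability equation.
def gas_rate(pr_kpa, pwf_kpa, c=0.01, n=0.8):
    """Gas rate as a function of reservoir pressure pr and flowing
    bottomhole pressure pwf (hypothetical C and n)."""
    return c * (pr_kpa**2 - pwf_kpa**2) ** n

# rate rises as drawdown (pr - pwf) increases; an integrated model couples
# this curve to wellbore and gathering-system pressure losses
print(gas_rate(10000, 9000) < gas_rate(10000, 5000))  # True
```

In a full field model, pwf at each well is itself the output of the wellbore and gathering-system pressure-loss calculations, which is why the author argues the pieces must be solved together.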

  7. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index—LSI) approaches is presented. Factors related to the occurrence of landslides—such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA)—were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristic (ROC) analysis based on an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at the regional scale.
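The ROC-based validation step can be sketched in a few lines. This is an illustrative example with invented LSI scores, not the paper's data: the "prediction ability" reported (76%) is an area under the ROC curve, which equals the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen stable cell.

```python
# Rank-sum computation of ROC AUC for toy landslide susceptibility scores.
def auc(scores_pos, scores_neg):
    """Probability that a landslide cell outscores a stable cell
    (ties count as half a win)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical LSI scores for cells with and without observed landslides
lsi_landslide = [2.1, 1.7, 2.5, 0.9]
lsi_stable = [0.3, 1.0, -0.5, 0.2]
print(auc(lsi_landslide, lsi_stable))
```

An AUC of 0.5 would mean the map is no better than chance; values around 0.76, as in the paper, indicate acceptable discrimination.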

  8. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hamper rapid progress in the field. Here, we review four of the main tools available for metabolomic data analysis and integration with other 'omics' data (MetaCoreTM, MetaboAnalyst, InCroMAP and 3Omics), selected on the basis of number of users or provided features, highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. IMG 4 version of the integrated microbial genomes comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Pillay, Manoj; Ratner, Anna; Huang, Jinghua; Woyke, Tanja; Huntemann, Marcel; Anderson, Iain; Billis, Konstantinos; Varghese, Neha; Mavromatis, Konstantinos; Pati, Amrita; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2014-01-01

    The Integrated Microbial Genomes (IMG) data warehouse integrates genomes from all three domains of life, as well as plasmids, viruses and genome fragments. IMG provides tools for analyzing and reviewing the structural and functional annotations of genomes in a comparative context. IMG’s data content and analytical capabilities have increased continuously since its first version released in 2005. Since the last report published in the 2012 NAR Database Issue, IMG’s annotation and data integration pipelines have evolved while new tools have been added for recording and analyzing single cell genomes, RNA Seq and biosynthetic cluster data. Different IMG datamarts provide support for the analysis of publicly available genomes (IMG/W: http://img.jgi.doe.gov/w), expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er) and teaching and training in the area of microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu). PMID:24165883

  10. IMG 4 version of the integrated microbial genomes comparative analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Chen, I-Min A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Palaniappan, Krishna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Chu, Ken [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Szeto, Ernest [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Pillay, Manoj [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Ratner, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Huang, Jinghua [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Biological Data Management and Technology Center. Computational Research Division; Woyke, Tanja [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Huntemann, Marcel [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Anderson, Iain [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Billis, Konstantinos [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Varghese, Neha [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). 
Microbial Genome and Metagenome Program; Mavromatis, Konstantinos [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Pati, Amrita [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Ivanova, Natalia N. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program; Kyrpides, Nikos C. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States). Microbial Genome and Metagenome Program

    2013-10-27

    The Integrated Microbial Genomes (IMG) data warehouse integrates genomes from all three domains of life, as well as plasmids, viruses and genome fragments. IMG provides tools for analyzing and reviewing the structural and functional annotations of genomes in a comparative context. IMG’s data content and analytical capabilities have increased continuously since its first version released in 2005. Since the last report published in the 2012 NAR Database Issue, IMG’s annotation and data integration pipelines have evolved while new tools have been added for recording and analyzing single cell genomes, RNA Seq and biosynthetic cluster data. Finally, different IMG datamarts provide support for the analysis of publicly available genomes (IMG/W: http://img.jgi.doe.gov/w), expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er) and teaching and training in the area of microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu).

  11. Integrative Analysis of Metabolic Models – from Structure to Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de [Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Gatersleben (Germany); Schreiber, Falk [Monash University, Melbourne, VIC (Australia); Martin-Luther-University Halle-Wittenberg, Halle (Germany)

    2015-01-26

    The characterization of biological systems with respect to their behavior and functionality, based on versatile biochemical interactions, is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the available biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The Systems Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied to the integrative analysis of the crop plant potato.

  12. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available through the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
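The co-occurrence integration idea can be sketched in a few lines. This is not MetaMeta's actual code: the merging rule below (keep taxa reported by at least two tools, average their abundances) is a simplified stand-in for the pipeline's scoring, and all profiles are invented.

```python
# Toy co-occurrence merge of taxonomic profiles from several tools.
from collections import defaultdict

def merge_profiles(profiles, min_support=2):
    """profiles: list of {taxon: relative abundance} dicts, one per tool.
    Keep taxa reported by >= min_support tools; average their abundances."""
    hits = defaultdict(list)
    for prof in profiles:
        for taxon, ab in prof.items():
            hits[taxon].append(ab)
    return {t: sum(abs_) / len(abs_)          # mean abundance over reporting tools
            for t, abs_ in hits.items()
            if len(abs_) >= min_support}      # co-occurrence filter

tool_a = {"E.coli": 0.5, "B.subtilis": 0.3, "phantom": 0.2}
tool_b = {"E.coli": 0.4, "B.subtilis": 0.4}
tool_c = {"E.coli": 0.6}
print(merge_profiles([tool_a, tool_b, tool_c]))  # "phantom" is filtered out
```

The filter illustrates why the integrated profile can be more reliable than the best single tool: a spurious call made by only one tool does not survive the merge.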

  13. The Integral A Crux for Analysis

    CERN Document Server

    Krantz, Steven G

    2011-01-01

    This book treats all of the most commonly used theories of the integral. After motivating the idea of integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had ca

  14. Penalized differential pathway analysis of integrative oncogenomics studies.

    Science.gov (United States)

    van Wieringen, Wessel N; van de Wiel, Mark A

    2014-04-01

    Through integration of genomic data from multiple sources, we may obtain a more accurate and complete picture of the molecular mechanisms underlying tumorigenesis. We discuss the integration of DNA copy number and mRNA gene expression data from an observational integrative genomics study involving cancer patients. The two molecular levels involved are linked through the central dogma of molecular biology. DNA copy number aberrations abound in the cancer cell. Here we investigate how these aberrations affect gene expression levels within a pathway using observational integrative genomics data of cancer patients. In particular, we aim to identify differential edges between regulatory networks of two groups involving these molecular levels. Motivated by the rate equations, the regulatory mechanism between DNA copy number aberrations and gene expression levels within a pathway is modeled by a simultaneous-equations model, for the one- and two-group case. The latter facilitates the identification of differential interactions between the two groups. Model parameters are estimated by penalized least squares using the lasso (L1) penalty to obtain a sparse pathway topology. Simulations show that the inclusion of DNA copy number data benefits the discovery of gene-gene interactions. In addition, the simulations reveal that cis-effects tend to be over-estimated in a univariate (single gene) analysis. In the application to real data from integrative oncogenomic studies, we show that inclusion of prior information on the regulatory network architecture benefits the reproducibility of all edges. Furthermore, analyses of the TP53 and TGFb signaling pathways between ER+ and ER- samples from an integrative genomics breast cancer study identify reproducible differential regulatory patterns that corroborate existing literature.
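The role of the lasso penalty in producing a sparse pathway topology can be illustrated on a toy problem. This is not the paper's simultaneous-equations model: it is a minimal coordinate-descent lasso, with invented data, regressing one gene's expression on its own copy number (a cis-effect) and an unrelated predictor.

```python
# Toy lasso via coordinate descent with soft-thresholding (data invented).
import numpy as np

def lasso_cd(X, y, lam, iters=200):
    """Minimize ||y - X b||^2 / (2n) + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]          # partial residual
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return b

rng = np.random.default_rng(0)
cn = rng.normal(size=100)                     # DNA copy number of a gene
expr = 2.0 * cn + rng.normal(scale=0.1, size=100)  # strong cis-effect on expression
junk = rng.normal(size=100)                   # unrelated predictor
X = np.column_stack([cn, junk])
b = lasso_cd(X, expr, lam=0.2)
print(b[0] > 1.5, b[1] == 0.0)                # cis-edge kept, spurious edge set to 0
```

The soft-thresholding step is what sets weak coefficients exactly to zero, which in the pathway setting prunes edges and yields the sparse topology the abstract describes.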

  15. Advancing Alternative Analysis: Integration of Decision Science.

    Science.gov (United States)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis, a systematic approach to solving complex problems, offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society, with experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on the other group's findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: (a) engaging the systematic development and evaluation of decision approaches and tools; (b) using case studies to advance the integration of decision analysis into alternatives analysis; (c) supporting transdisciplinary research; and (d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  16. Integration of Design and Control Through Model Analysis

    DEFF Research Database (Denmark)

    Russel, Boris Mariboe; Henriksen, Jens Peter; Jørgensen, Sten Bay

    2000-01-01

    … of the phenomena models representing the process model identify the relationships between the important process and design variables, which help to understand, define and address some of the issues related to the integration of design and control. The model analysis is highlighted through examples involving … processes with mass and/or energy recycle. (C) 2000 Elsevier Science Ltd. All rights reserved.

  17. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

    Highlights: • Solar energy integrated with the natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analyses of the two processes are conducted. • The proposed process can cut down greenhouse gas emissions. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as raw material has a short processing route and well-developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar power generation produces electricity for the reforming unit and for system consumption in the solar-energy-integrated natural-gas-to-methanol system (SGTM). Performance analyses of the conventional natural-gas-to-methanol process and of the solar-energy-integrated process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that solar energy integrated with the natural-gas-to-methanol process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing natural gas consumption by an amount equal to 9.2% of the total natural gas consumed. However, it is not economical at the current technology readiness level, compared with the conventional natural-gas-to-methanol process.
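The carbon-efficiency comparison behind the abstract's claim can be sketched with back-of-envelope arithmetic. All figures below are hypothetical, not the paper's simulation results: the point is only that displacing fuel-gas carbon with solar heat raises the fraction of feed carbon that ends up in methanol.

```python
# Toy carbon-efficiency comparison (all mole counts invented).
def carbon_efficiency(feed_c_mol, fuel_c_mol, methanol_mol):
    """Fraction of input carbon leaving as product; each mole of CH3OH
    carries one mole of carbon."""
    return methanol_mol / (feed_c_mol + fuel_c_mol)

# conventional process burns some natural gas as fuel; the solar-assisted
# process supplies that energy as electricity/heat instead
conventional = carbon_efficiency(feed_c_mol=100.0, fuel_c_mol=10.0, methanol_mol=75.0)
solar_assisted = carbon_efficiency(feed_c_mol=100.0, fuel_c_mol=0.0, methanol_mol=75.0)
print(conventional, solar_assisted)  # solar-assisted case is higher
```

The same accounting, run on real stream tables, is what yields figures like the 9.2% fuel-gas saving quoted in the abstract.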

  18. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Directory of Open Access Journals (Sweden)

    Valeria Toffoli

    2013-12-01

    Full Text Available The design and characteristics of a micro-system for thermogravimetric analysis (TGA in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  19. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    Science.gov (United States)

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

    Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components: independence, sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected into the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation: Understanding the perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis for ensuring rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware and assess the role of antecedents as well as the attributes of community integration itself to

  20. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.
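    The core of such a pointing-performance prediction is propagating a disturbance spectrum through the closed-loop line-of-sight response and integrating the output PSD. The sketch below (plain Python rather than the record's MATLAB/Simulink framework; the transfer function, bandwidth, and disturbance PSD are entirely hypothetical) illustrates the basic computation:

```python
import numpy as np

# Hypothetical closed-loop LOS response: a 10 Hz bandwidth, 0.7-damping loop
# rejects base motion below its bandwidth and passes it above.
f = np.linspace(0.1, 500.0, 20_000)            # frequency grid, Hz
df = f[1] - f[0]
wn, zeta = 2 * np.pi * 10.0, 0.7
s = 1j * 2 * np.pi * f
H = s**2 / (s**2 + 2 * zeta * wn * s + wn**2)  # disturbance-to-LOS transmissibility

# Hypothetical disturbance PSD (urad^2/Hz): flat at low frequency, rolling off.
S_dist = 1e-3 / (1 + (f / 50.0) ** 2)

# Output PSD, and RMS LOS jitter as the integral of the PSD over frequency.
S_los = np.abs(H) ** 2 * S_dist
rms = np.sqrt(np.sum(S_los) * df)
print(f"predicted LOS jitter: {rms:.3f} urad RMS")
```

    The same pattern generalizes to the full integrated model: each disturbance source gets its own PSD and transfer path, and the error budget sums the resulting variances.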

  1. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  2. WebGimm: An integrated web-based platform for cluster analysis, functional analysis, and interactive visualization of results.

    Science.gov (United States)

    Joshi, Vineet K; Freudenberg, Johannes M; Hu, Zhen; Medvedovic, Mario

    2011-01-17

    Cluster analysis methods have been extensively researched, but the adoption of new methods is often hindered by technical barriers in their implementation and use. WebGimm is a free cluster analysis web-service, and an open source general purpose clustering web-server infrastructure designed to facilitate easy deployment of integrated cluster analysis servers based on clustering and functional annotation algorithms implemented in R. Integrated functional analyses and interactive browsing of both clustering structure and functional annotations provide a complete analytical environment for cluster analysis and interpretation of results. The Java Web Start client-based interface is modeled after the familiar cluster/treeview packages, making its use intuitive to a wide array of biomedical researchers. For biomedical researchers, WebGimm provides an avenue to access state-of-the-art clustering procedures. For bioinformatics methods developers, WebGimm offers a convenient avenue to deploy their newly developed clustering methods. The WebGimm server, software and manuals can be freely accessed at http://ClusterAnalysis.org/.

  3. Probabilistic Steady-State Operation and Interaction Analysis of Integrated Electricity, Gas and Heating Systems

    Directory of Open Access Journals (Sweden)

    Lun Yang

    2018-04-01

    Full Text Available The existing studies on probabilistic steady-state analysis of integrated energy systems (IES) are limited to integrated electricity and gas networks or integrated electricity and heating networks. This paper proposes a probabilistic steady-state analysis of integrated electricity, gas and heating networks (EGH-IES). Four typical operation modes of an EGH-IES are presented first. The probabilistic energy flow problem of the EGH-IES, considering its operation modes and correlated uncertainties in wind/solar power and electricity/gas/heat loads, is then formulated and solved by the Monte Carlo method based on Latin hypercube sampling and Nataf transformation. Numerical simulations are conducted on a sample EGH-IES working in the “electricity/gas following heat” mode to verify the proposed probabilistic analysis and to study the effects of uncertainties and correlations on the operation of the EGH-IES, especially uncertainty transmission among the subnetworks.
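    The sampling scheme named in this abstract can be sketched in a few lines. The example below (Python with NumPy/SciPy; the marginal distributions, correlation matrix, and variable names are invented for illustration, and the correlation-matrix adjustment step of the full Nataf transformation is omitted) draws a Latin hypercube sample and couples the inputs in standard-normal space:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, d = 5000, 3  # samples; hypothetical inputs: wind power, electric load, heat load

# Target correlation between the uncertain inputs (illustrative values).
R = np.array([[ 1.0, -0.3, -0.2],
              [-0.3,  1.0,  0.6],
              [-0.2,  0.6,  1.0]])

# 1) Latin hypercube sample: one uniform draw per stratum, per dimension.
strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
u = (strata + rng.uniform(size=(n, d))) / n

# 2) Nataf-style coupling: map to standard normals, then impose the
#    correlation via the Cholesky factor of R.
z = stats.norm.ppf(u) @ np.linalg.cholesky(R).T

# 3) Map back through each marginal's inverse CDF.
wind = stats.weibull_min(2.0, scale=8.0).ppf(stats.norm.cdf(z[:, 0]))  # MW
eload = stats.norm(100.0, 10.0).ppf(stats.norm.cdf(z[:, 1]))           # MW
hload = stats.norm(50.0, 8.0).ppf(stats.norm.cdf(z[:, 2]))             # MW
```

    Each Monte Carlo run of the probabilistic energy flow would then evaluate the deterministic EGH-IES model at one row of (wind, eload, hload).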

  4. Experimental Investigations into CO2 Interactions with Injection Well Infrastructure for CO2 Storage

    Science.gov (United States)

    Syed, Amer; Shi, Ji-Quan; Durucan, Sevket; Nash, Graham; Korre, Anna

    2013-04-01

    Wellbore integrity is an essential requirement for the success of a CO2 storage project, as leakage of CO2 from the injection well or any other abandoned well in the storage complex could not only severely impede the efficiency of CO2 injection and storage but may also adversely impact the surrounding environment. Early research has revealed that improper well completions and/or significant changes in operating bottomhole pressure and temperature could lead to the creation of a microannulus at the cement-casing interface, which may constitute a preferential pathway for potential CO2 leakage during and after the injection period. As part of the European Commission funded CO2CARE project, the current research investigates the sealing behaviour of such a microannulus at the cement-casing interface under simulated subsurface reservoir pressure and temperature conditions and uses the findings to develop a methodology to assess the overall integrity of CO2 storage. A full-scale wellbore experimental test setup was constructed for use under the elevated pressure and temperature conditions encountered in typical CO2 storage sites. The wellbore cell consists of an assembly of concentric elements: a full-scale casing (diameter = 0.1524 m), a cement sheath, and an outer casing. The stainless steel outer ring is intended to simulate the stiffness offered by the reservoir rock against the displacement applied at the wellbore. The Central Loading Mechanism (CLM) consists of four case-hardened shoes that can impart a radial load onto the well casing. The radial movement of the shoes is powered through the synchronised movement of four hydraulically controlled precision jacks, which can impart radial pressures of up to 15 MPa. The cell body is a gas-tight enclosure that houses the wellbore and the central loading mechanism. The setup is enclosed in a laboratory oven which acts as both a temperature and a safety enclosure. Prior to a test, cement mix is set between the casing and

  5. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
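    The MCP referenced above has a closed-form univariate solution, which is what makes coordinate descent cheap: each coordinate update reduces to a thresholding step (applied groupwise in the sparse group version). A minimal, generic sketch of the penalty and its thresholding operator (not the authors' implementation):

```python
import numpy as np

def mcp_penalty(beta, lam, gamma=3.0):
    """Minimax concave penalty: quadratically tapered L1 that levels off at gamma*lam."""
    b = np.abs(beta)
    return np.where(b <= gamma * lam,
                    lam * b - b**2 / (2 * gamma),
                    0.5 * gamma * lam**2)

def mcp_threshold(z, lam, gamma=3.0):
    """Minimizer of 0.5*(z - beta)^2 + MCP(beta) for gamma > 1 (firm thresholding)."""
    z = np.asarray(z, float)
    soft = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
    return np.where(np.abs(z) <= gamma * lam, soft / (1 - 1 / gamma), z)

# Firm thresholding at lam=1, gamma=3: 0.5 -> 0, 2.0 -> 1.5, 10.0 -> 10.0
print(mcp_threshold(np.array([0.5, 2.0, 10.0]), lam=1.0))
```

    Unlike the LASSO's soft thresholding, large coefficients pass through unshrunk, which is the source of MCP's reduced estimation bias; in a group coordinate descent for the AFT model, z would be the partial residual for a coordinate or group.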

  6. Environmental science applications with Rapid Integrated Mapping and analysis System (RIMS)

    Science.gov (United States)

    Shiklomanov, A.; Prusevich, A.; Gordov, E.; Okladnikov, I.; Titov, A.

    2016-11-01

    The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.

  7. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  8. Structural integrity analysis of a steam turbine

    International Nuclear Information System (INIS)

    Villagarcia, Maria P.

    1997-01-01

    One of the most critical components of a power plant is the rotor of the steam turbine. Catastrophic failures in recent decades have promoted the development of life assessment procedures for rotors. Such an assessment requires knowledge of the operating conditions, the component geometry, the material properties, the history of the component, and the size, location and nature of existing flaws. The aim of the present work is the development of a structural integrity analysis procedure for a steam turbine rotor, taking into account the above-mentioned parameters. In this procedure, a thermal stress analysis by finite elements is performed first, in order to obtain the temperature and stress distributions for a subsequent fracture mechanics analysis. The risk of fast fracture due to flaws in the central zone of the rotor is analyzed. The procedure is applied to an operating turbine: the main steam turbine of the Atucha I nuclear power plant. (author)

  9. Integration of targeted health interventions into health systems: a conceptual framework for analysis.

    Science.gov (United States)

    Atun, Rifat; de Jongh, Thyra; Secci, Federica; Ohiri, Kelechi; Adeyi, Olusoji

    2010-03-01

    The benefits of integrating programmes that emphasize specific interventions into health systems to improve health outcomes have been widely debated. This debate has been driven by narrow binary considerations of integrated (horizontal) versus non-integrated (vertical) programmes, and characterized by polarization of views with protagonists for and against integration arguing the relative merits of each approach. The presence of both integrated and non-integrated programmes in many countries suggests benefits to each approach. While the terms 'vertical' and 'integrated' are widely used, they each describe a range of phenomena. In practice the dichotomy between vertical and horizontal is not rigid and the extent of verticality or integration varies between programmes. However, systematic analysis of the relative merits of integration in various contexts and for different interventions is complicated, as there is no commonly accepted definition of 'integration', a term loosely used to describe a variety of organizational arrangements for a range of programmes in different settings. We present an analytical framework which enables deconstruction of the term integration into multiple facets, each corresponding to a critical health system function. Our conceptual framework builds on theoretical propositions and empirical research in innovation studies, in particular the adoption and diffusion of innovations within health systems, and builds on our own earlier empirical research. It brings together the critical elements that affect adoption, diffusion and assimilation of a health intervention, and in doing so enables systematic and holistic exploration of the extent to which different interventions are integrated in varied settings and the reasons for the variation. The conceptual framework and the analytical approach we propose are intended to facilitate analysis in evaluative and formative studies of, and policies on, integration, for use in systematically comparing and

  10. Development of essential system technologies for advanced reactor - Development of natural circulation analysis code for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Park, Ik Gyu; Kim, Jae Hak; Lee, Sang Min; Kim, Tae Wan [Seoul National University, Seoul (Korea)

    1999-04-01

    The objective of this study is to understand the natural circulation characteristics of integral type reactors and to develop a natural circulation analysis code for integral type reactors. This study is focused on asymmetric 3-dimensional flow during natural circulation, such as 1/4 steam generator section isolation and inclination of the reactor systems. Natural circulation experiments were performed using small-scale facilities of the integral reactor SMART (System-Integrated Modular Advanced ReacTor). The CFX4 code was used to investigate the flow patterns and thermal mixing phenomena in the upper pressure header and downcomer. Differences between normal operation of all steam generators and the 1/4 section isolation conditions were observed, and the results were used as data for RETRAN-03/INT code validation. The RETRAN-03 code was modified to develop a natural circulation analysis code for integral type reactors, which was named RETRAN-03/INT. 3-dimensional analysis models for asymmetric flow in integral type reactors were developed using vector momentum equations in RETRAN-03. Analysis results using RETRAN-03/INT were compared with experimental and CFX4 analysis results and showed good agreement. The natural circulation characteristics obtained in this study will provide important and fundamental design features for future small and medium integral reactors. (author). 29 refs., 75 figs., 18 tabs.

  11. Application of symplectic integrator to numerical fluid analysis

    International Nuclear Information System (INIS)

    Tanaka, Nobuatsu

    2000-01-01

    This paper focuses on the application of the symplectic integrator to numerical fluid analysis. For this purpose, we introduce Hamiltonian particle dynamics to simulate fluid behavior. The method is based on both the Hamiltonian formulation of a system and particle methods, and is therefore called Hamiltonian Particle Dynamics (HPD). In this paper, an example HPD application, the behavior of an incompressible inviscid fluid, is solved. In order to improve the spatial accuracy of HPD, CIVA, a highly accurate interpolation method, is incorporated, but the combined method suffers from the problem that the invariants of the system are not conserved in long-time computations. To solve this problem, symplectic time integrators are introduced and their effectiveness is confirmed by numerical analyses. (author)
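    The long-time invariant-conservation issue mentioned in the abstract is exactly what symplectic integrators address. A minimal illustration (a harmonic oscillator standing in for the paper's fluid system) comparing explicit Euler with symplectic Euler:

```python
import numpy as np

def explicit_euler(q, p, dt, n):
    # Explicit Euler on H = (p^2 + q^2)/2: energy grows by (1 + dt^2) per step.
    for _ in range(n):
        q, p = q + dt * p, p - dt * q
    return q, p

def symplectic_euler(q, p, dt, n):
    # Symplectic Euler: update p first, then q with the NEW p. The discrete
    # flow exactly preserves a nearby "shadow" Hamiltonian, so the energy
    # error stays bounded over arbitrarily long integrations.
    for _ in range(n):
        p = p - dt * q
        q = q + dt * p
    return q, p

def energy(q, p):
    return 0.5 * (p**2 + q**2)

e0 = energy(1.0, 0.0)
qe, pe = explicit_euler(1.0, 0.0, 0.01, 100_000)
qs, ps = symplectic_euler(1.0, 0.0, 0.01, 100_000)
print(abs(energy(qe, pe) - e0))  # grows without bound
print(abs(energy(qs, ps) - e0))  # stays small
```

    The two schemes have the same formal order of accuracy; only the symplectic one keeps the invariant bounded, which is why the paper introduces symplectic time integration for HPD.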

  12. Integrated analysis for genotypic adaptation in rice | Das | African ...

    African Journals Online (AJOL)

    Integrated analysis for genotypic adaptation in rice. S Das, RC Misra, MC Pattnaik, SK Sinha. Abstract. Development of varieties with high yield potential coupled with wide adaptability is an important plant breeding objective. The presence of genotype by environment (GxE) interaction plays a crucial role in determining the ...

  13. SVIP-N 1.0: An integrated visualization platform for neutronics analysis

    International Nuclear Information System (INIS)

    Luo Yuetong; Long Pengcheng; Wu Guoyong; Zeng Qin; Hu Liqin; Zou Jun

    2010-01-01

    Post-processing is an important part of neutronics analysis, and SVIP-N 1.0 (scientific visualization integrated platform for neutronics analysis) is designed to ease the post-processing of neutronics analysis through visualization technologies. The main capabilities of SVIP-N 1.0 include: (1) the ability to manage neutronics analysis results; (2) the ability to preprocess neutronics analysis results; (3) the ability to visualize neutronics analysis result data in different ways. The paper describes the system architecture and main features of SVIP-N, some advanced visualization techniques used in SVIP-N 1.0, and some preliminary applications, such as ITER.

  14. Accelerator physics analysis with an integrated toolkit

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.; Satogata, T.

    1992-08-01

    Work is in progress on an integrated software toolkit for linear and nonlinear accelerator design, analysis, and simulation. As a first application, the ''beamline'' and ''MXYZPTLK'' (differential algebra) class libraries were used with an X Windows graphics library to build a user-friendly, interactive phase space tracker which, additionally, finds periodic orbits. This program was used to analyse a theoretical lattice containing octupoles and decapoles, to find the 20th-order stable and unstable periodic orbits, and to explore the local phase space structure
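    Periodic-orbit finding of the kind described reduces to root-finding on the map: a period-k orbit is a fixed point of the k-times-iterated one-turn map. A generic Newton sketch (using the Hénon map as a stand-in for a real lattice map; this is not the beamline/MXYZPTLK code, which obtains exact Jacobians via differential algebra rather than finite differences):

```python
import numpy as np

A, B = 1.4, 0.3  # classic Henon map parameters (illustrative stand-in lattice)

def henon(z):
    x, y = z
    return np.array([1.0 - A * x**2 + y, B * x])

def find_periodic_orbit(f, z0, period=1, tol=1e-12, max_iter=50):
    """Newton's method on F(z) = f^period(z) - z, with a finite-difference Jacobian."""
    z = np.asarray(z0, float)

    def fp(w):
        for _ in range(period):
            w = f(w)
        return w

    for _ in range(max_iter):
        F = fp(z) - z
        if np.linalg.norm(F) < tol:
            return z
        eps = 1e-7
        J = np.empty((2, 2))
        for j in range(2):
            dz = np.zeros(2); dz[j] = eps
            J[:, j] = (fp(z + dz) - (z + dz) - F) / eps  # column j of dF/dz
        z = z - np.linalg.solve(J, F)
    return z

zfix = find_periodic_orbit(henon, [0.5, 0.2])
print(zfix, henon(zfix) - zfix)  # residual is ~0 at the fixed point
```

    Higher-period orbits only change `period`; stability follows from the eigenvalues of the Jacobian of the iterated map at the orbit.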

  15. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data.

    Science.gov (United States)

    Tomescu, Oana A; Mattanovich, Diethard; Thallinger, Gerhard G

    2014-01-01

    Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis (CIA) is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition (GSVD) has shown its potential in the analysis of two transcriptome data sets. Integrative biclustering (IBC) applies biclustering to gene and protein data. Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms, in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life cycle stage specific GO terms: sporozoites are associated with transcription and transport; merozoites with entry into the host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages
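    At its core, co-inertia analysis is a singular value decomposition of the cross-covariance between two column-centered tables that share rows (here, the life cycle stages). The sketch below uses toy random data standing in for the mRNA/protein tables, and also computes the RV coefficient, the usual scalar summary of co-structure between two tables:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in data: 6 life-cycle stages x features.
X = rng.standard_normal((6, 50))                          # "mRNA" table
Y = 0.7 * X[:, :40] + 0.3 * rng.standard_normal((6, 40))  # correlated "protein" table

# Center each column, then take the SVD of the cross-covariance matrix.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)

# Project the stages onto the first two co-inertia axes of each table;
# matching configurations indicate shared structure between the data sets.
stage_scores_X = Xc @ U[:, :2]
stage_scores_Y = Yc @ Vt.T[:, :2]

# RV coefficient in [0, 1]: overall co-structure between the two tables.
rv = np.sum(s**2) / np.sqrt(np.sum((Xc.T @ Xc)**2) * np.sum((Yc.T @ Yc)**2))
print(round(rv, 3))
```

    Full CIA (as in the ade4 package) additionally weights each table by its own ordination; this unweighted version conveys the shared-axes idea.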

  16. LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory

    Science.gov (United States)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  17. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the maturity status of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  18. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the maturity status of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
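    The statistical analysis step of such a dimensional variation model can be illustrated with a simple Monte Carlo stack-up (plain Python rather than 3DCS; the dimension chain, distributions, and the 3 mm clearance requirement are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical radial assembly chain (mm): a nominal gap minus three
# stacked contributions, each with its own manufacturing/assembly tolerance.
nominal_gap = 5.0
part_a = rng.normal(0.0, 0.40, N)    # sector manufacturing deviation
part_b = rng.normal(0.0, 0.25, N)    # weld shrinkage deviation
part_c = rng.uniform(-0.5, 0.5, N)   # assembly positioning deviation

gap = nominal_gap - (part_a + part_b + part_c)

# Statistical assessment: predicted variation and the risk of violating
# a (hypothetical) 3 mm minimum-clearance requirement.
mean, std = gap.mean(), gap.std()
p_violation = np.mean(gap < 3.0)
print(f"gap = {mean:.2f} +/- {3*std:.2f} mm (3-sigma), P(gap < 3 mm) = {p_violation:.4f}")
```

    Real dimensional variation models chain thousands of such contributors through 3D assembly transforms, but the risk-assessment output, a predicted distribution plus a probability of non-conformance, has this same form.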

  19. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Science.gov (United States)

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
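    A common way to implement such a global sensitivity analysis is the variance-based (Sobol) pick-freeze estimator, sketched below on a toy stand-in model (not DRAINMOD-FOREST; parameters are assumed independent and uniform on [0, 1]):

```python
import numpy as np

def first_order_sobol(model, n_params, n=200_000, seed=0):
    """Pick-freeze estimator of first-order Sobol indices S_i = Var[E(Y|X_i)] / Var[Y]."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, n_params))
    B = rng.uniform(size=(n, n_params))
    yA, yB = model(A), model(B)
    var = yA.var()
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = B.copy()
        ABi[:, i] = A[:, i]  # freeze parameter i at its A-sample values
        S[i] = np.mean(yA * (model(ABi) - yB)) / var
    return S

# Toy stand-in model: output depends strongly on x0, weakly on x1, not at all on x2.
def toy(X):
    return 4.0 * X[:, 0] + 0.5 * X[:, 1]

S = first_order_sobol(toy, 3)
print(S.round(3))  # approximately [0.985, 0.015, 0.0]
```

    For an expensive simulator like DRAINMOD-FOREST, the same estimator is typically run on far fewer samples or on a fitted surrogate model, and total-order indices are computed alongside to capture interactions.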

  20. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles and their stacking, to generate finite element meshes, and to present analysis results in color. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  1. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires the storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types and required data manipulation capabilities, as well as operational requirements, resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility they provide for different data types and their extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to ensure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  2. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Science.gov (United States)

    2010-01-01

    ...; (iv) Potential accident sequences caused by process deviations or other events internal to the... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process... this safety program; namely, process safety information, integrated safety analysis, and management...

  3. Field Test and Evaluation of Engineered Biomineralization Technology for Sealing Existing Wells

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, Alfred [Montana State Univ., Bozeman, MT (United States)

    2015-12-21

    This research project addresses one of the goals of the U.S. Department of Energy (DOE) Carbon Storage Program (CSP) aimed at developing Advanced Wellbore Integrity Technologies to Ensure Permanent Geologic Carbon Storage. The technology field-tested in this research project is referred to as microbially induced calcite precipitation (MICP), which utilizes a biologically based process to precipitate calcium carbonate. If properly controlled, MICP can successfully seal fractures, high-permeability zones, and compromised wellbore cement in the vicinity of wellbores and in nearby caprock, thereby improving the storage security of geologically stored carbon dioxide. This report describes an MICP sealing field test performed on a 24.4 cm (9.625 inch) diameter well located at the Gorgas Steam Generation facility near Jasper, Alabama. The research was aimed at (1) developing methods for delivering MICP-promoting fluids downhole using conventional oil field technologies and (2) assessing the ability of MICP to seal cement and formation fractures in the near-wellbore region of a sandstone formation. Both objectives were accomplished during a field test performed April 1-11, 2014. The test resulted in complete biomineralization sealing of a horizontal fracture located 340.7 m (1118 feet) below ground surface. A total of 24 calcium injections and six microbial inoculation injections were required over a three-day period to achieve complete sealing. The fractured region was considered completely sealed when it was no longer possible to inject fluids into the formation without exceeding the initial formation fracture pressure. The test was accomplished using conventional oil field technology, including an 11.4 L (3.0 gallon) wireline dump bailer for injecting the biomineralization materials downhole. Metrics indicating successful MICP sealing included reduced injectivity during seal formation, reduction in pressure falloff, and

  4. Bidirectional Retroviral Integration Site PCR Methodology and Quantitative Data Analysis Workflow.

    Science.gov (United States)

    Suryawanshi, Gajendra W; Xu, Song; Xie, Yiming; Chou, Tom; Kim, Namshin; Chen, Irvin S Y; Kim, Sanggu

    2017-06-14

    Integration Site (IS) assays are a critical component of the study of retroviral integration sites and their biological significance. In recent retroviral gene therapy studies, IS assays, in combination with next-generation sequencing, have been used as a cell-tracking tool to characterize clonal stem cell populations sharing the same IS. For the accurate comparison of repopulating stem cell clones within and across different samples, the detection sensitivity, data reproducibility, and high-throughput capacity of the assay are among the most important assay qualities. This work provides a detailed protocol and data analysis workflow for bidirectional IS analysis. The bidirectional assay can simultaneously sequence both upstream and downstream vector-host junctions. Compared to conventional unidirectional IS sequencing approaches, the bidirectional approach significantly improves IS detection rates and the characterization of integration events at both ends of the target DNA. The data analysis pipeline described here accurately identifies and enumerates identical IS sequences through multiple steps of comparison that map IS sequences onto the reference genome and determine sequencing errors. Using an optimized assay procedure, we have recently published the detailed repopulation patterns of thousands of Hematopoietic Stem Cell (HSC) clones following transplant in rhesus macaques, demonstrating for the first time the precise time point of HSC repopulation and the functional heterogeneity of HSCs in the primate system. The following protocol describes the step-by-step experimental procedure and data analysis workflow that accurately identifies and quantifies identical IS sequences.

  5. Construction of an integrated database to support genomic sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, W.; Overbeek, R.

    1994-11-01

    The central goal of this project is to develop an integrated database to support comparative analysis of genomes including DNA sequence data, protein sequence data, gene expression data and metabolism data. In developing the logic-based system GenoBase, a broader integration of available data was achieved due to assistance from collaborators. Current goals are to easily include new forms of data as they become available and to easily navigate through the ensemble of objects described within the database. This report comments on progress made in these areas.

  6. Integrating computer aided radiography and plantar pressure measurements for complex gait analysis

    International Nuclear Information System (INIS)

    Gefen, A.; Megido-Ravid, M.; Itzchak, Y.; Arcan, M.

    1998-01-01

    Radiographic Fluoroscopy (DRF) and Contact Pressure Display (CPD). The CPD method uses a birefringent integrated optical sandwich for contact stress analysis, e.g. plantar pressure distribution. The DRF method displays and electronically records skeletal motion using X-ray radiation, providing the exact bone and joint positions during gait. By integrating the two techniques, the contribution of each segment to the HFS behavior may be studied by applying image processing and analysis techniques. The combined data may be used not only to detect and diagnose gait pathologies but also as a basis for the development of advanced numerical models of the foot structure.

  7. Integrated oncogeriatric approach: a systematic review of the literature using concept analysis.

    Science.gov (United States)

    Tremblay, Dominique; Charlebois, Kathleen; Terret, Catherine; Joannette, Sonia; Latreille, Jean

    2012-01-01

    The purpose of this study was to provide a more precise definition of an integrated oncogeriatric approach (IOGA) through concept analysis. The literature was reviewed from January 2005 to April 2011, integrating three broad terms: geriatric oncology, multidisciplinarity and integrated care delivery models. Citation selection was based on: (1) elderly cancer patients as the study population; (2) disease management; and (3) case studies, intervention studies, assessments and evaluations. Inclusion and exclusion criteria were refined in the course of the literature search. Initiatives in geriatric oncology that relate to oncology services, social support services and primary care services for elderly cancer patients were considered. The population comprised elderly cancer patients aged 70 years or more. Rodgers' concept analysis method was used for this study. The analysis was carried out as a thematic analysis based on the elements of the Chronic Care Model. The search identified 618 citations. After in-depth appraisal of 327 potential citations, 62 articles that met our inclusion criteria were included in the analysis. Three main IOGA attributes were identified, which constitute IOGA's core aspects: geriatric assessment (GA), comorbidity burden and treatment outcomes. The IOGA concept comprises two broad antecedents: coordinated healthcare delivery and primary supportive care services. Regarding the consequents of an integrated approach in geriatric oncology, the studies reviewed remain inconclusive. Our study highlights the pioneering character of the multidimensional IOGA concept, for which the relationship between clinical and organisational attributes, on the one hand, and contextual antecedents, on the other, is not well understood. We have yet to ascertain IOGA's consequents. IMPLICATIONS OF KEY FINDINGS: There is clearly a need for a whole-system approach to change that will provide direction for multilevel (clinical, organisational, strategic) interventions to support

  8. Structural integrity analysis of an INPP building under external loading

    International Nuclear Information System (INIS)

    Dundulis, G.; Karalevicius, R.; Uspuras, E.; Kulak, R.F.; Marchertas, A.

    2005-01-01

    After the terrorist attacks in New York and Washington, D.C. using civil airplanes, the evaluation of civil airplane crashes into civil and NPP structures has become very important. The interception of many terrorist communications reveals that the use of commandeered commercial aircraft is still a major part of their plans for destruction. An aircraft crash or other flying object in the territory of the Ignalina Nuclear Power Plant (INPP) represents a concern to the plant. Aircraft traveling at high velocity have a destructive potential. An aircraft crash may damage the roofs and walls of buildings, pipelines, electric motors, power supply cases, power transmission cables and other elements and systems that are important for safety. Therefore, the evaluation of the structural response to an aircraft crash is important and was selected for analysis. The structural integrity analysis of the effects of an aircraft crash on an NPP building structure is the subject of this paper. The finite element method was used for the structural analysis of a typical Ignalina NPP building. The structural integrity analysis was performed for a portion of the ALS using the dynamic loading of an aircraft crash impact model. The computer code NEPTUNE was used for this analysis. The local effects caused by the impact of the aircraft's engine on the building wall were evaluated independently by using an empirical formula. (authors)

  9. Multi-color fluorescent DNA analysis in an integrated optofluidic lab-on-a-chip

    OpenAIRE

    Dongre, C.; van Weerd, J.; van Weeghel, R.; Martinez-Vazquez, R.; Osellame, R.; Cerullo, G.; Besselink, G.A.J.; van den Vlekkert, H.H.; Hoekstra, Hugo; Pollnau, Markus

    2010-01-01

    Sorting and sizing of DNA molecules within the human genome project has enabled the genetic mapping of various illnesses. By employing tiny lab-on-a-chip devices for such DNA analysis, integrated DNA sequencing and genetic diagnostics have become feasible. However, such diagnostic chips typically lack integrated sensing capability. We address this issue by combining microfluidic capillary electrophoresis with laser-induced fluorescence detection resulting in optofluidic integration towards an...

  10. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
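
    The over-representation analysis mentioned above is, at its core, a hypergeometric tail test. Below is a minimal sketch (not MONGKIE's implementation; all gene counts are hypothetical) of how such a test scores a pathway against a list of selected genes:

```python
from math import comb

# Hypergeometric tail: probability of seeing k or more pathway genes among
# n selected genes, when the pathway has K genes out of N genes in total.
def hypergeom_tail(N, K, n, k):
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical numbers: 20000 genes, a 100-gene pathway, 200 selected genes,
# 5 of which fall in the pathway (expected overlap is only 200*100/20000 = 1).
p = hypergeom_tail(N=20000, K=100, n=200, k=5)
# A small p suggests the pathway is over-represented in the selection.
```

In practice a tool would repeat this for every pathway and correct the resulting p-values for multiple testing.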

  11. Canonical integration and analysis of periodic maps using non-standard analysis and Lie methods

    Energy Technology Data Exchange (ETDEWEB)

    Forest, E.; Berz, M.

    1988-06-01

    We describe a method and a way of thinking which is ideally suited for the study of systems represented by canonical integrators. Starting with the continuous description provided by the Hamiltonians, we replace it by a succession of preferably canonical maps. The power series representation of these maps can be extracted with a computer implementation of the tools of Non-Standard Analysis and analyzed by the same tools. For a nearly integrable system, we can define a Floquet ring in a way consistent with our needs. Using the finite time maps, the Floquet ring is defined only at the locations s_i where one perturbs or observes the phase space. At most, the total number of locations is equal to the total number of steps of our integrator. We can also produce pseudo-Hamiltonians which describe the motion induced by these maps. 15 refs., 1 fig.
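
    The idea of replacing a continuous Hamiltonian flow by a succession of canonical maps can be illustrated with the simplest possible case. The sketch below (an illustration of the general principle, not the authors' code) applies a symplectic-Euler map to the harmonic oscillator H = (p^2 + q^2)/2; because each step is a canonical map, the energy error stays bounded over many steps instead of drifting:

```python
def symplectic_euler_map(q, p, h):
    """One canonical map step (drift then kick); preserves phase-space area."""
    q_new = q + h * p
    p_new = p - h * q_new
    return q_new, p_new

q, p, h = 1.0, 0.0, 0.01   # initial energy is 0.5
energies = []
for _ in range(10000):     # integrate to t = 100
    q, p = symplectic_euler_map(q, p, h)
    energies.append(0.5 * (q * q + p * p))

# The energy oscillates within an O(h) band around 0.5 rather than drifting.
drift = max(energies) - min(energies)
```

A non-canonical map of the same order (e.g. forward Euler) would instead show a secular growth of the energy.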

  12. EUROPEAN INTEGRATION: A MULTILEVEL PROCESS THAT REQUIRES A MULTILEVEL STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia HRITCU

    2015-11-01

    Full Text Available Involving a process of market regulation, a system of multi-level governance and several supranational, national and subnational levels of decision making, European integration is a multilevel phenomenon. The individual characteristics of citizens, as well as the environment where the integration process takes place, are important. To understand European integration and its consequences it is important to develop and test multilevel theories that consider individual-level characteristics, as well as the overall context where individuals act and express their characteristics. A central argument of this paper is that support for European integration is influenced by factors operating at different levels. We review and present theories and related research on the use of multilevel analysis in the European area. This paper draws on insights about various aspects and consequences of European integration to take stock of what we know about how and why to use multilevel modeling.
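
    A standard first check before fitting such multilevel models is the intraclass correlation (ICC): the share of variance in individual responses attributable to the group (e.g. country) level. The sketch below (hypothetical support scores, not data from the paper) computes the one-way ANOVA estimate of the ICC:

```python
from statistics import mean

# Hypothetical support-for-integration scores, grouped by country.
scores_by_country = {
    "A": [6.0, 6.5, 7.0, 6.2, 6.8],
    "B": [4.0, 4.4, 3.9, 4.2, 4.5],
    "C": [5.1, 5.0, 5.4, 4.9, 5.2],
}

groups = list(scores_by_country.values())
n = len(groups[0])                          # equal group sizes for simplicity
grand = mean(v for g in groups for v in g)

# One-way ANOVA components: between-group and within-group mean squares.
ms_between = n * sum((mean(g) - grand) ** 2 for g in groups) / (len(groups) - 1)
ms_within = sum((v - mean(g)) ** 2 for g in groups for v in g) / (
    len(groups) * (n - 1))

icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)
# A high ICC means responses cluster by country, so a single-level regression
# would understate standard errors and a multilevel model is warranted.
```

With clustered data like this the ICC is close to 1; values near 0 would indicate that the group level adds little.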

  13. Integrated data analysis of fusion diagnostics by means of the Bayesian probability theory

    International Nuclear Information System (INIS)

    Fischer, R.; Dinklage, A.

    2004-01-01

    Integrated data analysis (IDA) of fusion diagnostics is the combination of heterogeneous diagnostics to obtain validated physical results. Benefits from the integrated approach result from a systematic use of interdependencies; in that sense IDA optimizes the extraction of information from sets of different data. For that purpose IDA requires a systematic and formalized error analysis of all (statistical and systematic) uncertainties involved in each diagnostic. Bayesian probability theory allows for a systematic combination of all information entering the diagnostic model by considering all uncertainties of the measured data, the calibration measurements, and the physical model. Prior physics knowledge on model parameters can be included. Handling of systematic errors is provided. A central goal of the integration of redundant or complementary diagnostics is to provide information to resolve inconsistencies by exploiting interdependencies. A comparable analysis of sets of diagnostics (meta-diagnostics) is performed by combining statistical and systematic uncertainties with model parameters and model uncertainties. Diagnostics improvement, experimental optimization, and the design of meta-diagnostics will be discussed.
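
    The core Bayesian idea behind combining redundant diagnostics can be sketched in its simplest form: with Gaussian likelihoods and a flat prior, the posterior for a common quantity pools the measurements with inverse-variance weights. The values below are hypothetical and the sketch is a toy illustration, not the paper's framework:

```python
def combine(measurements):
    """measurements: list of (value, sigma) from independent diagnostics.
    Returns the posterior mean and standard deviation under Gaussian
    likelihoods and a flat prior (inverse-variance weighting)."""
    weights = [1.0 / (s * s) for _, s in measurements]
    total = sum(weights)
    mu = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return mu, (1.0 / total) ** 0.5

# e.g. two diagnostics measuring the same quantity (arbitrary units)
mu, sigma = combine([(3.2, 0.4), (2.9, 0.2)])
# The pooled estimate is pulled toward the more precise diagnostic, and its
# uncertainty is smaller than either input's.
```

A full IDA treatment adds systematic-error parameters and physics priors to this same posterior, rather than weighting values directly.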

  14. Exergy analysis of components of integrated wind energy / hydrogen / fuel cell

    International Nuclear Information System (INIS)

    Hernandez Galvez, G.; Pathiyamattom, J.S.; Sanchez Gamboa, S.

    2009-01-01

    An exergy analysis is made of three components of an integrated wind energy / hydrogen / fuel cell system: the wind turbine, the fuel cell (PEMFC) and the electrolyzer (PEM). The methodology is used to assess how parameters such as temperature, operating pressure and membrane thickness affect the second-law efficiency of the electrolyzer and the fuel cell. Methods are developed to evaluate the influence of changes in air density and tower height on the second-law efficiency of the turbine. This work represents a starting point for developing a global availability analysis of an integrated wind / hydrogen / fuel cell system, which can be used as a tool to achieve its optimum design. The use of this system contributes to protecting the environment.

  15. Integrated In Silico Analysis of Pathway Designs for Synthetic Photo-Electro-Autotrophy.

    Directory of Open Access Journals (Sweden)

    Michael Volpers

    Full Text Available The strong advances in synthetic biology enable the engineering of novel functions and complex biological features in unprecedented ways, such as implementing synthetic autotrophic metabolism into heterotrophic hosts. A key challenge for the sustainable production of fuels and chemicals entails the engineering of synthetic autotrophic organisms that can effectively and efficiently fix carbon dioxide by using sustainable energy sources. This challenge involves the integration of carbon fixation and energy uptake systems. A variety of carbon fixation pathways and several types of photosystems and other energy uptake systems can be chosen and, potentially, modularly combined to design synthetic autotrophic metabolism. Prior to implementation, these designs can be evaluated by the combination of several computational pathway analysis techniques. Here we present a systematic, integrated in silico analysis of photo-electro-autotrophic pathway designs, consisting of natural and synthetic carbon fixation pathways, a proton-pumping rhodopsin photosystem for ATP regeneration and an electron uptake pathway. We integrated Flux Balance Analysis of the heterotrophic chassis Escherichia coli with kinetic pathway analysis and thermodynamic pathway analysis (Max-min Driving Force). The photo-electro-autotrophic designs are predicted to have a limited potential for anaerobic, autotrophic growth of E. coli, given the relatively low ATP regenerating capacity of the proton-pumping rhodopsin photosystems and the high ATP maintenance of E. coli. If these factors can be tackled, our analysis indicates the highest growth potential for the natural reductive tricarboxylic acid cycle and the synthetic pyruvate synthase-pyruvate carboxylate-glyoxylate bicycle. Both carbon fixation cycles are very ATP efficient, while maintaining fast kinetics, which also results in relatively low estimated protein costs for these pathways. Furthermore, the synthetic bicycles are highly

  16. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.
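
    The simplest of the correlation-based integration approaches surveyed above pairs each microbial taxon with each metabolite and computes a correlation across samples. A toy sketch (abundance values are hypothetical; real studies add compositional corrections and multiple-testing adjustment):

```python
from statistics import mean

# Hypothetical measurements across five samples.
taxon = [0.10, 0.25, 0.40, 0.55, 0.70]   # relative abundance of one taxon
metabolite = [1.2, 2.0, 3.1, 4.2, 5.0]   # concentration of one metabolite

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(taxon, metabolite)
# A strong positive r flags this taxon-metabolite pair for follow-up,
# e.g. with the network-based modeling approaches discussed in the review.
```

Repeating this over all taxon-metabolite pairs yields the correlation matrix that network-based methods then refine.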

  17. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out whether they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and design...

  18. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    Science.gov (United States)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  19. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  20. Multi-color fluorescent DNA analysis in an integrated optofluidic lab on a chip

    OpenAIRE

    Dongre, C.

    2010-01-01

    Abstract: Sorting and sizing of DNA molecules within the human genome project has enabled the genetic mapping of various illnesses. Furthermore, by employing tiny lab-on-a-chip devices, integrated DNA sequencing and genetic diagnostics have become feasible. We present the combination of capillary electrophoresis with laser-induced fluorescence for optofluidic integration toward an on-chip bio-analysis tool. Integrated optical fluorescence excitation allows for a high spatial resolution (12 μm) ...

  1. Experimental Study of Cement - Sandstone/Shale - Brine - CO2 Interactions.

    Science.gov (United States)

    Carroll, Susan A; McNab, Walt W; Torres, Sharon C

    2011-11-11

    Reactive-transport simulation is a tool that is being used to estimate long-term trapping of CO2, and wellbore and cap rock integrity, for geologic CO2 storage. We reacted end-member components of the heterolithic sandstone and shale unit that forms the upper section of the In Salah Gas Project carbon storage reservoir in Krechba, Algeria, with supercritical CO2 and brine, with and without cement, at reservoir conditions to develop experimentally constrained geochemical models for use in reactive transport simulations. We observe marked changes in solution composition when CO2 reacted with cement, sandstone, and shale components at reservoir conditions. The geochemical model for the reaction of sandstone and shale with CO2 and brine is a simple one in which albite, chlorite, illite and carbonate minerals partially dissolve and boehmite, smectite, and amorphous silica precipitate. The geochemical model for the wellbore environment is also fairly simple, in which alkaline cements and rock react with CO2-rich brines to form an Fe-containing calcite, amorphous silica, smectite and boehmite or amorphous Al(OH)3. Our research shows that relatively simple geochemical models can describe the dominant reactions that are likely to occur when CO2 is stored in deep saline aquifers sealed with overlying shale cap rocks, as well as the dominant reactions for cement carbonation at the wellbore interface.

  2. Interpretation of horizontal well performance in complicated systems by the boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Jongkittinarukorn, K.; Tiab, D. [Oklahoma Univ., School of Petroleum and Geological Engineering (United States); Escobar, F. H. [Surcolombiana Univ., Dept. of Petroleum Engineering (Colombia)

    1998-12-31

    A solution obtained by using the boundary element method to simulate the pressure behaviour of horizontal wells in complicated reservoir-wellbore configurations is presented. Three different types of wellbore and reservoir models were studied, i.e. a snake-shaped horizontal wellbore intersecting a two-layer reservoir with cross flow, a horizontal well in a three-layer reservoir with cross flow, and a vertical well intersecting a two-layer reservoir without cross flow. In each case, special attention was paid to the influence of the wellbore inclination angle, the distance from the wellbore to the different boundaries, and the permeability ratio. The performance of each of these types of wells is discussed. 9 refs., 18 figs.

  3. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    Science.gov (United States)

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI-embedded solutions, they do not provide direct support for various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-coloured cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. Gene ARMADA provides a

  4. Transient pressure and productivity analysis in carbonate geothermal reservoirs with changing external boundary flux

    Directory of Open Access Journals (Sweden)

    Wang Dongying

    2017-01-01

    Full Text Available In this paper, a triple-medium flow model for carbonate geothermal reservoirs with an exponential external boundary flux is established. The pressure solution under constant-production conditions is solved in Laplace space. The geothermal wellbore pressure change, considering wellbore storage and the skin factor, is obtained by Stehfest numerical inversion. Well test interpretation charts and a Fetkovich production decline chart for carbonate geothermal reservoirs are proposed for the first time. The proposed Fetkovich production decline curves are applied to analyze the production decline behavior. The results indicate that in carbonate geothermal reservoirs with an exponential external boundary flux, the pressure derivative curve contains a triple dip, which represents the interporosity flow between the vugs or matrix and the fracture system and the invading flow of the external boundary flux. Both the interporosity flow in carbonate geothermal reservoirs and the changing external boundary flux can slow down production decline, and the same variation tendency is observed in the Fetkovich production decline curves.
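
    The Stehfest inversion step used above is a standard algorithm: it approximates f(t) from a Laplace-space solution F(s) as a weighted sum of samples F(i ln 2 / t). A generic sketch (not the paper's code), verified on a transform with a known inverse:

```python
from math import factorial, log, exp

def stehfest_coeffs(N):
    """Stehfest weights V_i for even N."""
    V = []
    for i in range(1, N + 1):
        s = 0.0
        for k in range((i + 1) // 2, min(i, N // 2) + 1):
            s += (k ** (N // 2) * factorial(2 * k)
                  / (factorial(N // 2 - k) * factorial(k)
                     * factorial(k - 1) * factorial(i - k)
                     * factorial(2 * k - i)))
        V.append((-1) ** (N // 2 + i) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) = L^{-1}[F](t) by the Stehfest sum."""
    ln2 = log(2.0)
    V = stehfest_coeffs(N)
    return (ln2 / t) * sum(V[i - 1] * F(i * ln2 / t) for i in range(1, N + 1))

# Check on F(s) = 1/(s + 1), whose exact inverse is f(t) = exp(-t).
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
```

In a well-test context F would be the dimensionless wellbore-pressure solution in Laplace space, sampled at each time of interest; N around 8-16 is typical, since larger N amplifies round-off in the alternating weights.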

  5. Integration of facility modeling capabilities for nuclear nonproliferation analysis

    International Nuclear Information System (INIS)

    Burr, Tom; Gorensek, M.B.; Krebs, John; Kress, Reid L.; Lamberti, Vincent; Schoenwald, David; Ward, Richard C.

    2012-01-01

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  6. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module that utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists almost entirely of LP generation and multicycle analysis. (author)
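The genetic-algorithm idea behind a loading-pattern optimizer like GALLOP can be sketched on a toy problem. Everything below is invented for illustration (the 8-assembly reactivity list, the flat-power fitness proxy, and all parameter choices); the real module handles far richer constraints and hybridizes the GA with heuristics:

```python
import random

random.seed(0)

# Toy stand-in: a loading pattern is a permutation of 8 assembly "reactivities";
# we reward flat power (small neighbour-to-neighbour differences).
REACTIVITIES = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7]

def fitness(pattern):
    # negative sum of squared neighbour differences: flatter is better
    return -sum((pattern[i] - pattern[i + 1]) ** 2 for i in range(len(pattern) - 1))

def crossover(a, b):
    # order crossover (OX): the child is always a valid permutation
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [x for x in b if x not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(p, rate=0.2):
    p = p[:]
    if random.random() < rate:
        i, j = random.sample(range(len(p)), 2)
        p[i], p[j] = p[j], p[i]
    return p

pop = [random.sample(REACTIVITIES, len(REACTIVITIES)) for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                       # elitism keeps the best patterns
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(20)]

best = max(pop, key=fitness)
```

Permutation-safe crossover is the essential ingredient: naive bit-string crossover would duplicate or drop assemblies, producing physically meaningless loading patterns.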

  7. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

    Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both the accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem in order to: 1) demonstrate that it eliminates the present costly process of evaluating time integrators for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology to examine the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)
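The step-size/stability interaction the paper evaluates can be illustrated on the scalar stiff test equation y' = -λy, where for a nonlinear system λ plays the role of a local Jacobian eigenvalue. This is a generic textbook demonstration (not the paper's evaluation technique): an explicit integrator is only conditionally stable, while an implicit one decays for any step size.

```python
def forward_euler(lam, h, steps, y0=1.0):
    # explicit update y_{n+1} = (1 - h*lam) * y_n: stable only if |1 - h*lam| <= 1
    y = y0
    for _ in range(steps):
        y += h * (-lam * y)
    return y

def backward_euler(lam, h, steps, y0=1.0):
    # implicit update y_{n+1} = y_n / (1 + h*lam): stable for any h > 0
    y = y0
    for _ in range(steps):
        y = y / (1.0 + h * lam)
    return y

lam, h, steps = 100.0, 0.05, 40   # h*lam = 5: outside the explicit stability region
explicit = forward_euler(lam, h, steps)
implicit = backward_euler(lam, h, steps)
# the explicit solution grows without bound; the implicit one decays toward 0
```

For a nonlinear structure the effective λ changes as the response evolves, which is exactly why the paper argues that step size and nonlinearity must be assessed together rather than by fixed linear stability limits.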

  8. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Full Text Available Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, whose purpose is to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, physical reasonableness, and reliability of PCA in dealing with redundant objectives are verified by the typical DTLZ5 test function and by multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into the aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth, and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is effective and that the multi-point design requirements of the passenger aircraft are met. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
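A minimal sketch of how PCA flags a redundant objective, using a synthetic design population (the objective values and their perfect correlation are fabricated for illustration; the paper applies the idea to supercritical-airfoil objectives):

```python
import numpy as np

rng = np.random.default_rng(0)

# 200 candidate designs scored on 3 objectives; the third is a linear
# function of the first, i.e. fully redundant (a toy stand-in for, say,
# drag evaluated at two nearly identical flight conditions).
f1 = rng.normal(size=200)
f2 = rng.normal(size=200)
f3 = 2.0 * f1 + 1.0
F = np.column_stack([f1, f2, f3])

# PCA on the standardized objective matrix
Z = (F - F.mean(axis=0)) / F.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # descending
explained = eigvals / eigvals.sum()
# two principal components explain essentially all the variance,
# so one of the three objectives can be dropped or turned into a constraint
```

In the paper's workflow this is the signal used to demote a redundant objective to an optimization constraint, shrinking the dimension of the Pareto front that MOPSO must resolve.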

  9. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    Science.gov (United States)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  10. Analysis and Modeling of Integrated Magnetics for LLC resonant Converters

    DEFF Research Database (Denmark)

    Li, Mingxiao; Ouyang, Ziwei; Zhao, Bin

    2017-01-01

    Shunt-inserted transformers are widely used to obtain high leakage inductance. This paper investigates this method in depth to make it applicable to integrating the resonant inductor for LLC resonant converters. The analysis and model of magnetizing inductance and leakage inductance for shunt...... transformers can provide a significant difference. The way to obtain the desirable magnetizing and leakage inductance values for LLC resonant converters is simplified by the creation of air gaps together with a magnetic shunt. The calculation and relation are validated by finite element analysis (FEA) simulations...

  11. HTGR-Integrated Coal To Liquids Production Analysis

    International Nuclear Information System (INIS)

    Gandrik, Anastasia M.; Wood, Rick A.

    2010-01-01

    As part of the DOE's Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to 'shift' the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700 C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: (1) 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. 
When compared to conventional CTL production, nuclear integration decreases coal consumption by 66

  12. HTGR-INTEGRATED COAL TO LIQUIDS PRODUCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Anastasia M Gandrik; Rick A Wood

    2010-10-01

    As part of the DOE’s Idaho National Laboratory (INL) nuclear energy development mission, the INL is leading a program to develop and design a high temperature gas-cooled reactor (HTGR), which has been selected as the base design for the Next Generation Nuclear Plant. Because an HTGR operates at a higher temperature, it can provide higher temperature process heat, more closely matched to chemical process temperatures, than a conventional light water reactor. Integrating HTGRs into conventional industrial processes would increase U.S. energy security and potentially reduce greenhouse gas emissions (GHG), particularly CO2. This paper focuses on the integration of HTGRs into a coal to liquids (CTL) process, for the production of synthetic diesel fuel, naphtha, and liquefied petroleum gas (LPG). The plant models for the CTL processes were developed using Aspen Plus. The models were constructed with plant production capacity set at 50,000 barrels per day of liquid products. Analysis of the conventional CTL case indicated a potential need for hydrogen supplementation from high temperature steam electrolysis (HTSE), with heat and power supplied by the HTGR. By supplementing the process with an external hydrogen source, the need to “shift” the syngas using conventional water-gas shift reactors was eliminated. HTGR electrical power generation efficiency was set at 40%, a reactor size of 600 MWth was specified, and it was assumed that heat in the form of hot helium could be delivered at a maximum temperature of 700°C to the processes. Results from the Aspen Plus model were used to perform a preliminary economic analysis and a life cycle emissions assessment. The following conclusions were drawn when evaluating the nuclear assisted CTL process against the conventional process: • 11 HTGRs (600 MWth each) are required to support production of a 50,000 barrel per day CTL facility. When compared to conventional CTL production, nuclear integration decreases coal

  13. Non-ferromagnetic overburden casing

    Science.gov (United States)

    Vinegar, Harold J.; Harris, Christopher Kelvin; Mason, Stanley Leroy

    2010-09-14

    Systems, methods, and heaters for treating a subsurface formation are described herein. At least one system for electrically insulating an overburden portion of a heater wellbore is described. The system may include a heater wellbore located in a subsurface formation and an electrically insulating casing located in the overburden portion of the heater wellbore. The casing may include at least one non-ferromagnetic material such that ferromagnetic effects are inhibited in the casing.

  14. Integrative analysis of the mitochondrial proteome in yeast.

    Directory of Open Access Journals (Sweden)

    Holger Prokisch

    2004-06-01

    Full Text Available In this study yeast mitochondria were used as a model system to apply, evaluate, and integrate different genomic approaches to define the proteins of an organelle. Liquid chromatography mass spectrometry applied to purified mitochondria identified 546 proteins. By expression analysis and comparison to other proteome studies, we demonstrate that the proteomic approach identifies primarily highly abundant proteins. By expanding our evaluation to other types of genomic approaches, including systematic deletion phenotype screening, expression profiling, subcellular localization studies, protein interaction analyses, and computational predictions, we show that an integration of approaches moves beyond the limitations of any single approach. We report the success of each approach by benchmarking it against a reference set of known mitochondrial proteins, and predict approximately 700 proteins associated with the mitochondrial organelle from the integration of 22 datasets. We show that a combination of complementary approaches like deletion phenotype screening and mass spectrometry can identify over 75% of the known mitochondrial proteome. These findings have implications for choosing optimal genome-wide approaches for the study of other cellular systems, including organelles and pathways in various species. Furthermore, our systematic identification of genes involved in mitochondrial function and biogenesis in yeast expands the candidate genes available for mapping Mendelian and complex mitochondrial disorders in humans.
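The benchmarking idea in the abstract, scoring each genomic approach and their integration against a reference set of known mitochondrial proteins, can be sketched with toy sets. The protein names and per-approach hit lists below are invented for illustration:

```python
# Reference set of "known" mitochondrial proteins (hypothetical names)
reference = {"ATP1", "COX4", "TIM23", "TOM20", "MDH1", "CS1"}

# Each approach returns its own candidate set, with characteristic biases:
mass_spec    = {"ATP1", "COX4", "MDH1", "HSP70"}   # favours abundant proteins
deletion     = {"TIM23", "TOM20", "CS1", "ACT1"}   # deletion phenotype screen
localization = {"ATP1", "TIM23", "PGK1"}           # GFP localization study

def recall(predicted, reference):
    # fraction of the reference set recovered by a prediction
    return len(predicted & reference) / len(reference)

combined = mass_spec | deletion | localization
# each single approach recovers only part of the reference set;
# in this toy example their union recovers all of it
```

This mirrors the paper's finding that complementary approaches (e.g. deletion screening plus mass spectrometry) together recover a much larger fraction of the known proteome than any single method.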

  15. NEW CORPORATE REPORTING TRENDS. ANALYSIS ON THE EVOLUTION OF INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Dragu Ioana

    2013-07-01

    Full Text Available The objective of this paper is to present the new corporate reporting trends of the 21st century. Integrated reporting has been launched through a common initiative of the International Integrated Reporting Committee and global accounting organizations. However, the history of integrated reports starts before the initiative of the IIRC, going back to when large corporations began to disclose sustainability and corporate social responsibility information. Further on, we claim that the initial sustainability and CSR reports that were issued separately alongside the annual financial report represent the predecessors of the current integrated reports. The paper consists of a literature review analysis of the evolution of integrated reporting, from the first stage of international non-financial initiatives up to the current state of a single integrated annual report. In order to understand the background of integrated reporting we analyze the most relevant research papers on corporate reporting, focusing on the international organizations’ perspective on non-financial reporting in general, and integrated reporting in particular. Based on the literature overview, we extracted the essential information for setting the framework of the integrated reporting evolution. The findings suggest that we can delineate three main stages in the evolution of integrated reports, namely: the non-financial reporting initiatives, the sustainability era, and the revolution of integrated reporting. We illustrate these results by presenting each relevant point in the history of integrated reporting on a time scale axis, developed with the purpose of defining the road to integrated reporting at theoretical, empirical, and practical levels. We consider the current investigation relevant for future studies concerning integrated reports, as this is a new area of research still in its infancy. The originality of the research derives from the novelty of

  16. Monitoring and/or Detection of Wellbore Leakage In Energy Storage Wells

    Science.gov (United States)

    Ratigan, J.

    2017-12-01

    Energy (compressed natural gas, crude oil, NGL, and LPG) storage wells in solution-mined caverns in salt formations are required to be tested for integrity every five years. Rules promulgated for such testing typically assume the cavern interval in the salt formation is inherently impermeable, even though some experience demonstrates that this is not always the case. A protocol for testing the cavern impermeable hypothesis should be developed. The description for the integrity test of the "well" component of the well and cavern storage system was developed more than 30 years ago. However, some of the implicit assumptions inherent to the decades-old well test protocol are no longer applicable to the large diameter, high flow rate wells commonly constructed today. More detailed test protocols are necessary for the more contemporary energy storage wells.

  17. CQUESTRA, a risk and performance assessment code for geological sequestration of carbon dioxide

    International Nuclear Information System (INIS)

    LeNeveu, D.M.

    2008-01-01

    A computationally efficient semi-analytical code, CQUESTRA, has been developed for probabilistic risk assessment and rapid screening of potential sites for geological sequestration of carbon dioxide. The rate of dissolution and leakage from a trapped underground pool of carbon dioxide is determined. The trapped carbon dioxide could be mixed with hydrocarbons and other components to form a buoyant phase. The program considers potential mechanisms for escape from the geological formations such as the movement of the buoyant phase through failed seals in wellbores, the annulus around wellbores and through open fractures in the caprock. Plume animations of dissolved carbon dioxide in formation water around the wellbores are provided. Solubility, density and viscosity of the buoyant phase are determined by equations of state. Advection, dispersion, diffusion, buoyancy, aquifer flow rates and local formation fluid pressure are taken into account in the modeling of the carbon dioxide movement. Results from a hypothetical example simulation based on data from the Williston basin near Weyburn, Saskatchewan, indicate that this site is potentially a viable candidate for carbon dioxide sequestration. Sensitivity analysis of CQUESTRA indicates that criteria such as siting below aquifers with large flow rates and siting in reservoirs having fluid pressure below the pressure of the formations above can promote complete dissolution of the carbon dioxide during movement toward the surface, thereby preventing release to the biosphere. Formation of very small carbon dioxide bubbles within the fluid in the wellbores can also lead to complete dissolution

  18. Man-Machine Integrated Design and Analysis System (MIDAS): Functional Overview

    Science.gov (United States)

    Corker, Kevin; Neukom, Christian

    1998-01-01

    This series of screen print-outs illustrates the structure and function of the Man-Machine Integrated Design and Analysis System (MIDAS). Views of the use of the system and its editors are featured. The use case in this set of graphs includes the development of a simulation scenario.

  19. Statistical Analysis of CO2 Exposed Wells to Predict Long Term Leakage through the Development of an Integrated Neural-Genetic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Boyun [Univ. of Louisiana, Lafayette, LA (United States); Duguid, Andrew [Battelle, Columbus, OH (United States); Nygaard, Ronar [Missouri Univ. of Science and Technology, Rolla, MO (United States)

    2017-08-05

    The objective of this project is to develop a computerized statistical model with the Integrated Neural-Genetic Algorithm (INGA) for predicting the probability of long-term leakage of wells in CO2 sequestration operations. This objective has been accomplished by conducting research in three phases: 1) data mining of CO2-exposed wells, 2) INGA computer model development, and 3) evaluation of the predictive performance of the computer model with data from field tests. Data mining was conducted for 510 wells in two CO2 sequestration projects in the Texas Gulf Coast region: the Hasting West field and the Oyster Bayou field in southern Texas. Missing wellbore integrity data were estimated using an analytical and Finite Element Method (FEM) model. The INGA was first tested for convergence and computing efficiency with the obtained data set of high dimension. It was concluded that the INGA can handle the gathered data set with good accuracy and reasonable computing time after a reduction of dimension with a grouping mechanism. A computerized statistical model with the INGA was then developed based on data pre-processing and grouping. Comprehensive training and testing of the model were carried out to ensure that the model is accurate and efficient enough for predicting the probability of long-term leakage of wells in CO2 sequestration operations. The Cranfield site in southern Mississippi was selected as the test site. Observation wells CFU31F2 and CFU31F3 were used for pressure-testing, formation-logging, and cement-sampling. Tools run in the wells include the Isolation Scanner, Slim Cement Mapping Tool (SCMT), Cased Hole Formation Dynamics Tester (CHDT), and Mechanical Sidewall Coring Tool (MSCT). Analyses of the obtained data indicate no leakage of CO2 across the cap zone, while it is evident that the well cement sheath was invaded by CO2 from the storage zone. This observation is consistent

  20. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  1. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...
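The moment analysis AMIC performs in analog hardware can be sketched numerically on a sampled 1-D light distribution (the sample values are invented for illustration; the ASIC computes analogous quantities from detector currents, and a real PET detector would use a 2-D distribution):

```python
def moments(x, intensity):
    # raw and central moments of a sampled 1-D light distribution
    m0 = sum(intensity)                                              # total energy
    mean = sum(xi * ii for xi, ii in zip(x, intensity)) / m0         # centroid (position)
    var = sum((xi - mean) ** 2 * ii for xi, ii in zip(x, intensity)) / m0
    skew = (sum((xi - mean) ** 3 * ii
                for xi, ii in zip(x, intensity)) / m0) / var ** 1.5  # asymmetry
    return m0, mean, var, skew

# symmetric triangular distribution centred on x = 2 (hypothetical samples)
x = [0, 1, 2, 3, 4]
intensity = [1, 2, 4, 2, 1]
energy, position, variance, skewness = moments(x, intensity)
# energy = 10, position = 2.0, skewness = 0 (the distribution is symmetric)
```

A nonzero skewness would indicate the "deformation due to border effect" the abstract mentions, i.e. light truncated at the crystal edge.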

  2. Application of Sensitivity Analysis in Design of Integrated Building Concepts

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik; Hesselholt, Allan Tind

    2007-01-01

    analysis makes it possible to identify the most important parameters in relation to building performance and to focus design and optimization of integrated building concepts on these fewer, but most important parameters. The sensitivity analyses will typically be performed at a reasonably early stage...... the design requirements and objectives. In the design of integrated building concepts it is beneficial to identify the most important design parameters in order to more efficiently develop alternative design solutions or more efficiently perform an optimization of the building performance. The sensitivity...

  3. Metabolome Integrated Analysis of High-Temperature Response in Pinus radiata

    Directory of Open Access Journals (Sweden)

    Mónica Escandón

    2018-04-01

    Full Text Available The integrative omics approach is crucial to identify the molecular mechanisms underlying high-temperature response in non-model species. Based on future scenarios of heat increase, Pinus radiata plants were exposed to a temperature of 40°C for a period of 5 days, including recovered plants (30 days after last exposure to 40°C in the analysis. The analysis of the metabolome using complementary mass spectrometry techniques (GC-MS and LC-Orbitrap-MS allowed the reliable quantification of 2,287 metabolites. The analysis of identified metabolites and highlighter metabolic pathways across heat time exposure reveal the dynamism of the metabolome in relation to high-temperature response in P. radiata, identifying the existence of a turning point (on day 3 at which P. radiata plants changed from an initial stress response program (shorter-term response to an acclimation one (longer-term response. Furthermore, the integration of metabolome and physiological measurements, which cover from the photosynthetic state to hormonal profile, suggests a complex metabolic pathway interaction network related to heat-stress response. Cytokinins (CKs, fatty acid metabolism and flavonoid and terpenoid biosynthesis were revealed as the most important pathways involved in heat-stress response in P. radiata, with zeatin riboside (ZR and isopentenyl adenosine (iPA as the key hormones coordinating these multiple and complex interactions. On the other hand, the integrative approach allowed elucidation of crucial metabolic mechanisms involved in heat response in P. radiata, as well as the identification of thermotolerance metabolic biomarkers (L-phenylalanine, hexadecanoic acid, and dihydromyricetin, crucial metabolites which can reschedule the metabolic strategy to adapt to high temperature.

  4. The Integrated Microbial Genomes (IMG) System: An Expanding Comparative Analysis Resource

    Energy Technology Data Exchange (ETDEWEB)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Anderson, Iain; Lykidis, Athanasios; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2009-09-13

    The integrated microbial genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG contains both draft and complete microbial genomes integrated with other publicly available genomes from all three domains of life, together with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. Since its first release in 2005, IMG's data content and analytical capabilities have been constantly expanded through regular releases. Several companion IMG systems have been set up in order to serve domain specific needs, such as expert review of genome annotations. IMG is available at .

  5. Nanoscale Chemical Processes Affecting Storage Capacities and Seals during Geologic CO2 Sequestration.

    Science.gov (United States)

    Jun, Young-Shin; Zhang, Lijie; Min, Yujia; Li, Qingyun

    2017-07-18

    Geologic CO2 sequestration (GCS) is a promising strategy to mitigate anthropogenic CO2 emission to the atmosphere. Suitable geologic storage sites should have a porous reservoir rock zone where injected CO2 can displace brine and be stored in pores, and an impermeable zone on top of reservoir rocks to hinder upward movement of buoyant CO2. The injection wells (steel casings encased in concrete) pass through these geologic zones and lead CO2 to the desired zones. In subsurface environments, CO2 is reactive as both a supercritical (sc) phase and aqueous (aq) species. Its nanoscale chemical reactions with geomedia and wellbores are closely related to the safety and efficiency of CO2 storage. For example, the injection pressure is determined by the wettability and permeability of geomedia, which can be sensitive to nanoscale mineral-fluid interactions; the sealing safety of the injection sites is affected by the opening and closing of fractures in caprocks and the alteration of wellbore integrity caused by nanoscale chemical reactions; and the time scale for CO2 mineralization is also largely dependent on the chemical reactivities of the reservoir rocks. Therefore, nanoscale chemical processes can influence the hydrogeological and mechanical properties of geomedia, such as their wettability, permeability, mechanical strength, and fracturing. This Account reviews our group's work on nanoscale chemical reactions and their qualitative impacts on seal integrity and storage capacity at GCS sites from four points of view. First, studies on dissolution of feldspar, an important reservoir rock constituent, and subsequent secondary mineral precipitation are discussed, focusing on the effects of feldspar crystallography, cations, and sulfate anions.
Second, interfacial reactions between caprock and brine are introduced using model clay minerals, with focuses on the effects of water chemistries (salinity and organic ligands) and water content on mineral dissolution and

  6. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, application of newer more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrate sample preparation and analysis to enable on-line near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration, that applies to semiconductor materials, will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  7. Strategic Integration of Multiple Bioinformatics Resources for System Level Analysis of Biological Networks.

    Science.gov (United States)

    D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia

    2017-01-01

    Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) ( http://lynx.cri.uchicago.edu ), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.
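Enrichment analysis of the kind Lynx offers is commonly based on the hypergeometric tail probability; the sketch below is the generic formula, not Lynx's actual implementation, and the gene counts are hypothetical:

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    # P(X >= k) for a hypergeometric distribution:
    #   N genes in the genome, K of them in the pathway,
    #   n genes in the user's list, k of those hit the pathway
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 20,000 genes, a pathway of 100; a list of 50 genes hits the pathway 5 times
# (expected hits by chance: 50 * 100 / 20000 = 0.25)
p = enrichment_pvalue(20000, 100, 50, 5)
# p is far below 0.05, so the list is significantly enriched for the pathway
```

In practice the raw p-values are corrected for the number of gene sets tested (e.g. Benjamini-Hochberg) before being reported.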

  8. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    Science.gov (United States)

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The approach links hydro-chemical modelling with the economic costs of mitigation measures. Its utility was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using STREAM-N and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then the effectiveness of changes in land management was simulated. Costs were based on farm income foregone and on capital and operational expenditures. The cost and effect data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; it can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
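The core optimization step, choosing the least-cost combination of measures that still meets both nutrient reduction targets, can be sketched with a small exhaustive search. The measure names, costs, and reduction figures below are purely illustrative assumptions, not values from the Dee catchment study:

```python
from itertools import combinations

# Hypothetical measures: (name, annual cost, N reduction kg/yr, P reduction kg/yr).
MEASURES = [
    ("buffer_strips",      12000, 900, 40),
    ("cover_crops",         8000, 700, 10),
    ("reduced_fertiliser",  5000, 600,  5),
    ("wetland_creation",   20000, 400, 90),
    ("manure_management",   9000, 300, 30),
]

def cheapest_combination(n_target, p_target):
    """Exhaustively search subsets of measures that meet both nutrient
    reduction targets and return the least-cost one as (cost, names)."""
    best = None
    for r in range(1, len(MEASURES) + 1):
        for combo in combinations(MEASURES, r):
            cost = sum(m[1] for m in combo)
            n_red = sum(m[2] for m in combo)
            p_red = sum(m[3] for m in combo)
            if n_red >= n_target and p_red >= p_target:
                if best is None or cost < best[0]:
                    best = (cost, [m[0] for m in combo])
    return best

print(cheapest_combination(n_target=1200, p_target=40))
```

A real catchment analysis would use a proper optimizer (as the paper does with Risk Solver) and account for interactions between measures, but the structure of the decision problem is the same.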

  9. Integral cost-benefit analysis of Maglev technology under market imperfections

    NARCIS (Netherlands)

    Elhorst, J. Paul; Oosterhaven, Jan; Romp, Ward E.

    2001-01-01

    The aim of this article is to assess a proposed new mode of guided high-speed ground transportation, the magnetic levitation rail system (Maglev), and to compare the results of a partial cost-benefit analysis with those of an integral CBA. We deal with an urban conglomeration as well as a

  10. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Full Text Available Abstract Background Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied proteins/genes list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion atBioNet is a free web-based network analysis tool that provides a systematic insight into proteins/genes interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http

  11. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied proteins/genes list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into proteins/genes interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
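SCAN clusters a network by the structural similarity of node neighbourhoods rather than by mere connectivity. A simplified sketch of this idea (omitting SCAN's core/μ test and its hub/outlier classification, and using a made-up toy graph) might look like:

```python
from math import sqrt

def structural_similarity(adj, u, v):
    """SCAN-style structural similarity: overlap of the closed
    neighbourhoods of u and v, normalized by their sizes."""
    nu, nv = adj[u] | {u}, adj[v] | {v}
    return len(nu & nv) / sqrt(len(nu) * len(nv))

def scan_modules(edges, eps=0.7):
    """Group nodes into modules: keep only edges whose structural
    similarity meets eps, then take connected components (union-find)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    keep = [(u, v) for u, v in edges
            if structural_similarity(adj, u, v) >= eps]
    parent = {n: n for n in adj}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in keep:
        parent[find(u)] = find(v)
    modules = {}
    for n in adj:
        modules.setdefault(find(n), set()).add(n)
    return sorted(map(sorted, modules.values()))

# Two triangles joined by one bridge edge: the bridge has low
# structural similarity, so the triangles come out as two modules.
edges = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6)]
print(scan_modules(edges))
```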

  12. Improving horizontal completions on heterogeneous tight shales

    Energy Technology Data Exchange (ETDEWEB)

    Suarez-Rivera, Roberto; Deenadayalu, Chaitanya; Chertov, Maxim; Novalo Hartanto, Ricardo; Gathogo, Patrick [Schlumberger (United States); Kunjir, Rahul [University of Utah (United States)

    2011-07-01

    Evaluation of the two formation characteristics conducive to economic well production is important when tight shale formation characterization and completion design are being considered. This paper presents the basic understanding required to improve the efficiency of horizontal completions in oil and gas producing shales. Guidelines are defined for effective perforation and fracturing to improve the efficiency and sustainability of horizontal completions using extensive laboratory characterization of mechanical properties on core, core/log integration and continuous mapping of these properties by logging-while-drilling (LWD) methods. The objective is to improve completion design efficiency. This is accomplished by suitable selection of perforation intervals based on an understanding of the relevant physical processes and rock characterization. Conditions at two reservoir regions, the near-wellbore and the far-wellbore, are outlined and are essential to completion design. From the study, it can be concluded that tight shales are strongly anisotropic and cannot be approximated using isotropic models.

  13. A code to compute borehole fluid conductivity profiles with multiple feed points

    International Nuclear Information System (INIS)

    Hale, F.V.; Tsang, C.F.

    1988-03-01

    It is of much current interest to determine the flow characteristics of fractures intersecting a wellbore in order to understand the hydrologic behavior of fractured rocks. Often inflow from these fractures into the wellbore is at very low rates. A new procedure has been proposed and a corresponding method of analysis developed to obtain fracture inflow parameters from a time sequence of electric conductivity logs of the borehole fluid. The present report is a companion document to NTB--88-13 giving the details of equations and computer code used to compute borehole fluid conductivity distributions. Verification of the code used and a listing of the code are also given. (author) 9 refs., 5 figs., 7 tabs
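The essential physics the code models, fracture inflows mixing into the borehole fluid, can be illustrated with a steady-state mixing-cell sketch. Note this is a strong simplification: the actual code computes transient conductivity distributions, and the flow rates and conductivities below are hypothetical:

```python
def conductivity_profile(q_wellbore, c_initial, feed_points):
    """Steady-state mixing sketch: moving along the borehole, each
    fracture feed point (depth, inflow rate, inflow conductivity)
    mixes with the flow arriving from below.  feed_points must be
    ordered from bottom to top; units must be consistent."""
    profile = []
    Q, C = q_wellbore, c_initial
    for depth, q_in, c_in in feed_points:
        # Flow-weighted mixing of borehole fluid with fracture inflow.
        C = (Q * C + q_in * c_in) / (Q + q_in)
        Q += q_in
        profile.append((depth, round(C, 4)))
    return profile

# 1.0 L/min entering the bottom at 0.1 S/m, two feed points above it.
print(conductivity_profile(1.0, 0.1, [(800, 1.0, 0.5), (600, 2.0, 0.2)]))
```

Each tuple in the result gives the fluid conductivity just above the corresponding feed point; a logging tool run over time would see these steps develop as the inflows displace the original borehole fluid.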

  14. Integration of the ATLAS tag database with data management and analysis components

    Energy Technology Data Exchange (ETDEWEB)

    Cranshaw, J; Malon, D [Argonne National Laboratory, Argonne, IL 60439 (United States); Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C [Department of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ, Scotland (United Kingdom)], E-mail: c.nicholson@physics.gla.ac.uk

    2008-07-15

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.
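The event-level selection the Tag Database supports is, at heart, a relational query whose results are grouped back into files so that the DDM and DA systems can build a job list per input file. A toy sqlite3 sketch with a hypothetical tag schema (the real ATLAS tag attributes differ):

```python
import sqlite3

# One row per event, with a few summary quantities and the file
# (identified here by a GUID-like string) that holds the event.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tags
    (event_id INTEGER, n_muons INTEGER, missing_et REAL, guid TEXT)""")
conn.executemany("INSERT INTO tags VALUES (?,?,?,?)", [
    (1, 2, 55.0, "file_A"),
    (2, 0, 12.0, "file_A"),
    (3, 1, 80.0, "file_B"),
    (4, 2, 30.0, "file_C"),
])

# First-level cut on the relational database: keep only events passing
# the selection, grouped by file to drive downstream job submission.
rows = conn.execute("""
    SELECT guid, COUNT(*) FROM tags
    WHERE n_muons >= 1 AND missing_et > 40.0
    GROUP BY guid ORDER BY guid""").fetchall()
print(rows)  # events 1 and 3 survive
```

Only files that actually contain selected events need to be staged, which is the source of the input-size reduction the abstract describes.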

  15. Integration of the ATLAS tag database with data management and analysis components

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C

    2008-01-01

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted

  16. Living PRAs [probabilistic risk analysis] made easier with IRRAS [Integrated Reliability and Risk Analysis System

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1989-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is an integrated PRA software tool that gives the user the ability to create and analyze fault trees and accident sequences using an IBM-compatible microcomputer. This program provides functions that range from graphical fault tree and event tree construction to cut set generation and quantification. IRRAS contains all the capabilities and functions required to create, modify, reduce, and analyze event tree and fault tree models used in the analysis of complex systems and processes. IRRAS uses advanced graphic and analytical techniques to achieve the greatest possible realization of the potential of the microcomputer. When the needs of the user exceed this potential, IRRAS can call upon the power of the mainframe computer. The role of the Idaho National Engineering Laboratory in the IRRAS program is that of software developer and interface to the user community. Version 1.0 of the IRRAS program was released in February 1987 to prove the concept of performing this kind of analysis on microcomputers. This version contained many of the basic features needed for fault tree analysis and was received very well by the PRA community. Since the release of Version 1.0, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version is designated ''IRRAS 2.0''. Version 3.0 will contain all of the features required for efficient event tree and fault tree construction and analysis. 5 refs., 26 figs
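Cut set generation of the kind IRRAS performs can be sketched with a small top-down (MOCUS-style) expansion followed by minimality reduction. The fault tree below is a made-up example, not one from IRRAS:

```python
def cut_sets(gate, tree):
    """Top-down expansion: return the list of cut sets (frozensets of
    basic events) for the given gate.  tree maps gate names to
    ("AND"|"OR", [inputs]); inputs not in tree are basic events."""
    kind, inputs = tree[gate]
    child_sets = [
        cut_sets(i, tree) if i in tree else [frozenset([i])]
        for i in inputs
    ]
    if kind == "OR":
        # Any child's cut set fails the gate.
        return [cs for sets in child_sets for cs in sets]
    # AND: cross-product of the children's cut sets.
    result = [frozenset()]
    for sets in child_sets:
        result = [a | b for a in result for b in sets]
    return result

def minimal_cut_sets(gate, tree):
    """Drop any cut set that contains another as a proper subset."""
    sets_ = set(cut_sets(gate, tree))
    return sorted(
        sorted(s) for s in sets_
        if not any(o < s for o in sets_)
    )

# TOP fails if both subsystems fail; B is a shared (common-cause) event.
tree = {
    "TOP": ("AND", ["G1", "G2"]),
    "G1": ("OR", ["A", "B"]),
    "G2": ("OR", ["B", "C"]),
}
print(minimal_cut_sets("TOP", tree))
```

Production tools like IRRAS add quantification (combining basic-event probabilities over the minimal cut sets) and far more aggressive reduction, but the expansion logic is the same in spirit.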

  17. An expert system for integrated structural analysis and design optimization for aerospace structures

    Science.gov (United States)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization is presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide ability to build very large and practical expert systems, and provide an efficient way for storing knowledge. Functional specifications for the expert systems were then developed. 
The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and

  18. Game analysis of product-service integration

    Directory of Open Access Journals (Sweden)

    Heping Zhong

    2014-10-01

    Full Text Available Purpose: This paper aims at defining the value creation mechanism and income distribution strategies of product-service integration in order to promote product-service integration within a firm. Design/methodology/approach: This paper quantitatively investigates the coordination mechanism of product-service integration using game theory, and uses the Shapley value and equal growth rate methods to further discuss income distribution strategies of product-service integration. Findings: Product-service integration increases the total income of a firm; the added value of the income decreases as the unit-price demand variation coefficient of products and services increases, as the marginal cost of products increases, and as the marginal cost of services increases. Moreover, the findings suggest that income distribution strategies of product-service integration based on either the Shapley value method or the equal growth rate method can make the product department and service department of a firm both better off and realize a Pareto improvement. Which distribution strategy is chosen to coordinate the actions between departments depends on which department plays the dominant role in the firm. Generally speaking, for a firm at the center of a market, when the product department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the Shapley value method; when the service department is the main contributor to firm income, the service department will choose the income distribution strategy of product-service integration based on the equal growth rate method. Research limitations/implications: This paper makes some strict assumptions, such as complete information, risk neutrality, a linear cost function, and so on, and the discussion is limited to the simple relationship between the product department and the service department. Practical implications: Product
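The Shapley value distribution discussed above can be computed directly as each department's average marginal contribution over all orderings of the players. The income figures below are illustrative assumptions, not taken from the paper's model:

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value: average marginal contribution over all player
    orderings.  v maps frozensets of players to coalition income."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition |= {p}
    return {p: phi[p] / len(orders) for p in players}

# Hypothetical incomes: departments earn 60 and 40 operating alone;
# product-service integration lifts joint income to 120 (synergy of 20).
v = {
    frozenset(): 0,
    frozenset({"product"}): 60,
    frozenset({"service"}): 40,
    frozenset({"product", "service"}): 120,
}
print(shapley(["product", "service"], v))
```

With two players the Shapley value simply splits the synergy equally on top of each department's stand-alone income, which is why the dominant contributor may prefer it over an equal-growth-rate split.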

  19. Case for integral core-disruptive accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.; Bell, C.R.

    1985-01-01

    Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included

  20. Integrating model checking with HiP-HOPS in model-based safety analysis

    International Nuclear Information System (INIS)

    Sharvia, Septavera; Papadopoulos, Yiannis

    2015-01-01

    The ability to perform an effective and robust safety analysis on the design of modern safety-critical systems is crucial. Model-based safety analysis (MBSA) has been introduced in recent years to support the assessment of complex system design by focusing on the system model as the central artefact, and by automating the synthesis and analysis of failure-extended models. Model checking and failure logic synthesis and analysis (FLSA) are two prominent MBSA paradigms. Extensive research has placed emphasis on the development of these techniques, but discussion on their integration remains limited. In this paper, we propose a technique in which model checking and Hierarchically Performed Hazard Origin and Propagation Studies (HiP-HOPS) – an advanced FLSA technique – can be applied synergistically, to the benefit of the MBSA process. The application of the technique is illustrated through an example of a brake-by-wire system. - Highlights: • We propose a technique to integrate HiP-HOPS and model checking. • State machines can be systematically constructed from HiP-HOPS. • The strengths of different MBSA techniques are combined. • Demonstrated through modeling and analysis of a brake-by-wire system. • Root cause analysis is automated and system dynamic behaviors analyzed and verified

  1. Integrative analysis for finding genes and networks involved in diabetes and other complex diseases

    DEFF Research Database (Denmark)

    Bergholdt, R.; Størling, Zenia Marian; Hansen, Kasper Lage

    2007-01-01

    We have developed an integrative analysis method combining genetic interactions, identified using type 1 diabetes genome scan data, and a high-confidence human protein interaction network. Resulting networks were ranked by the significance of the enrichment of proteins from interacting regions. We identified a number of new protein network modules and novel candidate genes/proteins for type 1 diabetes. We propose this type of integrative analysis as a general method for the elucidation of genes and networks involved in diabetes and other complex diseases.

  2. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  3. Integrating forest inventory and analysis data into a LIDAR-based carbon monitoring system

    Science.gov (United States)

    Kristofer D. Johnson; Richard Birdsey; Andrew O Finley; Anu Swantaran; Ralph Dubayah; Craig Wayson; Rachel. Riemann

    2014-01-01

    Forest Inventory and Analysis (FIA) data may be a valuable component of a LIDAR-based carbon monitoring system, but integration of the two observation systems is not without challenges. To explore integration methods, two wall-to-wall LIDAR-derived biomass maps were compared to FIA data at both the plot and county levels in Anne Arundel and Howard Counties in Maryland...

  4. Solid waste integrated cost analysis model: 1991 project year report

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The purpose of the City of Houston's 1991 Solid Waste Integrated Cost Analysis Model (SWICAM) project was to continue the development of a computerized cost analysis model. This model is to provide solid waste managers with a tool to evaluate the dollar cost of real or hypothetical solid waste management choices. Those choices have become complicated by the implementation of Subtitle D of the Resource Conservation and Recovery Act (RCRA) and the EPA's Integrated Approach to managing municipal solid waste; that is, minimize generation, maximize recycling, reduce volume (incinerate), and then bury (landfill) only the remainder. Implementation of an integrated solid waste management system involving all or some of the options of recycling, waste-to-energy, composting, and landfilling is extremely complicated. Factors such as hauling distances, markets and prices for recyclables, costs and benefits of transfer stations, and material recovery facilities must all be considered. A jurisdiction must determine the cost impacts of implementing a number of various possibilities for managing, handling, processing, and disposing of waste. SWICAM employs a single Lotus 123 spreadsheet to enable a jurisdiction to predict or assess the costs of its waste management system. It allows the user to select his own process flow for waste material and to manipulate the model to include as few or as many options as he or she chooses. The model will calculate the estimated cost for those choices selected. The user can then change the model to include or exclude waste stream components, until the mix of choices suits the user. Graphs can be produced as a visual communication aid in presenting the results of the cost analysis. SWICAM also allows future cost projections to be made.
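The spreadsheet's core accounting, summing the net cost of each management option over the tonnage routed to it, can be sketched as follows. The unit costs and revenues are invented for illustration and are not SWICAM's figures:

```python
# Hypothetical unit costs ($/ton) for each management option.
UNIT_COST = {"recycle": 35.0, "incinerate": 60.0, "landfill": 28.0}
# Material sales revenue ($/ton) offsets the cost of some options.
REVENUE = {"recycle": 20.0}

def system_cost(tons_by_option):
    """Total annual cost of a chosen mix of management options,
    netting out revenues, in the spirit of the SWICAM spreadsheet."""
    total = 0.0
    for option, tons in tons_by_option.items():
        total += tons * (UNIT_COST[option] - REVENUE.get(option, 0.0))
    return total

# One candidate waste-stream allocation for a jurisdiction.
print(system_cost({"recycle": 10000, "incinerate": 5000, "landfill": 20000}))
```

Changing the allocation dictionary and re-evaluating mirrors the what-if workflow the spreadsheet supports: the user reroutes waste-stream components until the mix of choices suits them.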

  5. Respiromics – An integrative analysis linking mitochondrial bioenergetics to molecular signatures

    Directory of Open Access Journals (Sweden)

    Ellen Walheim

    2018-03-01

    Full Text Available Objective: Energy metabolism is challenged upon nutrient stress, eventually leading to a variety of metabolic diseases that represent a major global health burden. Methods: Here, we combine quantitative mitochondrial respirometry (Seahorse technology) and proteomics (LC-MS/MS-based total protein approach) to understand how molecular changes translate to changes in mitochondrial energy transduction during diet-induced obesity (DIO) in the liver. Results: The integrative analysis reveals that significantly increased palmitoyl-carnitine respiration is supported by an array of proteins enriching lipid metabolism pathways. Upstream of the respiratory chain, the increased capacity for ATP synthesis during DIO associates most strongly with mitochondrial uptake of pyruvate, which is routed towards carboxylation. At the respiratory chain, robust increases of complex I are uncovered by cumulative analysis of single subunit concentrations. Specifically, nuclear-encoded accessory subunits, but not mitochondrial-encoded or core units, appear to be permissive for enhanced lipid oxidation. Conclusion: Our integrative analysis, which we dubbed “respiromics”, represents an effective tool to link molecular changes to functional mechanisms in liver energy metabolism, and, more generally, can be applied for mitochondrial analysis in a variety of metabolic and mitochondrial disease models. Keywords: Mitochondria, Respirometry, Proteomics, Mitochondrial pyruvate carrier, Liver disease, Bioenergetics, Obesity, Diabetes

  6. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to determine its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the changing rule of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. A transient flow phenomenon occurs in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with the one under room temperature and the pressure fluctuation period is longer than the one under room temperature. Furthermore, the changing rule of pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with an increase in the valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.
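The reported trends (peak pressure rising with flow rate, falling with longer valve-opening time) are consistent with classical water-hammer estimates. A hedged first-order sketch using the Joukowsky relation, which is not the paper's loop-system model, with invented parameter values:

```python
def surge_pressure(rho, a, dv, pipe_length, valve_time):
    """First-order estimate of the transient pressure rise (Pa).
    Full Joukowsky surge rho*a*dv applies for 'rapid' valve action
    (valve_time <= 2L/a); for slower action the rise is reduced
    linearly, a common engineering rule of thumb.
    rho: fluid density (kg/m^3); a: wave speed (m/s);
    dv: velocity change (m/s); pipe_length: L (m)."""
    t_critical = 2.0 * pipe_length / a   # pressure-wave round-trip time
    dp = rho * a * dv
    if valve_time > t_critical:
        dp *= t_critical / valve_time
    return dp

# Rapid opening: full surge. Slow opening: surge reduced tenfold.
print(surge_pressure(rho=1000, a=1200, dv=2, pipe_length=30, valve_time=0.01))
print(surge_pressure(rho=1000, a=1200, dv=2, pipe_length=30, valve_time=0.5))
```

This captures the qualitative dependence the abstract reports; the paper's model additionally resolves the fluctuation period, which grows with loop pipe length through the wave round-trip time.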

  7. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios Ladoukakis

    2014-11-01

    Full Text Available The rapid evolution of sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. These technologies combine high-throughput analytical protocols with delicate measuring techniques in order to discover, properly assemble, and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger sequencing). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering management, and even storage, critical bottlenecks in the overall analytical endeavor. The enormous complexity is further aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, making cloud computing infrastructures the only realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control, and annotation of metagenomic data, embracing various major sequencing technologies and applications.

  8. Development of a 3-D flow analysis computer program for integral reactor

    International Nuclear Information System (INIS)

    Youn, H. Y.; Lee, K. H.; Kim, H. K.; Whang, Y. D.; Kim, H. C.

    2003-01-01

    A 3-D computational fluid dynamics program, TASS-3D, is being developed for the flow analysis of primary coolant systems with complex geometries, such as that of SMART. A pre/post processor is also being developed to reduce pre/post-processing work such as computational grid generation, setting up analysis conditions, and analyzing the calculated results. The TASS-3D solver employs a non-orthogonal coordinate system and an FVM based on a non-staggered grid system. The program includes various models to simulate the physical phenomena expected to occur in the integral reactor and will be coupled with the core dynamics code, core T/H code, and the secondary system code modules. Currently, the application of TASS-3D is limited to single-phase liquid flow, but the code will be further developed in the next stage to include the two-phase phenomena expected during normal operation and the various transients of the integral reactor.

  9. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

    INTRODUCTION: Porter's Five Forces, Porter's Diamond, PESTEL, the 6th Force, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. PURPOSE: A new integrated strategy framework (ISF) is proposed that combines all of these major concepts.

  10. Integrated Community Energy Systems: engineering analysis and design bibliography. [368 citations

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M.; Sapienza, G.R.

    1979-05-01

    This bibliography cites 368 documents that may be helpful in the planning, analysis, and design of Integrated Community Energy Systems. It has been prepared for use primarily by engineers and others involved in the development and implementation of ICES concepts. These documents include products of a number of Government research, development, demonstration, and commercialization programs; selected studies and references from the literature of various technical societies and institutions; and other selected material. The key programs which have produced cited reports are the Department of Energy Community Systems Program (DOE/CSP), the Department of Housing and Urban Development Modular Integrated Utility Systems Program (HUD/MIUS), and the Department of Health, Education, and Welfare Integrated Utility Systems Program (HEW/IUS). The cited documents address experience gained both in the U.S. and in other countries. Several general engineering references and bibliographies pertaining to technologies or analytical methods that may be helpful in the analysis and design of ICES are also included. The body of relevant literature is rapidly growing and future updates are therefore planned. Each citation includes identifying information, a source, descriptive information, and an abstract. The citations are indexed both by subjects and authors, and the subject index is extensively cross-referenced to simplify its use.

  11. Supercritical kinetic analysis in simplified system of fuel debris using integral kinetic model

    International Nuclear Information System (INIS)

    Tuya, Delgersaikhan; Obara, Toru

    2016-01-01

    Highlights: • Kinetic analysis in simplified weakly coupled fuel debris system was performed. • The integral kinetic model was used to simulate criticality accidents. • The fission power and released energy during simulated accident were obtained. • Coupling between debris regions and its effect on the fission power was obtained. - Abstract: Preliminary prompt supercritical kinetic analyses in a simplified coupled system of fuel debris designed to roughly resemble a melted core of a nuclear reactor were performed using an integral kinetic model. The integral kinetic model, which can describe region- and time-dependent fission rate in a coupled system of arbitrary geometry, was used because the fuel debris system is weakly coupled in terms of neutronics. The results revealed some important characteristics of coupled systems, such as the coupling between debris regions and the effect of the coupling on the fission rate and released energy in each debris region during the simulated criticality accident. In brief, this study showed that the integral kinetic model can be applied to supercritical kinetic analysis in fuel debris systems and also that it can be a useful tool for investigating the effect of the coupling on consequences of a supercritical accident.

  12. A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.

    Science.gov (United States)

    Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima

    2017-04-01

    According to the Centers for Disease Control, in the United States there are 6.8 million children living with asthma. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers to efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
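The mediation layer described above translates queries over a harmonized schema into queries over each source's native schema via logical schema mappings. A minimal Python sketch of that idea follows; the source and column names are illustrative assumptions, not taken from PRISMS:

```python
# Minimal sketch of a mediation layer: logical schema mappings rewrite a
# column list over the harmonized schema into each source's native names.
# Source and column names below are hypothetical, for illustration only.
MAPPINGS = {
    "wearable_a": {"pm25": "particulate_2_5", "hr": "heart_rate_bpm"},
    "fixed_site_b": {"pm25": "PM2.5_ugm3"},
}

def rewrite(query_cols, source):
    """Rewrite harmonized column names into the source's native names,
    dropping columns the source cannot answer."""
    m = MAPPINGS[source]
    return [m[c] for c in query_cols if c in m]

print(rewrite(["pm25", "hr"], "wearable_a"))    # ['particulate_2_5', 'heart_rate_bpm']
print(rewrite(["pm25", "hr"], "fixed_site_b"))  # ['PM2.5_ugm3']
```

A real mediator would also rewrite predicates and merge results back under the harmonized schema; this fragment only shows the mapping step.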

  13. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    Sevillano, M.G.; Garrido, I.; Garrido, A.J.

    2011-01-01

    The exponential growth in energy consumption has led to a renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, which has given way to an unprecedented interest in solving the different control problems existing in nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool to simplify the development of suitable controllers for Tokamaks. As a demonstrative case study showing the feasibility and value of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents original and innovative work in the Tokamak control area, and it provides new possibilities for the development and application of advanced control schemes within the standardized and widely used ASTRA transport code for Tokamaks. -- Highlights: → The paper presents a useful tool for rapid prototyping of different solutions to the control problems arising in Tokamaks. → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software. → This allows testing and combining diverse control schemes in a unified way, with ASTRA as the plant of the system. → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.

  14. Empirical Analysis of the Integration Activity of Business Structures in the Regions of Russia

    Directory of Open Access Journals (Sweden)

    Maria Gennadyevna Karelina

    2015-12-01

    Full Text Available The article investigates the integration activity of business structures in the regions of Russia. The wide variety of approaches to the study of the problems and prospects of economic integration, and the ongoing dispute on the role of integration processes in regional economic development, have required clarifying the concepts of “integration” and “integration activity” in order to establish objective grounds for analysing the integration activity of business structures in the Russian regions. Monitoring of the current legal system of the Russian Federation in the area of statistics and the compilation of statistical databases on mergers and acquisitions showed the absence of a formal executive authority responsible for compiling and collecting information on integration activity at the regional level. For this reason, the data of Russian information and analytical agencies form the information and analytical base of the study. As research tools, methods of analysis of structural changes, methods of analysis of economic differentiation and concentration, and methods of non-parametric statistics are used. The article shows the close relationship between the social and economic development of the subjects of Russia and the integrated business structures functioning on their territory. An investigation of the structure and dynamics of integration activity in the subjects of the Russian Federation, based on statistical data for the period from 2003 to 2012, revealed increasing heterogeneity of the integration activity of business structures in the regions of Russia. The hypothesis of a substantial divergence of mergers and acquisitions of corporate structures in the Russian regions was confirmed by the high values of the Gini coefficient, the Herfindahl index, and the decile coefficient of differentiation. The research results are of practical importance since they can be used to improve the existing

  15. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    Science.gov (United States)

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit to the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but it is exceptionally time consuming and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
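Case study (2) above exposes a module's parameters through a self-documenting command line interface. A minimal sketch of that pattern in Python follows; the module name and parameters are hypothetical, and real JIST modules are Java, so this only illustrates the interface style:

```python
import argparse

def segment(input_path, clusters, smoothing):
    # Placeholder for a wrapped image-analysis algorithm (hypothetical).
    return f"segmenting {input_path} into {clusters} classes (smoothing={smoothing})"

def build_parser():
    # Each exposed parameter carries a help string, so the `--help` output
    # doubles as the module's interface documentation.
    p = argparse.ArgumentParser(
        prog="segment-module",
        description="Hypothetical segmentation module exposed for scripting.")
    p.add_argument("input_path", help="path to the input image volume")
    p.add_argument("--clusters", type=int, default=3,
                   help="number of tissue classes to estimate")
    p.add_argument("--smoothing", type=float, default=0.1,
                   help="regularization weight for spatial smoothing")
    return p

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(segment(args.input_path, args.clusters, args.smoothing))
```

Because the parser declares every parameter with a help string and a type, scripting environments can both validate inputs and auto-generate usage documentation from the same declaration.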

  16. Urban water metabolism efficiency assessment: integrated analysis of available and virtual water.

    Science.gov (United States)

    Huang, Chu-Long; Vause, Jonathan; Ma, Hwong-Wen; Yu, Chang-Ping

    2013-05-01

    Resolving the complex environmental problems of water pollution and shortage which occur during urbanization requires the systematic assessment of urban water metabolism efficiency (WME). While previous research has tended to focus on either available or virtual water metabolism, here we argue that the systematic problems arising during urbanization require an integrated assessment of available and virtual WME, using an indicator system based on material flow analysis (MFA) results. Future research should focus on the following areas: 1) analysis of available and virtual water flow patterns and processes through urban districts in different urbanization phases in years with varying amounts of rainfall, and their environmental effects; 2) based on the optimization of social, economic and environmental benefits, establishment of an indicator system for urban WME assessment using MFA results; 3) integrated assessment of available and virtual WME in districts with different urbanization levels, to facilitate study of the interactions between the natural and social water cycles; 4) analysis of mechanisms driving differences in WME between districts with different urbanization levels, and the selection of dominant social and economic driving indicators, especially those impacting water resource consumption. Combinations of these driving indicators could then be used to design efficient water resource metabolism solutions, and integrated management policies for reduced water consumption. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. A hybrid approach to device integration on a genetic analysis platform

    International Nuclear Information System (INIS)

    Brennan, Des; Justice, John; Aherne, Margaret; Galvin, Paul; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Macek, Milan

    2012-01-01

    Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNiP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization. (paper)

  18. Integrative Genomic Analysis of Cholangiocarcinoma Identifies Distinct IDH-Mutant Molecular Profiles

    DEFF Research Database (Denmark)

    Farshidfar, Farshad; Zheng, Siyuan; Gingras, Marie-Claude

    2017-01-01

    Cholangiocarcinoma (CCA) is an aggressive malignancy of the bile ducts, with poor prognosis and limited treatment options. Here, we describe the integrated analysis of somatic mutations, RNA expression, copy number, and DNA methylation by The Cancer Genome Atlas of a set of predominantly intrahep...

  19. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.
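The core idea above, solving the equilibrium equations and compatibility conditions simultaneously with forces as the primary unknowns, can be shown on a toy statically indeterminate system. The two-spring example below is an illustrative sketch under simplifying assumptions, not the paper's finite element formulation:

```python
import numpy as np

# Two parallel springs (stiffnesses k1, k2) jointly carry a load P.
# Force method: the unknowns are the member forces F1, F2, governed by
#   equilibrium:    F1 + F2       = P
#   compatibility:  F1/k1 - F2/k2 = 0   (both springs elongate equally)
k1, k2, P = 200.0, 100.0, 30.0
A = np.array([[1.0,       1.0],
              [1.0 / k1, -1.0 / k2]])
b = np.array([P, 0.0])
F1, F2 = np.linalg.solve(A, b)
print(F1, F2)  # forces split in proportion to stiffness: 20.0, 10.0
```

The stiffness method would instead solve for the common displacement and recover forces afterwards; here the forces come out of the solve directly, which is the point of the force formulation.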

  20. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    Science.gov (United States)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  1. Patient Segmentation Analysis Offers Significant Benefits For Integrated Care And Support.

    Science.gov (United States)

    Vuik, Sabine I; Mayer, Erik K; Darzi, Ara

    2016-05-01

    Integrated care aims to organize care around the patient instead of the provider. It is therefore crucial to understand differences across patients and their needs. Segmentation analysis that uses big data can help divide a patient population into distinct groups, which can then be targeted with care models and intervention programs tailored to their needs. In this article we explore the potential applications of patient segmentation in integrated care. We propose a framework for population strategies in integrated care-whole populations, subpopulations, and high-risk populations-and show how patient segmentation can support these strategies. Through international case examples, we illustrate practical considerations such as choosing a segmentation logic, accessing data, and tailoring care models. Important issues for policy makers to consider are trade-offs between simplicity and precision, trade-offs between customized and off-the-shelf solutions, and the availability of linked data sets. We conclude that segmentation can provide many benefits to integrated care, and we encourage policy makers to support its use. Project HOPE—The People-to-People Health Foundation, Inc.

  2. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    Science.gov (United States)

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  3. Integrated thermal and nonthermal treatment technology and subsystem cost sensitivity analysis

    International Nuclear Information System (INIS)

    Harvego, L.A.; Schafer, J.J.

    1997-02-01

    The U.S. Department of Energy's (DOE) Environmental Management Office of Science and Technology (EM-50) authorized studies on alternative systems for treating contact-handled DOE mixed low-level radioactive waste (MLLW). The on-going Integrated Thermal Treatment Systems' (ITTS) and the Integrated Nonthermal Treatment Systems' (INTS) studies satisfy this request. EM-50 further authorized supporting studies including this technology and subsystem cost sensitivity analysis. This analysis identifies areas where technology development could have the greatest impact on total life cycle system costs. These areas are determined by evaluating the sensitivity of system life cycle costs relative to changes in life cycle component or phase costs, subsystem costs, contingency allowance, facility capacity, operating life, and disposal costs. For all treatment systems, the most cost sensitive life cycle phase is the operations and maintenance phase and the most cost sensitive subsystem is the receiving and inspection/preparation subsystem. These conclusions were unchanged when the sensitivity analysis was repeated on a present value basis. Opportunity exists for technology development to reduce waste receiving and inspection/preparation costs by effectively minimizing labor costs, the major cost driver, within the maintenance and operations phase of the life cycle

  4. Experiment 2008 – A Two Station Re-Measurement of the Geometry of the EE-3 Near Casing Fracture

    Energy Technology Data Exchange (ETDEWEB)

    Potter, Robert M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pearson, Christopher F. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    1982-03-10

    Analysis of the accelerometer system response to 11 microseismic events created in Experiment 2007 indicates that they are located in an ellipsoidal volume whose major axis trends N 48° E and dips 47° to the SW. The intermediate axis is essentially horizontal, and its direction, N 42° W, is the strike of the plane containing the two major axes. The dimensions of the three axes are 315, 100, and 65 m, respectively. The relationship of this seismic feature to the downhole wellbore map is shown in Figures 1 and 2. It will be noted that the ellipsoid is tangent to the injection point in EE-3 and descends at a 45° angle. The plan view shown in Figure 1 indicates that the zone of seismic activity nearly cut the EE-2 wellbore at a depth of 11500 ft (TVD). Examination of the EE-2 wellbore geology and drilling history shows a well-defined zone from 11450-11550 ft TVD with a very fast drilling rate (30 ft/hr) and extensive alteration. Laney labels it as a fault zone. This then could be an unpressurized part of the planar feature described above.

  5. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    Science.gov (United States)

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. The proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve the sensitivity of signal detection and the accuracy of sparse association map estimation. We illustrate smFARM with two integrative genomics analysis examples, a breast cancer dataset and an ovarian cancer dataset, assessing the relationship between DNA copy numbers and gene expression arrays to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12, whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  6. Integrated Data Collection Analysis (IDCA) Program — Ammonium Nitrate

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Redstone Arsenal, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-05-17

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of ammonium nitrate (AN). AN was tested, in most cases, both as received from the manufacturer and after drying/sieving. The participants found the AN to be: 1) insensitive in Type 12A impact testing (although with a wide range of values), 2) completely insensitive in BAM friction testing, 3) less sensitive than the RDX standard in ABL friction testing, 4) less sensitive than RDX in ABL ESD testing, and 5) less sensitive than RDX and PETN in DSC thermal analyses.

  7. Integrating Data Transformation in Principal Components Analysis

    KAUST Repository

    Maadooliat, Mehdi

    2015-01-02

    Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
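The approach described above, choosing a data transformation by maximum profile likelihood before PCA, can be sketched with a per-variable Box-Cox transform whose parameter is selected by grid search over its profile log-likelihood. This is an illustrative simplification of the idea, not the authors' model-based method:

```python
import numpy as np

def boxcox_loglik(x, lam):
    # Profile log-likelihood of the Box-Cox parameter for one positive variable.
    n = len(x)
    y = np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam
    return -0.5 * n * np.log(y.var()) + (lam - 1.0) * np.log(x).sum()

def transform_then_pca(X, lams=np.linspace(-2.0, 2.0, 81)):
    # For each column: pick the Box-Cox lambda maximizing the profile
    # likelihood, transform, and center. Then run ordinary PCA via SVD.
    cols, best = [], []
    for j in range(X.shape[1]):
        x = X[:, j]
        lam = lams[int(np.argmax([boxcox_loglik(x, l) for l in lams]))]
        best.append(lam)
        y = np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam
        cols.append(y - y.mean())
    Y = np.column_stack(cols)
    _, s, Vt = np.linalg.svd(Y, full_matrices=False)
    # Return selected lambdas, component variances, and principal axes.
    return np.array(best), s**2 / (len(X) - 1), Vt
```

For lognormal data the selected lambdas come out near zero, i.e. the procedure rediscovers the log transform before extracting components; the paper's method additionally handles functional data and missing values, which this sketch does not.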

  8. Painlevé analysis and integrability of two-coupled non-linear ...

    Indian Academy of Sciences (India)

    the Painlevé property. In this case the system is expected to be integrable. In recent years more attention is paid to the study of coupled non-linear oscilla- ... Painlevé analysis. To be self-contained, in §2 we briefly outline the salient features.

  9. Development of a three dimensional elastic plastic analysis system for the integrity evaluation of nuclear power plant components

    International Nuclear Information System (INIS)

    Huh, Nam Su; Im, Chang Ju; Kim, Young Jin; Pyo, Chang Ryul; Park, Chi Yong

    2000-01-01

    In order to evaluate the integrity of nuclear power plant components, analysis based on fracture mechanics is crucial. For this purpose, the finite element method is popularly used to obtain the J-integral. However, it is time consuming to design the finite element model of a cracked structure. Also, the J-integral should be verified by alternative methods since it may differ depending on the calculation method. The objective of this paper is to develop a three-dimensional elastic-plastic J-integral analysis system, named the EPAS program. The EPAS program consists of an automatic mesh generator for through-wall and surface cracks, a solver based on the ABAQUS program, and a J-integral calculation program providing both DI (Domain Integral) and EDI (Equivalent Domain Integral) based J-integral calculations. Using the EPAS program, an optimized finite element model for a cracked structure can be generated and the corresponding J-integral obtained subsequently.

  10. Identifiability of location and magnitude of flow barriers in slightly compressible flow

    NARCIS (Netherlands)

    Kahrobaei, S.; Mansoori Habib Abadi, M.; Joosten, G.J.P.; Hof, Van den P.M.J.; Jansen, J.D.

    2015-01-01

    Classic identifiability analysis of flow barriers in incompressible single-phase flow reveals that it is not possible to identify the location and permeability of low-permeability barriers from production data (wellbore pressures and rates), and that only averaged reservoir properties in between

  11. Identifiability of location and magnitude of flow barriers in slightly compressible flow

    NARCIS (Netherlands)

    Kahrobaei, S.; Mansoori Habib Abadi, M.; Joosten, G.J.P.; Van den Hof, P.; Jansen, J.D.

    2016-01-01

    Classic identifiability analysis of flow barriers in incompressible single-phase flow reveals that it is not possible to identify the location and permeability of low-permeability barriers from production data (wellbore pressures and rates), and that only averaged reservoir properties in between

  12. Case-study application of venture analysis: the integrated energy utility. Volume 3. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Fein, E; Gordon, T J; King, R; Kropp, F G; Shuchman, H L; Stover, J; Hausz, W; Meyer, C

    1978-11-01

    The appendices for a case-study application of venture analysis for an integrated energy utility for commercialization are presented. The following are included and discussed: utility interviews; net social benefits - quantitative calculations; the financial analysis model; market penetration decision model; international district heating systems; political and regulatory environment; institutional impacts.

  13. Optimization of the integration time of pulse shape analysis for dual-layer GSO detector with different amount of Ce

    International Nuclear Information System (INIS)

    Yamamoto, Seiichi

    2008-01-01

    For a multi-layer depth-of-interaction (DOI) detector using scintillators with different decay times, pulse shape analysis based on two different integration times is often used to distinguish the scintillators in the DOI direction. This method measures a partial integration and a full integration of the pulse and calculates the ratio of the two to obtain the pulse shape distribution. The full integration time is usually set to integrate the full width of the scintillation pulse, but the partial integration time that gives the best separation of the pulse shape distribution is not obvious. To clarify this, a theoretical analysis and experiments were conducted by changing the partial integration time, using a scintillation detector of GSOs with different amounts of Ce. A detector in which GSO of 1.5 mol% Ce (decay time: 35 ns) and GSO of 0.5 mol% Ce (decay time: 60 ns) were optically coupled to a 1-in. round photomultiplier tube (PMT) was used for the experiments. The signal from the PMT was digitally integrated with partial (50-150 ns) and full (160 ns) integration times, and the ratio of the two was calculated to obtain the pulse shape distribution. In the theoretical analysis, a partial integration time of 50 ns showed the largest distance between the two peaks of the pulse shape distribution; in the experiments, the distance was maximal at 70-80 ns of partial integration time. The peak-to-valley ratio was maximal at 120-130 ns. Because the separation of the two peaks is determined by the peak-to-valley ratio, we conclude that the optimum partial integration time for this combination of GSOs is around 120-130 ns, somewhat longer than the expected value.
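The ratio-of-integrals scheme in this abstract can be sketched numerically. The decay times (35 ns and 60 ns) and the 120 ns partial / 160 ns full windows follow the abstract, but the single-exponential pulse shape and noiseless 1 ns sampling are simplifying assumptions:

```python
import numpy as np

def pulse(decay_ns, t):
    # Idealized scintillation pulse: single-exponential decay (assumption).
    return np.exp(-t / decay_ns)

def shape_ratio(decay_ns, partial_ns=120.0, full_ns=160.0, dt=1.0):
    # Ratio of the partial integral to the full integral of the pulse.
    t = np.arange(0.0, full_ns, dt)
    s = pulse(decay_ns, t)
    return s[t < partial_ns].sum() / s.sum()

fast = shape_ratio(35.0)   # GSO with 1.5 mol% Ce, 35 ns decay
slow = shape_ratio(60.0)   # GSO with 0.5 mol% Ce, 60 ns decay
print(fast, slow)  # the faster scintillator deposits more of its charge early
```

Because the 35 ns pulse is almost fully contained in the partial window while the 60 ns pulse is not, the two scintillators map to distinct ratio values, which is exactly the separation the pulse shape distribution exploits; noise and light statistics broaden each value into the peaks discussed above.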

  14. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed with analytical tools such as the Finite Element Method (FEM) on digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time, and parallel processing systems such as ANUPAM provide an efficient platform for realising such computation. The development and implementation of software on parallel processing systems is an interesting and challenging task, in which the data and algorithm structure of the codes plays an important role in exploiting the system's capabilities. With respect to their implementation on parallel processing systems, FEM-based structural analysis codes can be divided into two categories. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, do require parallelisation of individual modules, and there the parallelisation of the equation solution module poses the major difficulties. Different solution schemes, such as the domain decomposition method (DDM), parallel active column solvers and substructuring, are currently used on parallel processing systems. Two codes, FAIR and TABS, one from each category, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  15. FACILITATING INTEGRATED SPATIO-TEMPORAL VISUALIZATION AND ANALYSIS OF HETEROGENEOUS ARCHAEOLOGICAL AND PALAEOENVIRONMENTAL RESEARCH DATA

    Directory of Open Access Journals (Sweden)

    C. Willmes

    2012-07-01

    Full Text Available In the context of the Collaborative Research Centre 806 "Our way to Europe" (CRC806), a research database is developed for integrating data from the disciplines of archaeology, the geosciences and the cultural sciences to facilitate integrated access to heterogeneous data sources. A practice-oriented data integration concept and its implementation are presented in this contribution. The data integration approach is based on the application of Semantic Web Technology and is applied to the domains of archaeological and palaeoenvironmental data. The aim is to provide integrated spatio-temporal access to an existing wealth of data to facilitate research on the integrated data basis. For the web portal of the CRC806 research database (CRC806-Database), a number of interfaces and applications have been evaluated, developed and implemented for exposing the data to interactive analysis and visualizations.

  16. Thermodynamic analysis and optimization of IT-SOFC-based integrated coal gasification fuel cell power plants

    NARCIS (Netherlands)

    Romano, M.C.; Campanari, S.; Spallina, V.; Lozza, G.

    2011-01-01

    This work discusses the thermodynamic analysis of integrated gasification fuel cell plants, where a simple cycle gas turbine works in a hybrid cycle with a pressurized intermediate temperature–solid oxide fuel cell (SOFC), integrated with a coal gasification and syngas cleanup island and a bottoming

  17. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...
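    The heat cascade calculation underlying pinch analysis can be illustrated with the classic problem-table algorithm. The stream data below are made up for illustration and are not taken from the SMR process in the paper:

```python
# Problem-table heat cascade for pinch analysis (illustrative stream data).
dTmin = 10.0  # minimum approach temperature, K

# (supply T in degC, target T in degC, heat-capacity flowrate CP in kW/K)
hot_streams = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]
cold_streams = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]

# Shift hot streams down and cold streams up by dTmin/2
shifted = [(ts - dTmin / 2, tt - dTmin / 2, cp, +1) for ts, tt, cp in hot_streams]
shifted += [(ts + dTmin / 2, tt + dTmin / 2, cp, -1) for ts, tt, cp in cold_streams]

bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

# Net heat surplus of each shifted-temperature interval (hot +, cold -)
surplus = []
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for ts, tt, cp, sign in shifted:
        top, bot = max(ts, tt), min(ts, tt)
        overlap = max(0.0, min(hi, top) - max(lo, bot))
        net += sign * cp * overlap
    surplus.append(net)

# Cascade the surpluses; the minimum hot utility is whatever keeps the
# cumulative heat flow non-negative everywhere.
cascade = [0.0]
for q in surplus:
    cascade.append(cascade[-1] + q)
q_hot_min = -min(cascade)
q_cold_min = cascade[-1] + q_hot_min
pinch_T = bounds[cascade.index(min(cascade))]
print(round(q_hot_min, 3), round(q_cold_min, 3), pinch_T)
```

    With these made-up streams the cascade gives a minimum hot utility of 7.5 kW, a minimum cold utility of 10.0 kW, and a shifted pinch temperature of 145 degC; the same mechanics scale to a full SMR flowsheet.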

  18. Integrate life-cycle assessment and risk analysis results, not methods.

    Science.gov (United States)

    Linkov, Igor; Trump, Benjamin D; Wender, Ben A; Seager, Thomas P; Kennedy, Alan J; Keisler, Jeffrey M

    2017-08-04

    Two analytic perspectives on environmental assessment dominate environmental policy and decision-making: risk analysis (RA) and life-cycle assessment (LCA). RA focuses on management of a toxicological hazard in a specific exposure scenario, while LCA seeks a holistic estimation of impacts of thousands of substances across multiple media, including non-toxicological and non-chemically deleterious effects. While recommendations to integrate the two approaches have remained a consistent feature of environmental scholarship for at least 15 years, the current perception is that progress is slow largely because of practical obstacles, such as a lack of data, rather than insurmountable theoretical difficulties. Nonetheless, the emergence of nanotechnology presents a serious challenge to both perspectives. Because the pace of nanomaterial innovation far outstrips acquisition of environmentally relevant data, it is now clear that a further integration of RA and LCA based on dataset completion will remain futile. In fact, the two approaches are suited for different purposes and answer different questions. A more pragmatic approach to providing better guidance to decision-makers is to apply the two methods in parallel, integrating only after obtaining separate results.

  19. Simulation, integration, and economic analysis of gas-to-liquid processes

    International Nuclear Information System (INIS)

    Bao, Buping; El-Halwagi, Mahmoud M.; Elbashir, Nimir O.

    2010-01-01

    Gas-to-liquid (GTL) involves the chemical conversion of natural gas into synthetic crude that can be upgraded and separated into different useful hydrocarbon fractions including liquid transportation fuels. Such technology can also be used to convert other abundant natural resources such as coal and biomass to fuels and value-added chemicals (referred to as coal-to-liquid (CTL) and biomass-to-liquid (BTL)). A leading GTL technology is the Fischer-Tropsch (FT) process. The objective of this work is to provide a techno-economic analysis of the GTL process and to identify optimization and integration opportunities for cost saving and reduction of energy usage while accounting for the environmental impact. First, a base-case flowsheet is synthesized to include the key processing steps of the plant. Then, a computer-aided process simulation is carried out to determine the key mass and energy flows, performance criteria, and equipment specifications. Next, energy and mass integration studies are performed to address the following items: (a) heating and cooling utilities, (b) combined heat and power (process cogeneration), (c) management of process water, (d) optimization of tail gas allocation, and (e) recovery of catalyst-supporting hydrocarbon solvents. These integration studies are then conducted and the results documented in terms of conserving energy and mass resources as well as their economic impact. Finally, an economic analysis is undertaken to determine the plant capacity needed to achieve the break-even point and to estimate the return on investment for the base-case study. (author)

  20. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), dedicated software supporting studies of climate and environmental change is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies following Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS Framework (http://www.sencha.com/products/extjs) and the OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface, or can be accessed directly through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring

  1. Interpretation of horizontal well production logs: influence of logging tool

    Energy Technology Data Exchange (ETDEWEB)

    Ozkan, E. [Colorado School of Mines, Boulder, CO (United States); Sarica, C. [Pennsylvania State Univ., College Park, PA (United States); Haci, M. [Drilling Measurements, Inc (United States)

    1998-12-31

    The influence of a production-logging tool on wellbore flow rate and pressure measurements was investigated, focusing on the disturbance caused by the production-logging tool and the coiled tubing to the original flow conditions in the wellbore. The investigation was carried out using an analytical model, and single-phase liquid flow was assumed. Results showed that the production-logging tool influences the measurements, as shown by deviations from the original flow-rate and pressure profiles, particularly in low-conductivity wellbores. High production rates increase the effect of the production-logging tool. Recovering or inferring the original flow conditions in the wellbore from production-logging data is a very complex process which cannot be solved easily. For this reason, the conditions under which the information obtained by production logging is meaningful are of considerable practical interest. 7 refs., 2 tabs., 15 figs.

  2. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    Science.gov (United States)

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  3. Mechanisms and mediation in survival analysis: towards an integrated analytical framework.

    LENUS (Irish Health Repository)

    Haase, Trutz

    2016-02-29

    A wide-ranging debate has taken place in recent years on mediation analysis and causal modelling, raising profound theoretical, philosophical and methodological questions. The authors build on the results of these discussions to work towards an integrated approach to the analysis of research questions that situate survival outcomes in relation to complex causal pathways with multiple mediators. The background to this contribution is the increasingly urgent need for policy-relevant research on the nature of inequalities in health and healthcare.

  4. Integration of XRootD into the cloud infrastructure for ALICE data analysis

    Science.gov (United States)

    Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey

    2015-12-01

    Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage by local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). A key feature of the solution is that Ceph serves as the backend for the Cinder Block Storage service of OpenStack and, at the same time, as the storage backend for XRootD, with redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution, based on the Puppet configuration management system, was applied. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and by small organizations, which usually lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.

  5. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
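    As an illustration of the shrunken-centroid style of feature selection named above, here is a minimal Nearest Shrunken Centroids (NSC) sketch on a made-up abundance matrix (not the study's data); the soft-thresholding step zeroes out proteins whose class centroids do not deviate enough from the overall centroid:

```python
import numpy as np

# Toy abundance matrix: 6 samples (3 per class) x 4 proteins.
# Proteins 0 and 1 differ between classes; proteins 2 and 3 do not.
X = np.array([
    [5.0, 1.0, 3.0, 2.2],
    [5.5, 1.2, 3.1, 2.0],
    [4.8, 0.9, 2.9, 2.1],
    [1.0, 4.0, 3.0, 2.0],
    [1.2, 4.4, 3.2, 2.3],
    [0.9, 3.8, 2.8, 1.9],
])
y = np.array([0, 0, 0, 1, 1, 1])

def nsc_select(X, y, delta):
    """Return indices of features whose shrunken centroid score survives
    soft-thresholding by delta (Tibshirani-style nearest shrunken centroids)."""
    classes = np.unique(y)
    overall = X.mean(axis=0)
    # Pooled within-class standard deviation per feature
    ss = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).sum(axis=0) for c in classes)
    s = np.sqrt(ss / (len(X) - len(classes)))
    s0 = np.median(s)  # fudge factor guarding against tiny variances
    keep = np.zeros(X.shape[1], dtype=bool)
    for c in classes:
        nk = (y == c).sum()
        mk = np.sqrt(1.0 / nk - 1.0 / len(X))
        d = (X[y == c].mean(axis=0) - overall) / (mk * (s + s0))
        d_shrunk = np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)  # soft threshold
        keep |= d_shrunk != 0
    return np.flatnonzero(keep)

selected = nsc_select(X, y, delta=1.0)
print(selected)  # indices of the proteins that survive shrinkage
```

    On this toy matrix only the two differentially abundant proteins survive; in the study the same idea, alongside Beta-binomial and SVM-RFE scoring, is what narrows thousands of proteins to a candidate panel.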

  6. Integrated severe accident containment analysis with the CONTAIN computer code

    International Nuclear Information System (INIS)

    Bergeron, K.D.; Williams, D.C.; Rexroth, P.E.; Tills, J.L.

    1985-12-01

    Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and the performance of engineered safety features. In the past, this complexity has meant that a complete containment analysis required the application of suites of separate computer codes, each of which treated only a narrow subset of these phenomena, e.g., a thermal-hydraulics code, an aerosol code, a core-concrete interaction code, etc. In this paper, we describe the development and some recent applications of the CONTAIN code, which offers an integrated treatment of the dominant containment phenomena and the interactions among them. We describe the results of a series of containment phenomenology studies based upon realistic accident sequence analyses in actual plants. These calculations highlight various phenomenological effects that have potentially important implications for source term and/or containment loading issues, and which are difficult or impossible to treat using a less integrated code suite.

  7. Advanced GPR imaging of sedimentary features: integrated attribute analysis applied to sand dunes

    Science.gov (United States)

    Zhao, Wenke; Forte, Emanuele; Fontolan, Giorgio; Pipan, Michele

    2018-04-01

    We evaluate the applicability and the effectiveness of integrated GPR attribute analysis to image the internal sedimentary features of the Piscinas Dunes, SW Sardinia, Italy. The main objective is to explore the limits of GPR techniques in studying sediment-body geometry and to provide a non-invasive, high-resolution characterization of the different subsurface domains of the dune architecture. For this purpose, we exploit the high-quality Piscinas dataset to extract and test different attributes of the GPR trace. Composite displays of multi-attributes related to amplitude, frequency, similarity and textural features are presented with overlays and RGB mixed models. A multi-attribute comparative analysis is used to characterize different radar facies and to better understand the characteristics of internal reflection patterns. The results demonstrate that the proposed integrated GPR attribute analysis can provide enhanced information about the spatial distribution of sediment bodies, allowing an enhanced and more constrained data interpretation.
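    A multi-attribute RGB composite of the kind described can be sketched as follows. The section is random stand-in data, and the three attributes (sliding-window RMS amplitude, zero-crossing rate as a frequency proxy, and cosine similarity to the neighbouring trace) are simple stand-ins for the amplitude, frequency and similarity attributes used by the authors:

```python
import numpy as np

rng = np.random.default_rng(1)
section = rng.normal(size=(64, 200))  # stand-in GPR section: traces x time samples

def sliding(x, w=15):
    """Centred sliding windows of length w, edges padded by reflection."""
    pad = np.pad(x, (w // 2, w // 2), mode="reflect")
    return np.lib.stride_tricks.sliding_window_view(pad, w)[: len(x)]

def normalize(a):
    return (a - a.min()) / (a.max() - a.min() + 1e-12)

rgb = np.zeros(section.shape + (3,))
for i, trace in enumerate(section):
    wins = sliding(trace)
    nbr_wins = sliding(section[max(i - 1, 0)])  # previous trace for similarity
    rms = np.sqrt((wins ** 2).mean(axis=1))  # amplitude attribute
    signs = np.signbit(wins).astype(np.int8)
    zcr = (np.diff(signs, axis=1) != 0).mean(axis=1)  # zero-crossing rate, frequency proxy
    sim = (wins * nbr_wins).sum(axis=1) / (
        np.linalg.norm(wins, axis=1) * np.linalg.norm(nbr_wins, axis=1) + 1e-12
    )  # cosine-similarity attribute
    rgb[i] = np.stack([normalize(rms), normalize(zcr), normalize(sim)], axis=1)

print(rgb.shape)  # one RGB triplet per sample, ready for image display
```

    Each attribute is normalized per trace and mapped to one colour channel, so that radar facies with distinct amplitude/frequency/similarity signatures appear as distinct hues in the blended image.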

  8. The Holistic Integrity Test (HIT) - quantified resilience analysis

    Directory of Open Access Journals (Sweden)

    Dobson Mike

    2016-01-01

    Full Text Available The Holistic Integrity Test (HIT) - Quantified Resilience Analysis. Rising sea levels and wider climate change mean we face an increasing risk from flooding and other natural hazards. Tough economic times make it difficult to economically justify, or afford, the desired level of engineered risk reduction. Added to this is significant uncertainty from a range of future predictions, constantly updated with new science. We therefore need to understand not just how to reduce the risk, but what could happen should above-design-standard events occur. In flood terms this includes not only the direct impacts (damage and loss of life), but also the wider cascade impacts on infrastructure systems and the longer-term impacts on the economy and society. However, understanding the "what if" is only the first part of the equation; a range of improvement measures to mitigate such effects needs to be identified and implemented. These measures should consider reducing the risk, lessening the consequences, aiding the response, and speeding up the recovery, and they need to be objectively assessed through quantitative analysis, which underpins them technically and economically. Without such analysis, it cannot be predicted how the measures will perform if extreme events occur. It is also vital to consider all possible hazards, as measures for one hazard may hinder the response to another. The Holistic Integrity Test (HIT) uses quantitative system analysis and "HITs" the site, its infrastructure, contained dangers and the wider regional system to determine how it copes with a range of severe shock events Before, During and After the event, whilst also accounting for uncertainty (as illustrated in figure 1). First presented at the TINCE 2014 Nuclear Conference in Paris in terms of a nuclear facility needing to analyse its site in response to post-Fukushima needs, the HIT is nevertheless universally applicable. The HIT has three key risk reduction goals: The

  9. Water coning in porous media reservoirs for compressed air energy storage

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, L.E.; McCann, R.A.

    1981-06-01

    The general purpose of this work is to define the hydrodynamic and thermodynamic response of a CAES porous media reservoir subjected to simulated air mass cycling. This research will assist in providing design guidelines for the efficient and stable operation of the air storage reservoir. This report presents the analysis and results for the two-phase (air-water), two-dimensional, numerical modeling of CAES porous media reservoirs. The effects of capillary pressure and relative permeability were included. The fluids were considered to be immiscible; there was no phase change; and the system was isothermal. The specific purpose of this analysis was to evaluate the reservoir parameters believed to be important to water coning. This phenomenon may occur in reservoirs in which water underlies the air storage zone, and it involves the possible intrusion of water into the wellbore or near-wellbore region. The water movement is in response to pressure gradients created during a reservoir discharge cycle. Potential adverse effects of this water movement are associated with the pressure response of the reservoir and the geochemical stability of the near-wellbore region. The results obtained for the simulated operation of a CAES reservoir suggest that water coning should not be a severe problem, due to the slow response of the water to the pressure gradients and the relatively short duration in which those gradients exist. However, water coning will depend on site-specific conditions, particularly the fluid distributions following bubble development, and, therefore, a water coning analysis should be included as part of site evaluation.

  10. Proceedings of ITOHOS 2008 : The 2008 SPE/PS/CHOA International Thermal Operations and Heavy Oil Symposium : Heavy Oil : Integrating the Pieces

    International Nuclear Information System (INIS)

    2008-10-01

    This multi-disciplinary conference and exhibition combined the Society of Petroleum Engineers (SPE) and the Petroleum Society's (PS) international thermal operations and heavy oil symposium, and the Canadian Heavy Oil Association's (CHOA) annual business meeting. The conference provided a forum to examine emerging technologies and other critical issues affecting the global heavy oil and bitumen industry. The most current technologies from around the world that enhance the recovery of heavy oil and bitumen from oil sand deposits were also showcased. The technical program encompassed the economic, technical, and environmental challenges that the petroleum industry is currently facing. The sessions of the conference were entitled: artificial lift; mining, extraction and cold production; simulation; solvent processes; reservoir characterization; steam generation and water treatment; and, in-situ combustion in Canada. The conference also featured a series of short courses and tutorials on heavy oil wellbore completions and design; drilling horizontal heavy oil wells and steam assisted gravity drainage (SAGD) wells; geomechanical based reservoir monitoring; thermal well design; fiber optic thermal monitoring; heavy oil thermal recovery and economics; wellbore slotting; advanced geomechanics; and, an overview of cold heavy oil production with sand (CHOPS). All 91 presentations from the conference have been catalogued separately for inclusion in this database. refs., tabs., figs

  11. Proceedings of ITOHOS 2008 : The 2008 SPE/PS/CHOA International Thermal Operations and Heavy Oil Symposium : Heavy Oil : Integrating the Pieces

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-10-15

    This multi-disciplinary conference and exhibition combined the Society of Petroleum Engineers (SPE) and the Petroleum Society's (PS) international thermal operations and heavy oil symposium, and the Canadian Heavy Oil Association's (CHOA) annual business meeting. The conference provided a forum to examine emerging technologies and other critical issues affecting the global heavy oil and bitumen industry. The most current technologies from around the world that enhance the recovery of heavy oil and bitumen from oil sand deposits were also showcased. The technical program encompassed the economic, technical, and environmental challenges that the petroleum industry is currently facing. The sessions of the conference were entitled: artificial lift; mining, extraction and cold production; simulation; solvent processes; reservoir characterization; steam generation and water treatment; and, in-situ combustion in Canada. The conference also featured a series of short courses and tutorials on heavy oil wellbore completions and design; drilling horizontal heavy oil wells and steam assisted gravity drainage (SAGD) wells; geomechanical based reservoir monitoring; thermal well design; fiber optic thermal monitoring; heavy oil thermal recovery and economics; wellbore slotting; advanced geomechanics; and, an overview of cold heavy oil production with sand (CHOPS). All 91 presentations from the conference have been catalogued separately for inclusion in this database. refs., tabs., figs.

  12. Integrated sudomotor axon reflex sweat stimulation for continuous sweat analyte analysis with individuals at rest.

    Science.gov (United States)

    Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason

    2017-07-25

    Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes, with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), the lack of continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is the integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants and sensors without cross-contamination. This integration approach is compatible both with sensors which consume the analyte (enzymatic) and with sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those working on on-skin sensing integrated with drug delivery.

  13. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    International Nuclear Information System (INIS)

    Lythcke-Jørgensen, Christoffer; Haglind, Fredrik; Clausen, Lasse R.

    2014-01-01

    Highlights: • We model a system in which lignocellulosic ethanol production is integrated with a combined heat and power (CHP) plant. • We conduct an exergy analysis of the ethanol production at six different system operation points. • Integrated operation, district heating (DH) production and low CHP loads all increase the exergy efficiency. • Separate operation has the largest negative impact on the exergy efficiency. • Operation is found to have a significant impact on the exergy efficiency of the ethanol production. - Abstract: Lignocellulosic ethanol production is often assumed to be integrated in polygeneration systems because of its energy-intensive nature. The objective of this study is to investigate the potential irreversibilities of such integration and their impact on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated into an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible, and a gas boiler is used as back-up when integration is not possible. The system was evaluated at six operation points that vary three operation parameters: load in the CHP unit, integrated versus separate operation, and inclusion of district heating production in the ethanol facility. The calculated standard exergy efficiency of the ethanol facility varied from 0.564 to 0.855; the highest value was obtained for integrated operation at reduced CHP load with full district heating production in the ethanol facility, and the lowest for separate operation with zero district heating production. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the
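    The standard exergy efficiency reported above is simply the exergy of the useful outputs over the exergy of the inputs. A minimal sketch, with made-up exergy flows rather than the paper's values:

```python
# Standard exergy efficiency of an ethanol facility: exergy out / exergy in.
# All flows are illustrative numbers in MW of exergy, not values from the study.
inputs = {"straw": 120.0, "extraction steam": 25.0, "electricity": 5.0}
outputs = {"ethanol": 90.0, "district heating": 12.0, "biogas": 15.0}

eta_ex = sum(outputs.values()) / sum(inputs.values())
print(f"standard exergy efficiency: {eta_ex:.3f}")  # 117/150 = 0.780
```

    The paper's 0.564-0.855 spread comes entirely from how the terms of this ratio change with operation (CHP load, integrated vs. separate steam supply, district heating production).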

  14. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed to meet these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultraviolet, Fourier-transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization (FID) configuration that provides a validated screening analysis for total extractable hydrocarbons within ca. 5--10 min, as well as a full qualitative/quantitative analysis in 25--30 min. Data analysis using optional expert system and neural network software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments

  15. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as "platform") developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor (KIc). The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
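    The final fracture-mechanics comparison can be illustrated with the textbook relation K_I = Y·σ·√(πa). The stress, geometry factor and toughness values below are illustrative, not taken from the paper (which evaluates K_I with weight functions rather than this closed form):

```python
import math

def k_i(sigma_mpa, a_m, Y=1.12):
    """Mode-I stress intensity factor K_I = Y * sigma * sqrt(pi * a), MPa*sqrt(m)."""
    return Y * sigma_mpa * math.sqrt(math.pi * a_m)

sigma = 180.0  # combined pressure + thermal hoop stress, MPa (illustrative)
k_ic = 110.0   # fracture toughness at the transient wall temperature, MPa*sqrt(m)

# Screen a set of postulated crack depths against the toughness
for a_mm in (5, 10, 20, 40, 100):
    ki = k_i(sigma, a_mm / 1000.0)
    status = "acceptable" if ki < k_ic else "K_I exceeds K_Ic"
    print(f"crack depth {a_mm:3d} mm: K_I = {ki:5.1f} MPa*sqrt(m) -> {status}")
```

    In the platform, σ comes from the FE stress field driven by the CFD wall temperatures, and K_Ic from the irradiated material toughness curve; the structure of the acceptance check (K_I versus K_Ic per postulated crack) is the same.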

  16. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis includes NDE and materials degradation as well. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion.

  17. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    1996-01-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis includes NDE and materials degradation as well. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion.

  18. Analysis and interpretation of geophysical surveys in archaeological sites employing different integrated approach.

    Science.gov (United States)

    Piro, Salvatore; Papale, Enrico; Kucukdemirci, Melda; Zamuner, Daniela

    2017-04-01

    dating from the third century B.C. The second site, also in a suburban area, is part of the acropolis of the ancient Etruscan town of Cerveteri (central Italy). The third site is part of the Aizanoi archaeological park (Cavdarhisar, Kutahya, Turkey). To reach a better understanding of the subsurface, we applied different integrated approaches to these data, fusing the data from all the employed methods to obtain a complete visualization of the investigated area. For the processing we used the following techniques: graphical integration (overlay and RGB colour composite), discrete data analysis (binary data analysis and cluster analysis) and continuous data analysis (data sum, product, max, min and PCA). References: Ernenwein, E.G. 2009. Integration of multidimensional archaeogeophysical data using supervised and unsupervised classification. Near Surface Geophysics 7: 147-158. DOI:10.3997/1873-0604.2009004. Kucukdemirci, M., Piro, S., Baydemir, N., Ozer, E., Zamuner, D. 2015. Mathematical and statistical integration approach on archaeological prospection data, case studies from Aizanoi, Turkey. 43rd Computer Applications and Quantitative Methods in Archaeology, Siena. Kvamme, K. 2007. Integrating multiple geophysical datasets. In Remote Sensing in Archaeology, Springer, Boston. Piro, S., Mauriello, P. and Cammarano, F. 2000. Quantitative integration of geophysical methods for archaeological prospection. Archaeological Prospection 7(4): 203-213. Piro, S., Papale, E., Zamuner, D. 2016. Different integrated geophysical approaches to investigate archaeological sites in urban and suburban areas. Geophysical Research Abstracts Vol. 18, EGU2016.
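The "continuous data analysis" methods the abstract lists (data sum, product, max/min and PCA) can be sketched on co-registered survey grids. This is a hedged Python/NumPy illustration: the random arrays stand in for (for example) GPR, magnetometry and resistivity maps, which in a real survey would first be resampled to a common grid.

```python
# Minimal sketch of continuous-data fusion for three co-registered
# geophysical grids. Random grids are placeholders for real survey maps.
import numpy as np

rng = np.random.default_rng(0)
gpr, mag, res = (rng.random((50, 50)) for _ in range(3))

def norm(g):
    """Rescale a grid to [0, 1] so no single method dominates the fusion."""
    return (g - g.min()) / (g.max() - g.min())

stack = np.stack([norm(gpr), norm(mag), norm(res)])  # shape (3, 50, 50)

fused_sum  = stack.sum(axis=0)    # additive fusion
fused_prod = stack.prod(axis=0)   # multiplicative fusion (joint anomalies)
fused_max  = stack.max(axis=0)    # strongest response per cell

# PCA fusion: first principal component of the 3 measurements per cell.
flat = stack.reshape(3, -1).T            # (2500, 3) samples
flat = flat - flat.mean(axis=0)
_, _, vt = np.linalg.svd(flat, full_matrices=False)
pc1 = (flat @ vt[0]).reshape(50, 50)     # fused PCA map

print(fused_sum.shape, pc1.shape)
```

The product map tends to highlight only anomalies seen by all methods, while the max map preserves anomalies detected by any single method; PCA gives a variance-weighted compromise.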

  19. Exergy analysis of a combined heat and power plant with integrated lignocellulosic ethanol production

    DEFF Research Database (Denmark)

    Lythcke-Jørgensen, Christoffer Ernst; Haglind, Fredrik; Clausen, Lasse Røngaard

    2014-01-01

    Lignocellulosic ethanol production is often assumed integrated in polygeneration systems because of its energy intensive nature. The objective of this study is to investigate potential irreversibilities from such integration, and what impact it has on the efficiency of the integrated ethanol production. An exergy analysis is carried out for a modelled polygeneration system in which lignocellulosic ethanol production based on hydrothermal pretreatment is integrated in an existing combined heat and power (CHP) plant. The ethanol facility is driven by steam extracted from the CHP unit when feasible... district heating production in the ethanol facility. The results suggest that the efficiency of integrating lignocellulosic ethanol production in CHP plants is highly dependent on operation, and it is therefore suggested that the expected operation pattern of such polygeneration system is taken...

  20. Comprehensive understandings of energy confinement in LHD plasmas through extensive application of the integrated transport analysis suite

    International Nuclear Information System (INIS)

    Yokoyama, M.; Seki, R.; Suzuki, C.; Ida, K.; Osakabe, M.; Satake, S.; Yamada, H.; Murakami, S.

    2014-10-01

    The integrated transport analysis suite, TASK3D-a, has enhanced energy transport analyses in LHD. It has clearly elucidated (1) the systematic dependence of ion and electron energy confinement on a wide variation of plasma parameters, and (2) statistically derived fitting expressions for the ion and electron heat diffusivities (χ_i and χ_e), separately, also taking radial-profile information into account. In particular, the latter approach can outstrip the conventional scaling laws for the global confinement time (τ_E) through its consideration of profiles (temperature, density, heating depositions, etc.). This has been made possible by the analysis database accumulated through extensive application of the integrated transport analysis suite to experimental data. In these proceedings, the TASK3D-a analysis database for high-ion-temperature (high-T_i) plasmas in LHD (Large Helical Device) is exemplified. This approach should be applicable to any other combination of integrated transport analysis suites and fusion experiments. (author)
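The statistical step described above, deriving a fitting expression for a heat diffusivity from an analysis database, can be sketched as a log-linear regression. This is a hedged illustration: the power-law form, the regression variables and the synthetic "database" are assumptions, not TASK3D-a's actual variables or coefficients.

```python
# Hedged sketch: recover a power-law fitting expression
# chi = C * Te^a * ne^b from a synthetic analysis database
# by least squares in log space.
import numpy as np

rng = np.random.default_rng(1)
n_samples = 200
te = rng.uniform(1.0, 5.0, n_samples)   # temperature (keV), assumed range
ne = rng.uniform(0.5, 3.0, n_samples)   # density (1e19 m^-3), assumed range

# Synthetic "truth": chi = 0.8 * Te^1.5 * ne^-0.5, with lognormal scatter
chi = 0.8 * te**1.5 * ne**-0.5 * rng.lognormal(0.0, 0.1, n_samples)

# Fit log(chi) = log C + a*log Te + b*log ne by ordinary least squares
X = np.column_stack([np.ones(n_samples), np.log(te), np.log(ne)])
coef, *_ = np.linalg.lstsq(X, np.log(chi), rcond=None)
logC, a, b = coef
print(f"chi ~ {np.exp(logC):.2f} * Te^{a:.2f} * ne^{b:.2f}")
```

Because the fit acts on local quantities rather than a single global scaling, it can use the radial-profile information that conventional τ_E scaling laws discard.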

  1. Assessment of the TRINO reactor pressure vessel integrity: theoretical analysis and NDE

    Energy Technology Data Exchange (ETDEWEB)

    Milella, P P; Pini, A [ENEA, Rome (Italy)

    1988-12-31

    This document presents the method used for the capability assessment of the Trino reactor pressure vessel. The vessel integrity assessment is divided into the following parts: transient evaluation and selection, fluence estimate for the projected end of life of the vessel, characterization of unirradiated and irradiated materials, thermal and stress analysis, fracture mechanics analysis and, finally, fracture mechanics input to Non-Destructive Examination (NDE). For each part, results are provided. (TEC).

  2. Sensitivity Analysis Based on Markovian Integration by Parts Formula

    Directory of Open Access Journals (Sweden)

    Yongsheng Hang

    2017-10-01

    Full Text Available Sensitivity analysis is widely applied in financial risk management and engineering; it describes the variations brought about by changes of parameters. Since the integration-by-parts technique for Markov chains has been well developed in recent years, in this paper we apply it to the computation of sensitivities and present closed-form expressions for two commonly used continuous-time Markovian models. By comparison, we conclude that our approach outperforms the existing technique for computing sensitivities on Markovian models.
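The paper's closed-form integration-by-parts sensitivities are not reproduced here; the following hedged Python sketch only shows the quantity being targeted, the derivative of a functional of a continuous-time Markov chain with respect to a rate parameter, for a two-state chain where the analytic answer is simple enough to check a finite-difference estimate against.

```python
# Sensitivity of the stationary probability of a 2-state CTMC
# with rates lam (0->1) and mu (1->0) w.r.t. lam.
# pi_0 = mu / (lam + mu), so d(pi_0)/d(lam) = -mu / (lam + mu)^2.

def stationary_prob_state0(lam, mu):
    """Stationary probability of state 0."""
    return mu / (lam + mu)

def sensitivity_analytic(lam, mu):
    """Exact derivative of pi_0 with respect to lam."""
    return -mu / (lam + mu) ** 2

def sensitivity_fd(lam, mu, h=1e-6):
    """Central finite-difference estimate of the same derivative."""
    return (stationary_prob_state0(lam + h, mu)
            - stationary_prob_state0(lam - h, mu)) / (2 * h)

lam, mu = 1.5, 2.0
print(sensitivity_analytic(lam, mu), sensitivity_fd(lam, mu))
```

Closed-form expressions of the kind the paper derives remove the step-size error and repeated evaluations that the finite-difference baseline incurs.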

  3. Analysis of integrated energy systems

    International Nuclear Information System (INIS)

    Matsuhashi, Takaharu; Kaya, Yoichi; Komiyama, Hiroshi; Hayashi, Taketo; Yasukawa, Shigeru.

    1988-01-01

    World attention is now attracted to the concept of the Novel Horizontally Integrated Energy System (NHIES). In NHIES, all fossil fuels are first converted into CO and H₂. Potential environmental contaminants such as sulfur are removed during this process. CO turbines are mainly used to generate electric power. Combustion is performed in pure oxygen produced through air separation, making it possible to completely prevent the formation of thermal NOx. Thus, NHIES would release very small amounts of the substances that contribute to acid rain. In this system, the intermediate energy sources CO, H₂ and O₂ are integrated horizontally. They are combined appropriately to produce a specific form of final energy source. The integration of intermediate energy sources can provide a wide variety of final energy sources, allowing any type of fossil fuel to serve as an alternative to any other. Another feature of NHIES is the positive use of nuclear fuel to reduce the formation of CO₂. Studies are under way in Japan to develop a new concept of integrated energy system. These studies are especially aimed at improved overall efficiency and the introduction of new liquid fuels with high conversion efficiency. Consideration is given to the final form of energy source, robust control, acid fallout, and CO₂ reduction. (Nogami, K.)

  4. Structural integrity analysis of an Ignalina nuclear power plant building subjected to an airplane crash

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Kulak, Ronald F.; Marchertas, Algirdas; Uspuras, Eugenijus

    2007-01-01

    Recent terrorist attacks using commandeered commercial airliners on civil structures have raised the issue of the ability of nuclear power plants to survive the consequences of an airliner crash. The structural integrity analysis due to the effects of an aircraft crash on an Ignalina nuclear power plant (INPP) accident localization system (ALS) building is the subject of this paper. A combination of the finite element method and empirical relationships were used for the analysis. A global structural integrity analysis was performed for a portion of the ALS building using the dynamic loading from an aircraft crash impact model. The local effects caused by impact of the aircraft's engine on the building wall were evaluated independently by using an empirical formula. The results from the crash analysis of a twin engine commercial aircraft show that the impacted reinforced concrete wall of the ALS building will not have through-the-wall concrete failure, and the reinforcement will not fail. Strain-rate effects were found to delay the onset of cracking. Therefore, the structural integrity of the impacted wall of the INPP ALS building will be maintained during the crash event studied.

  5. Structural integrity analysis of an Ignalina nuclear power plant building subjected to an airplane crash

    Energy Technology Data Exchange (ETDEWEB)

    Dundulis, Gintautas [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, 3 Breslaujos, 44403 Kaunas-35 (Lithuania)]. E-mail: gintas@isag.lei.lt; Kulak, Ronald F. [RFK Engineering Mechanics Consultants (United States); Marchertas, Algirdas [Northern Illinois University (United States); Uspuras, Eugenijus [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, 3 Breslaujos, 44403 Kaunas-35 (Lithuania)

    2007-08-15

    Recent terrorist attacks using commandeered commercial airliners on civil structures have raised the issue of the ability of nuclear power plants to survive the consequences of an airliner crash. The structural integrity analysis due to the effects of an aircraft crash on an Ignalina nuclear power plant (INPP) accident localization system (ALS) building is the subject of this paper. A combination of the finite element method and empirical relationships were used for the analysis. A global structural integrity analysis was performed for a portion of the ALS building using the dynamic loading from an aircraft crash impact model. The local effects caused by impact of the aircraft's engine on the building wall were evaluated independently by using an empirical formula. The results from the crash analysis of a twin engine commercial aircraft show that the impacted reinforced concrete wall of the ALS building will not have through-the-wall concrete failure, and the reinforcement will not fail. Strain-rate effects were found to delay the onset of cracking. Therefore, the structural integrity of the impacted wall of the INPP ALS building will be maintained during the crash event studied.

  6. International Energy Agency (IEA) Greenhouse Gas (GHG) Weyburn-Midale CO₂ Monitoring and Storage Project

    Energy Technology Data Exchange (ETDEWEB)

    Sacuta, Norm [Petroleum Technology Research Centre Incorporated, Saskatchewan (Canada); Young, Aleana [Petroleum Technology Research Centre Incorporated, Saskatchewan (Canada); Worth, Kyle [Petroleum Technology Research Centre Incorporated, Saskatchewan (Canada)

    2015-12-22

    The IEAGHG Weyburn-Midale CO₂ Monitoring and Storage Project (WMP) began in 2000 with the first four years of research that confirmed the suitability of the containment complex of the Weyburn oil field in southeastern Saskatchewan as a storage location for CO₂ injected as part of enhanced oil recovery (EOR) operations. The first half of this report covers research conducted from 2010 to 2012, under the funding of the United States Department of Energy (contract DEFE0002697), the Government of Canada, and various other governmental and industry sponsors. The work includes more in-depth analysis of various components of a measurement, monitoring and verification (MMV) program through investigation of data on site characterization and geological integrity, wellbore integrity, storage monitoring (geophysical and geochemical), and performance/risk assessment. These results then led to the development of a Best Practices Manual (BPM) providing oilfield and project operators with guidance on CO₂ storage and CO₂-EOR. In 2013, the USDOE and Government of Saskatchewan exercised an optional phase of the same project to further develop and deploy applied research tools, technologies, and methodologies to the data and research at Weyburn with the aim of assisting regulators and operators in transitioning CO₂-EOR operations into permanent storage. This work, detailed in the second half of this report, involves seven targeted research projects – evaluating the minimum dataset for confirming secure storage; additional overburden monitoring; passive seismic monitoring; history-matched modelling; developing proper wellbore design; casing corrosion evaluation; and assessment of post CO₂-injected core samples. The results from the final and optional phases of the Weyburn-Midale Project confirm the suitability of CO₂-EOR fields for the injection of CO₂, and further, highlight the necessary MMV and follow-up monitoring required for these operations to be considered

  7. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    Parzyck, D.C.

    1980-01-01

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated on an evaluation of impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impacts. It is planned that an integrated assessment team examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environmental control costs/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  8. Integrative biological analysis for neuropsychopharmacology.

    Science.gov (United States)

    Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L

    2014-01-01

    Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches (proteomics, transcriptomics, metabolomics, and glycomics) have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.

  9. Microfluidic device for continuous single cells analysis via Raman spectroscopy enhanced by integrated plasmonic nanodimers

    DEFF Research Database (Denmark)

    Perozziello, Gerardo; Candeloro, Patrizio; De Grazia, Antonio

    2016-01-01

    In this work a Raman flow cytometer is presented. It consists of a microfluidic device that takes advantage of the basic principles of Raman spectroscopy and flow cytometry. The microfluidic device integrates calibrated microfluidic channels, where the cells can flow one by one, allowing single-cell Raman analysis. The microfluidic channel integrates plasmonic nanodimers in a fluidic trapping region. In this way it is possible to perform enhanced Raman spectroscopy on single cells. These allow a label-free analysis, providing information about the biochemical content of membrane and cytoplasm...

  10. Predicting long-term performance of engineered geologic carbon dioxide storage systems to inform decisions amidst uncertainty

    Science.gov (United States)

    Pawar, R.

    2016-12-01

    Risk assessment and risk management of engineered geologic CO2 storage systems is an area of active investigation. The potential geologic CO2 storage systems currently under consideration are inherently heterogeneous and have limited to no characterization data. Effective risk management decisions to ensure safe, long-term CO2 storage require assessing and quantifying risks while taking into account the uncertainties in a storage site's characteristics. The key decisions are typically related to the definition of the area of review, an effective monitoring strategy and monitoring duration, the potential for leakage and associated impacts, etc. A quantitative methodology for predicting a sequestration site's long-term performance is critical for making the key decisions necessary for successful deployment of commercial-scale geologic storage projects, where projects will require quantitative assessments of potential long-term liabilities. An integrated assessment modeling (IAM) paradigm, which treats a geologic CO2 storage site as a system made up of various linked subsystems, can be used to predict long-term performance. The subsystems include the storage reservoir, seals, potential leakage pathways (such as wellbores and natural fractures/faults) and receptors (such as shallow groundwater aquifers). CO2 movement within each of the subsystems and the resulting interactions are captured through reduced order models (ROMs). The ROMs capture the complex physical/chemical interactions resulting from CO2 movement but are computationally extremely efficient. The computational efficiency allows for performing the Monte Carlo simulations necessary for quantitative probabilistic risk assessment. We have used the IAM to predict the long-term performance of geologic CO2 sequestration systems and to answer questions related to the probability of leakage of CO2 through wellbores, the impact of CO2/brine leakage into shallow aquifers, etc. Answers to such questions are critical in making key risk management decisions.
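The Monte Carlo loop the abstract describes can be sketched as follows. This is a hedged illustration only: the logistic form of the reduced-order model, the parameter distributions and the leakage threshold are all invented for the sketch; real ROMs are fitted to detailed reservoir simulations.

```python
# Hedged sketch of an IAM Monte Carlo loop: a cheap reduced-order model (ROM)
# maps uncertain wellbore/site parameters to a leakage rate, and repeated
# sampling estimates the probability of exceeding a threshold.
import math
import random

def leakage_rate_rom(perm_log10, pressure_mpa, cement_quality):
    """Assumed ROM: effective CO2 leakage rate (t/yr) through a legacy wellbore."""
    score = 1.2 * perm_log10 + 0.15 * pressure_mpa - 3.0 * cement_quality
    return 10.0 / (1.0 + math.exp(-score))  # bounded, monotone response

random.seed(42)
threshold = 5.0      # t/yr, assumed significance level
n = 100_000
exceed = 0
for _ in range(n):
    perm = random.gauss(-1.0, 0.8)         # log10 wellbore permeability (uncertain)
    pressure = random.uniform(10.0, 25.0)  # reservoir overpressure (MPa)
    cement = random.uniform(0.0, 1.0)      # cement quality index, 1 = intact
    if leakage_rate_rom(perm, pressure, cement) > threshold:
        exceed += 1

print(f"P(leakage > {threshold} t/yr) estimate: {exceed / n:.3f}")
```

The point of the ROM is speed: a full reservoir simulation per sample would make a 100,000-sample Monte Carlo study infeasible, while a fitted surrogate evaluates in microseconds.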

  11. Development of safety analysis technology for integral reactor

    International Nuclear Information System (INIS)

    Kim, Hee Cheol; Kim, K. K.; Kim, S. H.

    2002-04-01

    A state-of-the-art review of integral reactors was performed to investigate their safety features. The safety and performance of SMART were assessed using the technologies developed during the study. For this purpose, the computer code system and the analysis methodology were developed, and safety and performance analyses of the SMART basic design were carried out for the design basis events and accidents. Experimental facilities were designed for the core flow distribution test and the self-pressurizing pressurizer performance test. Tests on two-phase critical flow with non-condensable gas were completed, and the results were used to assess the critical flow model. A Probabilistic Safety Assessment (PSA) was carried out to evaluate the safety level and to optimize the design by identifying and remedying any weaknesses in the design. A joint study with KINS was carried out to promote the licensing environment. The generic safety issues of integral reactors were identified and solutions were formulated. The economic evaluation of the SMART desalination plant and activities related to process control were carried out within the scope of the study.

  12. Integration of thermodynamic insights and MINLP optimisation for the synthesis, design and analysis of process flowsheets

    DEFF Research Database (Denmark)

    Hostrup, Martin; Gani, Rafiqul; Kravanja, Zdravko

    1999-01-01

    This paper presents an integrated approach to the solution of process synthesis, design and analysis problems. Integration is achieved by combining two different techniques, synthesis based on thermodynamic insights and structural optimization together with a simulation engine and a properties pr...

  13. Noise analysis of switched integrator preamplifiers

    International Nuclear Information System (INIS)

    Sun Hongbo; Li Yulan; Zhu Weibin

    2004-01-01

    The main noise sources of switched integrator preamplifiers are discussed, and their noise performance is evaluated by combining PSpice simulation with experiments. Then, some practical methods for reducing preamplifier noise in two different integrator modes are provided. (authors)

  14. Flat-plate solar array project. Volume 8: Project analysis and integration

    Science.gov (United States)

    Mcguire, P.; Henry, P.

    1986-01-01

    Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision-aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or formed the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and important analytical models developed by PA&I for its analysis and assessment activities are reviewed.

  15. Integral Hellmann-Feynman analysis of nonisoelectronic processes and the determination of local ionization potentials

    International Nuclear Information System (INIS)

    Simons, G.

    1975-01-01

    The integral Hellmann-Feynman theorem is extended to apply to nonisoelectronic processes. A local ionization potential formula is proposed, and test calculations on three different approximate helium wavefunctions are reported which suggest that it may be numerically superior to the standard difference of expectation values. Arguments for the physical utility of the new concept are presented, and an integral Hellmann-Feynman analysis of transition energies is begun.
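For context, the standard integral Hellmann-Feynman theorem (in Parr's form) expresses a transition energy directly as a transition-density-weighted matrix element of the Hamiltonian difference rather than as a difference of two separately computed expectation values; the abstract's extension carries this idea to processes where the electron number changes:

```latex
\Delta E \;=\; E_B - E_A
        \;=\; \frac{\langle \psi_A \,|\, \hat{H}_B - \hat{H}_A \,|\, \psi_B \rangle}
                   {\langle \psi_A \,|\, \psi_B \rangle}
```

Because the perturbation operator appears once inside a single matrix element, errors in the two approximate wavefunctions do not enter as a difference of two large numbers, which is the numerical advantage the abstract reports.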

  16. Experimental study on hydration damage mechanism of shale from the Longmaxi Formation in southern Sichuan Basin, China

    Directory of Open Access Journals (Sweden)

    Xiangjun Liu

    2016-03-01

    Full Text Available As a serious problem in drilling operations, wellbore instability restricts the efficient development of shale gas. The interaction between drilling fluid and shale with hydration-swelling properties affects the generation and propagation of cracks in the shale formation, leading to wellbore instability. In order to investigate the influence of hydration swelling on crack propagation, the mineral components and physicochemical properties of shale from the Lower Silurian Longmaxi Formation (LF) were investigated using XRD analysis, cation exchange capacity (CEC) analysis, and SEM observation, and the hydration mechanism of LF shale was studied. Results show that quartz and clay minerals dominate the mineral composition, with illite averaging 67% of the clay fraction. Meanwhile, the CEC of the LF shale is 94.4 mmol/kg. The process of water intruding into the shale along microcracks could be observed through a high-power microscope; meanwhile, the hydration swelling stress concentrates at the crack tips. The microcracks propagate, bifurcate and connect with each other as water immersion time increases, ultimately developing into macro-fractures. Moreover, the macrocracks extend and coalesce along the bedding, causing the rock to break into blocks. Hydration swelling is one of the major causes of wellbore instability in the LF shale; therefore, improving the sealing capacity and inhibition of the drilling fluid system is an effective measure to stabilize the borehole.

  17. Photometric method for determination of acidity constants through integral spectra analysis

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-01

    An express method for the determination of acidity constants of organic acids, based on the analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple and low-cost instrumentation. The optical part of the experimental setup has been simplified by excluding the monochromator. Thus it takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. The application limits and reliability of the method have been tested for a series of organic acids of various natures.
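The underlying idea, that the transmittance vs. pH curve of an acid follows a sigmoid set by Henderson-Hasselbalch speciation, so pKa can be recovered by fitting that sigmoid, can be sketched numerically. This is a hedged illustration: the synthetic data, the two-form transmittance model and the simple grid-search fit stand in for the paper's instrument and fitting procedure.

```python
# Hedged sketch: recover pKa by fitting a speciation sigmoid to a synthetic
# transmittance-vs-pH titration curve.
import numpy as np

def transmittance(ph, pka, t_acid=0.20, t_base=0.80):
    """Mix the transmittances of the acid and base forms by their speciation."""
    frac_base = 1.0 / (1.0 + 10.0 ** (pka - ph))   # Henderson-Hasselbalch
    return t_acid + (t_base - t_acid) * frac_base

# Synthetic titration of an acid with true pKa = 4.76 (acetic-acid-like), plus noise
rng = np.random.default_rng(7)
ph = np.linspace(2.5, 7.5, 40)
obs = transmittance(ph, 4.76) + rng.normal(0.0, 0.005, ph.size)

# Grid-search least-squares fit of pKa
grid = np.arange(3.0, 6.5, 0.01)
sse = [np.sum((obs - transmittance(ph, pka)) ** 2) for pka in grid]
pka_fit = grid[int(np.argmin(sse))]
print(f"fitted pKa = {pka_fit:.2f}")
```

Measuring an integral (wavelength-summed) transmittance rather than a single-wavelength absorbance is what lets the method drop the monochromator: the sigmoid's inflection point still sits at pH = pKa.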

  18. Integrated Data Collection Analysis (IDCA) Program - RDX Standard Data Set 2

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Air Force Research Lab. (AFRL), Tyndall Air Force Base, FL (United States); Shelley, Timothy J. [Applied Research Associates, Tyndall Air Force Base, FL (United States); Reyes, Jose A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-02-20

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry (DSC) analysis of the RDX Type II Class 5 standard from the second round of the Proficiency Test. Compared with the first round (Set 1), this RDX testing (Set 2) was found to have about the same impact sensitivity, greater BAM friction sensitivity, lower ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity.

  19. Reservoir characterization and final pre-test analysis in support of the compressed-air-energy-storage Pittsfield aquifer field test in Pike County, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, L.E.; McCann, R.A.

    1983-06-01

    The work reported is part of a field experimental program to demonstrate and evaluate compressed-air energy storage in a porous-media aquifer reservoir near Pittsfield, Illinois. The reservoir is described. Numerical modeling of the reservoir was performed concurrently with site development. The numerical models were applied to predict the thermohydraulic performance of the porous-media reservoir. This reservoir characterization and pre-test analysis included evaluation of bubble development, water coning, thermal development, and near-wellbore desaturation. The work was undertaken to define the time required to develop an air-storage bubble of adequate size, to assess the specification of instrumentation and above-ground equipment, and to develop and evaluate operational strategies for air cycling. A parametric analysis was performed for the field-test reservoir. (LEW)

  20. Development of Integrated Flood Analysis System for Improving Flood Mitigation Capabilities in Korea

    Science.gov (United States)

    Moon, Young-Il; Kim, Jong-suk

    2016-04-01

Recently, people's need for a safer life and a homeland secure from unexpected natural disasters has been growing. Flood damage has been recorded every year in Korea, exceeding an annual average of 2 trillion won since 2000, and casualties and property damage from flooding caused by hydrometeorological extremes have increased with climate change. Although the importance of flood management is rapidly emerging, studies on developing an integrated management system for reducing floods are insufficient in Korea. In addition, it is difficult to reduce floods effectively without an integrated operation system that accounts for the sewage pipe network configuration together with river levels. Since floods cause increasing damage to infrastructure as well as to life and property, structural and non-structural measures should be urgently established to reduce flooding effectively. Therefore, in this study, we developed an integrated flood analysis system that systematizes technology to quantify flood risk and to forecast floods, supporting synthetic decision-making through real-time monitoring and prediction of flash rain or short-term rainfall using radar and satellite information in Korea. Keywords: Flooding, Integrated flood analysis system, Rainfall forecasting, Korea Acknowledgments This work was carried out with the support of "Cooperative Research Program for Agriculture Science & Technology Development (Project No. PJ011686022015)" Rural Development Administration, Republic of Korea

  1. Integrated omics analysis of specialized metabolism in medicinal plants.

    Science.gov (United States)

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important for identifying or developing reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  2. Integrative Analysis of Gene Expression Data Including an Assessment of Pathway Enrichment for Predicting Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Pingzhao Hu

    2006-01-01

Background: Microarray technology has been previously used to identify genes that are differentially expressed between tumour and normal samples in a single study, as well as in syntheses involving multiple studies. When integrating results from several Affymetrix microarray datasets, previous studies summarized probeset-level data, which may potentially lead to a loss of information available at the probe level. In this paper, we present an approach for integrating results across studies while taking probe-level data into account. Additionally, we follow a new direction in the analysis of microarray expression data, namely to focus on the variation of expression phenotypes in predefined gene sets, such as pathways. This targeted approach can be helpful for revealing information that is not easily visible from the changes in the individual genes. Results: We used a recently developed method to integrate Affymetrix expression data across studies. The idea is based on a probe-level test statistic developed for testing for differentially expressed genes in individual studies. We incorporated this test statistic into a classic random-effects model for integrating data across studies. Subsequently, we used a gene set enrichment test to evaluate the significance of enriched biological pathways in the differentially expressed genes identified from the integrative analysis. We compared the statistical and biological significance of the prognostic gene expression signatures and pathways identified in the probe-level model (PLM) with those in the probeset-level model (PSLM). Our integrative analysis of Affymetrix microarray data from 110 prostate cancer samples obtained from three studies reveals thousands of genes significantly correlated with tumour cell differentiation. The bioinformatics analysis, mapping these genes to the publicly available KEGG database, reveals evidence that tumour cell differentiation is significantly associated with many
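As a rough illustration of the classic random-effects integration mentioned in the abstract, a DerSimonian-Laird combination of per-study effect estimates for a single gene can be sketched in Python. This is not the authors' probe-level statistic, and all numbers are hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects (DerSimonian-Laird) combination of per-study
    effect estimates, e.g. one gene's log-fold changes across studies."""
    w = [1.0 / v for v in variances]          # fixed-effect (inverse-variance) weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight, now including the between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical per-study estimates (effect, sampling variance) for one gene
pooled, se, tau2 = dersimonian_laird([0.5, 0.7, 0.6], [0.01, 0.02, 0.015])
```

With these homogeneous inputs the heterogeneity estimate collapses to zero and the pooled value reduces to the fixed-effect average; real probe-level statistics would feed the same machinery.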

  3. Implementation of the structural integrity analysis for PWR primary components and piping

    International Nuclear Information System (INIS)

    Pellissier-Tanon, A.

    1982-01-01

Trends in the definition, assessment, and application of fracture-strength evaluation methodology, which have arisen through experience in the design, construction, and operation of French 900-MW plants, are reviewed. The main features of the methodology proposed in a draft of Appendix ZG of the RCC-M code of practice for the design verification of fracture strength of primary components are presented. The research programs are surveyed and discussed from four viewpoints: first, implementation of the LEFM analysis; second, implementation of the fatigue crack propagation analysis; third, analysis of vessel integrity during emergency core cooling; and fourth, methodology for tear fracture analysis. (author)

  4. An Integrated Approach of Model checking and Temporal Fault Tree for System Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2009-10-15

Digitalization of instrumentation and control systems in nuclear power plants offers the potential to improve plant safety and reliability through features such as increased hardware reliability and stability, and improved failure detection capability. However, it also makes the systems and their safety analysis more complex. Originally, safety analysis was applied to hardware system components and formal methods mainly to software; for software-controlled or digitalized systems, the two must be integrated. Fault tree analysis (FTA), one of the most widely used safety analysis techniques in the nuclear industry, suffers from several drawbacks. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated, and qualitative assistance to informal and/or quantitative safety analysis. Our approach builds a formal model of the system together with its fault trees. We introduce several temporal gates based on timed computational tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use model checking to automate the reasoning process of FTA.
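For readers unfamiliar with FTA, a minimal static fault-tree evaluation (AND/OR gates only, independent basic events) can be sketched as follows; the temporal TCTL-based gates described in the abstract are beyond this illustration, and all event names and probabilities are hypothetical:

```python
# Static fault-tree evaluation with AND/OR gates and independent
# basic events (hypothetical event names and probabilities).

def AND(*children):
    return ("AND", children)

def OR(*children):
    return ("OR", children)

def probability(node, basic):
    """Probability of the event represented by `node`."""
    if isinstance(node, str):          # basic event: look up its probability
        return basic[node]
    kind, children = node
    if kind == "AND":                  # all children must occur
        p = 1.0
        for c in children:
            p *= probability(c, basic)
        return p
    # OR of independent events: 1 - prod(1 - p_i)
    q = 1.0
    for c in children:
        q *= 1.0 - probability(c, basic)
    return 1.0 - q

top = AND(OR("sensor_fail", "software_fault"), "power_loss")
p_top = probability(top, {"sensor_fail": 0.01,
                          "software_fault": 0.02,
                          "power_loss": 0.001})
```

Model checking, in the integrated approach the abstract describes, would replace this purely probabilistic evaluation with automated reasoning over the timed behavior of the formal system model.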

  5. Integration of hydrothermal carbonization and a CHP plant: Part 2 –operational and economic analysis

    International Nuclear Information System (INIS)

    Saari, Jussi; Sermyagina, Ekaterina; Kaikko, Juha; Vakkilainen, Esa; Sergeev, Vitaly

    2016-01-01

Wood-fired combined heat and power (CHP) plants are a proven technology for producing domestic, carbon-neutral heat and power in Nordic countries. One drawback of CHP plants is the low capacity factors due to varying heat loads. In the current economic environment, uncertainty over energy prices also creates uncertainty over investment profitability. Hydrothermal carbonization (HTC) is a promising thermochemical conversion technology for producing an improved, more versatile wood-based fuel. Integrating HTC with a CHP plant allows simplifying the HTC process and extending the CHP plant operating time. An integrated polygeneration plant producing three energy products is also less sensitive to price changes in any one product. This study compares three integration cases chosen from the previous paper, and the case of separate stand-alone plants. The best economic performance is obtained using pressurized hot water from the CHP plant boiler drum as HTC process water. This has the poorest efficiency, but allows the greatest cost reduction in the HTC process and the longest CHP plant operating time. The result demonstrates the suitability of CHP plants for integration with an HTC process, and the importance of economic and operational analysis considering annual load variations in sufficient detail. - Highlights: • Integration of wood hydrothermal carbonization with a small CHP plant studied. • Operation and economics of three concepts and stand-alone plants are compared. • Sensitivity analysis is performed. • Results show technical and thermodynamic analysis inadequate and misleading alone. • Minimizing HTC investment, extending CHP operating time important for profitability.

  6. Integrated analysis of wind turbines - The impact of power systems on wind turbine design

    DEFF Research Database (Denmark)

    Barahona Garzón, Braulio

    Megawatt-size wind turbines nowadays operate in very complex environmental conditions, and increasingly demanding power system requirements. Pursuing a cost-effective and reliable wind turbine design is a multidisciplinary task. However nowadays, wind turbine design and research areas...... conditions that stem from disturbances in the power system. An integrated simulation environment, wind turbine models, and power system models are developed in order to take an integral perspective that considers the most important aeroelastic, structural, electrical, and control dynamics. Applications...... of the integrated simulation environment are presented. The analysis of an asynchronous machine, and numerical simulations of a fixedspeed wind turbine in the integrated simulation environment, demonstrate the effects on structural loads of including the generator rotor fluxes dynamics in aeroelastic studies. Power...

  7. A severe accident analysis for the system-integrated modular advanced reactor

    International Nuclear Information System (INIS)

    Jung, Gunhyo; Jae, Moosung

    2015-01-01

The System-Integrated Modular Advanced Reactor (SMART), recently designed in Korea and granted standard design certification by the nuclear regulatory body (NSSC), is an integral-type reactor with 330 MW thermal power. It is a small reactor in which the core, steam generator, pressurizer, and reactor coolant pump, which are separate components in existing pressurized light water reactors, are contained within a single pressure vessel without any connecting piping. In addition, this reactor has design characteristics quite different from existing pressurized light water reactors, such as the adoption of a passive residual heat removal system and a cavity flooding system. Therefore, the safety of the SMART against severe accidents should be checked through severe accident analysis reflecting its design characteristics. For this purpose, an analysis model has been developed reflecting the design information presented in the standard design safety analysis report. The severe accident analysis model has been developed using the MELCOR code, which is widely used to evaluate severe accidents in pressurized LWRs, and the steady state of the SMART has been simulated. According to the analysis results, the developed model reflecting the design of the SMART is found to be appropriate. Severe accident analysis has been performed for the representative accident scenarios that lead to core damage to check the appropriateness of the severe accident management plan for the SMART. The SMART has been shown to be safe enough to prevent severe accidents by utilizing severe accident management systems such as a containment spray system, a passive hydrogen recombiner, and a cavity flooding system. In addition, the SMART is judged to be remarkably improved technically compared with existing PWRs. The SMART has been designed to have a larger reactor coolant inventory compared to its core's thermal power, a large surface area in

  8. Analysis on working pressure selection of ACME integral test facility

    International Nuclear Information System (INIS)

    Chen Lian; Chang Huajian; Li Yuquan; Ye Zishen; Qin Benke

    2011-01-01

An integral effects test facility, the Advanced Core cooling Mechanism Experiment (ACME) facility, was designed to verify the performance of the passive safety system and to validate the safety analysis codes of a pressurized water reactor power plant. Three test facilities for the AP1000 design were introduced and reviewed, and the problems resulting from the different working pressures of those facilities were analyzed. A detailed description is then given of the working pressure selection for the ACME facility and its characteristics, and the approach to establishing the desired initial test conditions is discussed. The selected working pressure of 9.3 MPa covers almost all important passive safety system operating conditions, enabling the ACME to simulate LOCAs with the same pressure and property similitude as the prototype. The ACME is expected to be an advanced core cooling integral test facility design. (authors)

  9. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    Science.gov (United States)

    Yang, F.; Dong, Z. H.

    2017-10-01

The integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software that provide various services such as email, short messages, and drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during setup, configuration, and operation, which seriously affect usage. Because these errors are varied and may occur in different operation phases, stages, TCP/IP protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, providing strong theoretical and technological support for the running and communication of IIDS.

  10. Setting plug & abandonment barriers with minimum removal of tubulars

    OpenAIRE

    Nessa, Jon Olav

    2012-01-01

Master's thesis in Petroleum Engineering. The useful life of an offshore well is determined by the reserves which it contacts, the pressure support within the reservoir and the continued integrity of the wellbore. When a well has reached the end of its lifetime, plugging operations have to be conducted before permanent abandonment. Conventional Plug and Abandonment (P&A) operations will often require removing a section of the casing in order to create cross sectional barriers for well aband...

  11. An integrated safety analysis of intravenous ibuprofen (Caldolor®) in adults

    Directory of Open Access Journals (Sweden)

    Southworth SR

    2015-10-01

Stephen R Southworth,1 Emily J Woodward,2 Alex Peng,2 Amy D Rock2; 1North Mississippi Sports Medicine and Orthopaedic Clinic, PLLC, Tupelo, MS; 2Department of Research and Development, Cumberland Pharmaceuticals Inc., Nashville, TN, USA. Abstract: Intravenous (IV) nonsteroidal anti-inflammatory drugs such as IV ibuprofen are increasingly used as a component of multimodal pain management in the inpatient and outpatient settings. The safety of IV ibuprofen as assessed in ten sponsored clinical studies is presented in this analysis. Overall, 1,752 adult patients have been included in safety and efficacy trials over 11 years; 1,220 of these patients received IV ibuprofen and 532 received either placebo or a comparator medication. The incidence of adverse events (AEs), serious AEs, and changes in vital signs and clinically significant laboratory parameters have been summarized and compared to patients receiving placebo or an active comparator drug. Overall, IV ibuprofen has been well tolerated by hospitalized and outpatient patients when administered both prior to surgery and postoperatively, as well as for nonsurgical pain or fever. The overall incidence of AEs is lower in patients receiving IV ibuprofen than in those receiving placebo in this integrated analysis. Specific analysis of hematological and renal effects showed no increased risk for patients receiving IV ibuprofen. A subset analysis of elderly patients suggests that no dose adjustment is needed in this higher-risk population. This integrated safety analysis demonstrates that IV ibuprofen can be safely administered prior to surgery and continued in the postoperative period as a component of multimodal pain management. Keywords: NSAID, surgical pain, fever, perioperative analgesia, critical care, multimodal pain management

  12. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
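The keyword-based set analysis described above can be caricatured in a few lines: invert a protein-to-keyword mapping and intersect the resulting sets. This is toy data for illustration only; PANDORA's actual annotation sources and graphical output are far richer:

```python
from collections import defaultdict

# Hypothetical protein -> annotation-keyword mapping
annotations = {
    "P1": {"kinase", "membrane"},
    "P2": {"kinase", "nucleus"},
    "P3": {"membrane"},
    "P4": {"kinase", "membrane"},
}

def keyword_sets(annotations):
    """Invert the mapping: keyword -> set of proteins carrying it."""
    sets = defaultdict(set)
    for protein, keywords in annotations.items():
        for kw in keywords:
            sets[kw].add(protein)
    return dict(sets)

sets = keyword_sets(annotations)
# Proteins sharing both properties: the intersection of two keyword sets
kinase_and_membrane = sets["kinase"] & sets["membrane"]
```

Detecting such intersections across many annotation sources at once is what makes the graphical, set-oriented view useful for large protein collections.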

  13. Integrated torrefaction vs. external torrefaction - A thermodynamic analysis for the case of a thermochemical biorefinery

    DEFF Research Database (Denmark)

    Clausen, Lasse Røngaard

    2014-01-01

    Integrated and external torrefaction is analyzed and compared via thermodynamic modeling. In this paper, integrated torrefaction is defined as torrefaction integrated with entrained flow gasification. External torrefaction is defined as the decentralized production of torrefied wood pellets...... and centralized conversion of the pellets by entrained flow gasification. First, the syngas production of the two methods was compared. Second, the two methods were compared by considering complete biorefineries with either integrated torrefaction or external torrefaction. The first part of the analysis showed...... from external torrefaction to integrated torrefaction. The costs of this increase in energy efficiency are as follows: 1) more difficult transport, storage and handling of the biomass feedstock (wood chips vs. torrefied wood pellets); 2) reduced plant size; 3) no net electricity production; and 4...

  14. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, tested for a third and fourth time in the Proficiency Test and averaged with the analysis results from the first and second times. The results from averaging all four data sets (1, 2, 3, and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC behavior, compared with the results from Set 1, which was previously used as the values for the RDX standard in IDCA analysis reports.

  15. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  16. Integrated torrefaction vs. external torrefaction – A thermodynamic analysis for the case of a thermochemical biorefinery

    International Nuclear Information System (INIS)

    Clausen, Lasse R.

    2014-01-01

Integrated and external torrefaction is analyzed and compared via thermodynamic modeling. In this paper, integrated torrefaction is defined as torrefaction integrated with entrained flow gasification. External torrefaction is defined as the decentralized production of torrefied wood pellets and centralized conversion of the pellets by entrained flow gasification. First, the syngas production of the two methods was compared. Second, the two methods were compared by considering complete biorefineries with either integrated torrefaction or external torrefaction. The first part of the analysis showed that the biomass to syngas efficiency can be increased from 63% to 86% (LHV-dry) when switching from external torrefaction to integrated torrefaction. The second part of the analysis showed that the total energy efficiency (biomass to methanol + net electricity) could be increased from 53% to 63% when switching from external torrefaction to integrated torrefaction. The costs of this increase in energy efficiency are as follows: 1) more difficult transport, storage and handling of the biomass feedstock (wood chips vs. torrefied wood pellets); 2) reduced plant size; 3) no net electricity production; and 4) a more complex plant design. - Highlights: • Integrated torrefaction is compared with external torrefaction. • Biomass to syngas energy efficiencies of 63–86% are achieved. • Two thermochemical biorefineries are designed and analysed by thermodynamic modeling. • Biomass to fuel + electricity energy efficiencies of 53–63% are achieved. • The pros and cons of integrated torrefaction are described

  17. A collapse pressure prediction model for horizontal shale gas wells with multiple weak planes

    Directory of Open Access Journals (Sweden)

    Ping Chen

    2015-01-01

Since collapse of horizontal wellbores drilled through long brittle shale intervals is a major problem, the occurrence characteristics of weak planes were analyzed according to outcrop, core, SEM, and FMI data of shale rocks. A strength analysis method was developed for shale rocks with multiple weak planes based on weak-plane strength theory, and the strength characteristics of shale rocks with a uniform distribution of multiple weak planes were analyzed. A collapse pressure prediction model for horizontal wells in shale formations with multiple weak planes was established, which takes into consideration the occurrence of each weak plane, the wellbore stress condition, the borehole azimuth, and the in-situ stress azimuth. Finally, a case study of a horizontal shale gas well in the southern Sichuan Basin was conducted. The results show that the intersection angle between the shale bedding plane and the structural fractures is generally large (nearly orthogonal); with an increasing number of weak planes, the strength of the rock mass declines sharply and is more heavily influenced by the weak planes; when there are more than four weak planes, the rock strength tends to be isotropic and the overall strength of the rock mass is greatly weakened, significantly increasing the risk of wellbore collapse. With an increasing number of weak planes, the drilling fluid density (collapse pressure) required to keep the borehole stable rises gradually: for instance, the collapse pressure is 1.04 g/cm3 with no weak planes, 1.55 g/cm3 with one weak plane, and 1.84 g/cm3 with two weak planes. The collapse pressure prediction model for horizontal wells proposed in this paper gave results in better agreement with actual field observations. This model, more accurate and practical than traditional models, can effectively improve the accuracy of collapse pressure prediction for horizontal shale gas wells.
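The weak-plane strength theory this abstract builds on can be illustrated with Jaeger's single-plane-of-weakness criterion, extended here to take the minimum over several planes. This is a textbook sketch, not the authors' full collapse pressure model, and all parameter values (cohesion, friction angle, stresses) are hypothetical:

```python
import math

def sigma1_weak_plane(sigma3, beta_deg, c_w, phi_w_deg):
    """Axial stress at which sliding occurs on a weak plane inclined at
    angle beta to the major principal stress (Jaeger's criterion).
    Returns infinity when slip on the plane is geometrically impossible."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_w_deg)
    denom = (1.0 - math.tan(phi) / math.tan(beta)) * math.sin(2.0 * beta)
    if denom <= 0.0:
        return math.inf
    return sigma3 + 2.0 * (c_w + sigma3 * math.tan(phi)) / denom

def rock_mass_strength(sigma3, planes, intact_sigma1):
    """Strength of a rock mass: intact strength capped by the weakest of
    several weak planes, each given as (beta_deg, c_w, phi_w_deg)."""
    s = intact_sigma1
    for beta_deg, c_w, phi_w_deg in planes:
        s = min(s, sigma1_weak_plane(sigma3, beta_deg, c_w, phi_w_deg))
    return s

# Two hypothetical weak planes (cohesion in MPa, angles in degrees),
# confining stress 10 MPa, intact rock strength 120 MPa
strength = rock_mass_strength(10.0, [(55.0, 5.0, 25.0), (40.0, 5.0, 25.0)], 120.0)
```

Taking the minimum over planes mirrors the abstract's observation that each additional unfavourably oriented weak plane lowers the rock-mass strength and hence raises the drilling fluid density needed to prevent collapse.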

  18. Estimation of oil reservoir thermal properties through temperature log data using inversion method

    International Nuclear Information System (INIS)

    Cheng, Wen-Long; Nian, Yong-Le; Li, Tong-Tong; Wang, Chang-Long

    2013-01-01

Oil reservoir thermal properties not only play an important role in steam injection well heat transfer, but are also basic parameters for evaluating the oil saturation in the reservoir. In this study, to estimate reservoir thermal properties, a novel heat and mass transfer model of a steam injection well was first established; this model fully analyzes wellbore-reservoir as well as wellbore-formation heat and mass transfer, and the results simulated by the model were quite consistent with the log data. The study then presents an effective inversion method for estimating the reservoir thermal properties from temperature log data. The method is based on the heat transfer model in steam injection wells and predicts the thermal properties using a stochastic approximation scheme. The inversion method was applied to estimate the reservoir thermal properties of two steam injection wells: the relative error of thermal conductivity for the two wells was 2.9% and 6.5%, and the relative error of volumetric specific heat capacity was 6.7% and 7.0%, demonstrating the feasibility of the proposed method for estimating the reservoir thermal properties. - Highlights: • An effective inversion method for predicting oil reservoir thermal properties is presented. • A novel model for steam injection wells fully treats wellbore-reservoir heat and mass transfer. • The wellbore temperature field and steam parameters can be simulated efficiently by the model. • Both reservoir and formation thermal properties can be estimated simultaneously by the proposed method. • The estimated steam temperature was quite consistent with the field data
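The idea of inverting thermal properties from a temperature log can be illustrated with a deliberately simplified forward model, steady 1-D radial conduction around the well, inverted by linear least squares. The paper's actual wellbore heat- and mass-transfer model and stochastic approximation scheme are far more complete; all numbers here are hypothetical:

```python
import math

def invert_conductivity(radii, temps, t_well, q_line, r_well):
    """Least-squares estimate of formation thermal conductivity k from a
    radial temperature profile, assuming steady 1-D radial conduction:
        T(r) = T_well - (q / (2*pi*k)) * ln(r / r_well)
    T is linear in x = q/(2*pi*k), so x has a closed-form fit."""
    num = den = 0.0
    for r, t in zip(radii, temps):
        u = math.log(r / r_well)
        num += u * (t_well - t)
        den += u * u
    x = num / den
    return q_line / (2.0 * math.pi * x)

# Synthetic "log" generated with k = 2.5 W/(m K), then inverted
k_true, q, rw, tw = 2.5, 60.0, 0.1, 80.0
radii = [0.2, 0.5, 1.0, 2.0]
temps = [tw - q / (2.0 * math.pi * k_true) * math.log(r / rw) for r in radii]
k_est = invert_conductivity(radii, temps, tw, q, rw)
```

With noise-free synthetic data the fit recovers the generating conductivity exactly; with real log data the same misfit-minimization idea is iterated against the full wellbore model, which is where the stochastic approximation in the paper comes in.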

  19. Migration in Deltas: An Integrated Analysis

    Science.gov (United States)

    Nicholls, Robert J.; Hutton, Craig W.; Lazar, Attila; Adger, W. Neil; Allan, Andrew; Arto, Inaki; Vincent, Katharine; Rahman, Munsur; Salehin, Mashfiqus; Sugata, Hazra; Ghosh, Tuhin; Codjoe, Sam; Appeaning-Addo, Kwasi

    2017-04-01

    Deltas and low-lying coastal regions have long been perceived as vulnerable to global sea-level rise, with the potential for mass displacement of exposed populations. The assumption of mass displacement of populations in deltas requires a comprehensive reassessment in the light of present and future migration in deltas, including the potential role of adaptation to influence these decisions. At present, deltas are subject to multiple drivers of environmental change and often have high population densities as they are accessible and productive ecosystems. Climate change, catchment management, subsidence and land cover change drive environmental change across all deltas. Populations in deltas are also highly mobile, with significant urbanization trends and the growth of large cities and mega-cities within or adjacent to deltas across Asia and Africa. Such migration is driven primarily by economic opportunity, yet environmental change in general, and climate change in particular, are likely to play an increasing direct and indirect role in future migration trends. The policy challenges centre on the role of migration within regional adaptation strategies to climate change; the protection of vulnerable populations; and the future of urban settlements within deltas. This paper reviews current knowledge on migration and adaptation to environmental change to discern specific issues pertinent to delta regions. It develops a new integrated methodology to assess present and future migration in deltas using the Volta delta in Ghana, Mahanadi delta in India and Ganges-Brahmaputra-Meghna delta across India and Bangladesh. The integrated method focuses on: biophysical changes and spatial distribution of vulnerability; demographic changes and migration decision-making using multiple methods and data; macro-economic trends and scenarios in the deltas; and the policies and governance structures that constrain and enable adaptation. The analysis is facilitated by a range of

  20. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    OpenAIRE

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality issue referred to the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  1. Integrated Reliability and Risk Analysis System (IRRAS), Version 2.5: Reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1991-03-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.5 and is the subject of this Reference Manual. Version 2.5 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 7 refs., 348 figs
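The cut set quantification step described above can be sketched compactly. The following is a minimal illustration of quantifying a fault tree's top event from its minimal cut sets under the rare-event approximation, as PRA tools in the IRRAS family do after cut set generation; it is not IRRAS code, and the event names and probabilities are hypothetical.

```python
def cut_set_probability(cut_set, basic_events):
    """Probability of one minimal cut set (independent basic events)."""
    p = 1.0
    for event in cut_set:
        p *= basic_events[event]
    return p

def top_event_probability(cut_sets, basic_events):
    """Rare-event approximation: sum of the minimal-cut-set probabilities."""
    return sum(cut_set_probability(cs, basic_events) for cs in cut_sets)

# Hypothetical two-train system: both pumps fail, or a common valve fails
basic_events = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4}
cut_sets = [{"pump_A", "pump_B"}, {"valve"}]
p_top = top_event_probability(cut_sets, basic_events)  # ~5.01e-04
```

The single-event cut set dominates the result, which is exactly the kind of insight cut set generation and quantification are meant to surface.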

  2. Integrated corridor management initiative : demonstration phase evaluation - Dallas technical capability analysis test plan.

    Science.gov (United States)

    This report presents the test plan for conducting the Technical Capability Analysis for the United States : Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor : Management (ICM) Initiative Demonstration. The ...

  3. SAP crm integration testing

    OpenAIRE

    Černiavskaitė, Marija

    2017-01-01

This Bachelor's thesis presents an analysis of SAP CRM and integration-systems testing: an investigation of the SAP CRM and SAP PO systems, a presentation of the relationship between them, an introduction to a third-party (non-SAP) system, the Network Informational System (NIS), which integrates with SAP, a presentation of best CRM testing practices, and an analysis of and recommendations for integration testing. Practical integration testing is then carried out in accordance with these recommendations.

  4. A comparison of response spectrum and direct integration analysis methods as applied to a nuclear component support structure

    International Nuclear Information System (INIS)

    Bryan, B.J.; Flanders, H.E. Jr.

    1992-01-01

Seismic qualification of Class I nuclear components is accomplished using a variety of analytical methods. This paper compares the results of dynamic analyses of a heat exchanger support structure performed by the response spectrum and time-history direct integration methods. Dynamic analysis is performed on detailed component models using the two methods. A nonlinear elastic model is used for both the response spectrum and direct integration methods. A nonlinear model, which includes friction and nonlinear springs, is analyzed by direct integration with time-history input. The loads from the three cases are compared
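The direct integration method compared above can be illustrated with a minimal sketch: Newmark-beta time integration (average-acceleration form, a standard textbook scheme for structural dynamics) of a single-degree-of-freedom oscillator. This is a generic illustration, not the paper's heat exchanger model; all parameter values are hypothetical.

```python
import math

def newmark_sdof(m, c, k, f, dt, n_steps, beta=0.25, gamma=0.5):
    """Newmark-beta direct time integration of m*u'' + c*u' + k*u = f(t).
    beta=1/4, gamma=1/2 is the unconditionally stable average-acceleration rule."""
    u, v = 0.0, 0.0
    a = (f(0.0) - c * v - k * u) / m          # initial acceleration from equilibrium
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    hist = [u]
    for i in range(1, n_steps + 1):
        t = i * dt
        rhs = (f(t)
               + m * (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
               + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                      + dt * (gamma / (2.0 * beta) - 1.0) * a))
        u_new = rhs / keff
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return hist

# Check case: undamped oscillator (natural period 1 s) under a unit step load
# oscillates about the static deflection f0/k, peaking near 2*f0/k.
hist = newmark_sdof(m=1.0, c=0.0, k=4.0 * math.pi**2,
                    f=lambda t: 1.0, dt=0.01, n_steps=200)
```

A response spectrum analysis would instead combine modal peak responses; the time-history loop above is what "direct integration" refers to in the abstract.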

  5. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    Science.gov (United States)

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
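The per-allele statistics listed above (2×2 chi-squared test, odds ratio, confidence interval, and p-value) can be sketched in a few lines. This is an independent illustration of those statistics, not BIGDAWG's implementation, and the counts are hypothetical.

```python
import math

def case_control_2x2(case_pos, case_neg, ctrl_pos, ctrl_neg):
    """Chi-squared test (1 df), p-value, odds ratio, and Woolf 95% CI for
    one allele's 2x2 case-control table."""
    n = case_pos + case_neg + ctrl_pos + ctrl_neg
    # Pearson chi-squared from observed vs. expected cell counts
    chi2 = 0.0
    for obs, row, col in [(case_pos, case_pos + case_neg, case_pos + ctrl_pos),
                          (case_neg, case_pos + case_neg, case_neg + ctrl_neg),
                          (ctrl_pos, ctrl_pos + ctrl_neg, case_pos + ctrl_pos),
                          (ctrl_neg, ctrl_pos + ctrl_neg, case_neg + ctrl_neg)]:
        exp = row * col / n
        chi2 += (obs - exp)**2 / exp
    p_value = math.erfc(math.sqrt(chi2 / 2.0))  # chi-squared tail, 1 df
    # Odds ratio with Woolf (log-scale) 95% confidence interval
    oratio = (case_pos * ctrl_neg) / (case_neg * ctrl_pos)
    se = math.sqrt(1/case_pos + 1/case_neg + 1/ctrl_pos + 1/ctrl_neg)
    ci = (math.exp(math.log(oratio) - 1.96 * se),
          math.exp(math.log(oratio) + 1.96 * se))
    return chi2, p_value, oratio, ci

# Hypothetical allele carried by 30/100 cases and 15/100 controls
chi2, p_value, oratio, ci = case_control_2x2(30, 70, 15, 85)
```

BIGDAWG additionally bins low-frequency alleles before testing, which protects the chi-squared approximation from sparse cells.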

  6. Strategies for Integrated Analysis of Genetic, Epigenetic, and Gene Expression Variation in Cancer

    DEFF Research Database (Denmark)

    Thingholm, Louise B; Andersen, Lars; Makalic, Enes

    2016-01-01

The development and progression of cancer, a collection of diseases with complex genetic architectures, is facilitated by the interplay of multiple etiological factors. This complexity challenges the traditional single-platform study design and calls for an integrated approach to data analysis … to integration strategies used for analyzing genetic risk factors for cancer. We critically examine the ability of these strategies to handle the complexity of the human genome and also accommodate information about the biological and functional interactions between the elements that have been measured …

  7. ANALYSIS OF INFLUENCE FACTORS OF ECONOMIC EFFICIENCY ON THE ECONOMY OF THE INTEGRATED STRUCTURES

    Directory of Open Access Journals (Sweden)

    I. P. Bogomolova

    2014-01-01

Full Text Available Currently in Russia, special attention is paid to the food industry, which has a key influence on the state's economy and on the country's food security. The food industry not only creates a substantial part of the gross domestic product, one of the main sources for filling budgets at all levels, but also contributes to strengthening the state's position in world markets. These circumstances make it necessary to increase the efficiency of industrial structures by mobilizing the factors affecting the economics of enterprises, including shifting the emphasis toward the integration of food-industry enterprises, the overall competitiveness of the goods produced, and the stability of the entire industry, its leading branches and organizations. The article substantiates the expediency of applying integrated structures and discusses methods and tools for analyzing the influence of economic-efficiency factors on the economics of integrated structures. Evaluation is recommended in two key areas: assessment of financial condition, and evaluation of staff training and development, taking into account the strategic objectives of the integrated structures. The analysis makes it possible to allocate financial resources correctly and to achieve balanced economic performance through more effective use of credit resources, rational management of economic parameters, and optimization of headcount and production capacity.

  8. Transient Thermal Analysis of 3-D Integrated Circuits Packages by the DGTD Method

    KAUST Repository

    Li, Ping; Dong, Yilin; Tang, Min; Mao, Junfa; Jiang, Li Jun; Bagci, Hakan

    2017-01-01

Since accurate thermal analysis plays a critical role in the thermal design and management of 3-D system-level integration, a discontinuous Galerkin time-domain (DGTD) algorithm is proposed in this paper to achieve this purpose.

  9. Social and Economic Analysis Branch: integrating policy, social, economic, and natural science

    Science.gov (United States)

    Schuster, Rudy; Walters, Katie D.

    2015-01-01

    The Fort Collins Science Center's Social and Economic Analysis Branch provides unique capabilities in the U.S. Geological Survey by leading projects that integrate social, behavioral, economic, and natural science in the context of human–natural resource interactions. Our research provides scientific understanding and support for the management and conservation of our natural resources in support of multiple agency missions. We focus on meeting the scientific needs of the Department of the Interior natural resource management bureaus in addition to fostering partnerships with other Federal and State managers to protect, restore, and enhance our environment. The Social and Economic Analysis Branch has an interdisciplinary group of scientists whose primary functions are to conduct both theoretical and applied social science research, provide technical assistance, and offer training to support the development of skills in natural resource management activities. Management and research issues associated with human-resource interactions typically occur in a unique context and require knowledge of both natural and social sciences, along with the skill to integrate multiple science disciplines. In response to these challenging contexts, Social and Economic Analysis Branch researchers apply a wide variety of social science concepts and methods which complement our rangeland/agricultural, wildlife, ecology, and biology capabilities. The goal of the Social and Economic Analysis Branch's research is to enhance natural-resource management, agency functions, policies, and decisionmaking.

  10. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract: Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  11. Photometric method for determination of acidity constants through integral spectra analysis.

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-15

An express method for determining the acidity constants of organic acids, based on analysis of the dependence of integral transmittance on pH, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple, low-cost instrumentation. The optical part of the experimental setup has been simplified by excluding the monochromator. It thus takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. The application limits and reliability of the method have been tested for a series of organic acids of various nature. Copyright © 2015 Elsevier B.V. All rights reserved.
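As a rough illustration of how a pKa can be read off a transmittance-vs-pH curve: the curve follows a Henderson-Hasselbalch sigmoid between the transmittance of the protonated and deprotonated forms, and pKa is the fit parameter at the inflection. This is a generic sigmoid fit, not necessarily the authors' procedure, and the data points are synthetic.

```python
def sigmoid(ph, t_ha, t_a, pka):
    """Henderson-Hasselbalch mixing of the acid (HA) and base (A-) transmittances."""
    return t_ha + (t_a - t_ha) / (1.0 + 10.0**(pka - ph))

def fit_pka(ph_values, transmittance, t_ha, t_a):
    """Grid-search pKa minimizing the sum of squared residuals."""
    best_pka, best_err = None, float("inf")
    pka = 2.0
    while pka <= 12.0:
        err = sum((t - sigmoid(ph, t_ha, t_a, pka))**2
                  for ph, t in zip(ph_values, transmittance))
        if err < best_err:
            best_pka, best_err = pka, err
        pka += 0.01
    return best_pka

ph = [2.0, 3.0, 4.0, 4.5, 5.0, 5.5, 6.0, 7.0, 8.0]
data = [sigmoid(p, 0.80, 0.20, 4.76) for p in ph]   # synthetic, acetic-acid-like pKa
pka_est = fit_pka(ph, data, 0.80, 0.20)             # recovers ~4.76
```

In practice the endpoint transmittances would also be fitted, and a nonlinear least-squares routine would replace the grid search; the grid keeps the sketch dependency-free.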

  12. An Analysis of Delay-based and Integrator-based Sequence Detectors for Grid-Connected Converters

    DEFF Research Database (Denmark)

    Khazraj, Hesam; Silva, Filipe Miguel Faria da; Bak, Claus Leth

    2017-01-01

Detecting and separating the positive- and negative-sequence components of the grid voltage or current is of vital importance in the control of grid-connected power converters, HVDC systems, etc. To this end, several techniques have been proposed in recent years. These techniques can be broadly classified into two main classes: integrator-based techniques and delay-based techniques. The complex-coefficient filter-based technique, the dual second-order generalized integrator-based method, and the multiple reference frame approach are the main members of the integrator-based sequence detectors, while the delayed-signal cancellation operators are the main members of the delay-based sequence detectors. The aim of this paper is to provide a theoretical and experimental comparative study between integrator- and delay-based sequence detectors. The theoretical analysis is conducted based on small-signal modelling …
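One of the delay-based detectors discussed here, the delayed-signal cancellation (DSC) operator, reduces to a quarter-cycle delay in the complex alpha-beta frame: with the voltage written as a complex vector v(t), the positive sequence is 0.5*(v(t) + j*v(t - T/4)) and the negative sequence 0.5*(v(t) - j*v(t - T/4)). A minimal steady-state sketch with hypothetical sequence amplitudes:

```python
import cmath
import math

f = 50.0                      # grid frequency, Hz
T = 1.0 / f
v_pos, v_neg = 1.0, 0.3       # hypothetical amplitudes of an unbalanced grid

def v(t):
    """Unbalanced alpha-beta voltage vector with known sequence content."""
    w = 2.0 * math.pi * f
    return v_pos * cmath.exp(1j * w * t) + v_neg * cmath.exp(-1j * w * t)

def dsc_positive(t):
    """Quarter-cycle DSC: cancels the negative sequence."""
    return 0.5 * (v(t) + 1j * v(t - T / 4.0))

def dsc_negative(t):
    """Quarter-cycle DSC: cancels the positive sequence."""
    return 0.5 * (v(t) - 1j * v(t - T / 4.0))

t = 0.0137                    # arbitrary steady-state instant
p = dsc_positive(t)           # magnitude equals v_pos
n = dsc_negative(t)           # magnitude equals v_neg
```

The quarter-cycle buffer is the "delay" this class of detector needs, which is exactly what the integrator-based alternatives avoid at the cost of filter dynamics.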

  13. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language, for interoperability at the source-code level. The goal is to combine the intensive data-processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interfaces and mathematical function libraries. Both development environments are multiplatform oriented and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis developed by the open-source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant, interactive frameworks for general-purpose software solutions, through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both directions of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  14. Integrated minicomputer alpha analysis system

    International Nuclear Information System (INIS)

    Vasilik, D.G.; Coy, D.E.; Seamons, M.; Henderson, R.W.; Romero, L.L.; Thomson, D.A.

    1978-01-01

Approximately 1,000 stack and occupational air samples from plutonium and uranium facilities at LASL are analyzed daily. The concentrations of radionuclides in air are determined by measuring absolute alpha activities of particulates collected on air sample filter media. The Integrated Minicomputer Pulse system (IMPULSE) is an interface between many detectors of extremely simple design and a Digital Equipment Corporation (DEC) PDP-11/04 minicomputer. The detectors are photomultiplier tubes faced with zinc sulfide (ZnS). The average detector background is approximately 0.07 cpm. The IMPULSE system includes two mainframes, each of which can hold up to 64 detectors. The current hardware configuration includes 64 detectors in one mainframe and 40 detectors in the other. Each mainframe contains a minicomputer with 28K words of Random Access Memory. One minicomputer controls the detectors in both mainframes. A second computer was added for fail-safe redundancy and to support other laboratory computer requirements. The main minicomputer includes a dual floppy disk system and a dual DEC 'RK05' disk system for mass storage. The RK05 facilitates report generation and trend analysis. The IMPULSE hardware provides for passage of data from the detectors to the computer, and for passage of status and control information from the computer to the detector stations

  15. Design and analysis of heat exchanger networks for integrated Ca-looping systems

    International Nuclear Information System (INIS)

    Lara, Yolanda; Lisbona, Pilar; Martínez, Ana; Romeo, Luis M.

    2013-01-01

Highlights: • Heat integration is essential to minimize energy penalties in calcium looping cycles. • A design and analysis of four heat exchanger networks is presented. • The new design has higher power, lower costs and lower destroyed exergy than the base case. - Abstract: One of the main challenges of carbon capture and storage technologies is the energy penalty associated with the CO2 separation and compression processes. Heat integration therefore plays an essential role in improving the efficiency of these systems. CO2 capture systems based on the Ca-looping process present great potential for integrating residual heat with a new supercritical power plant. The pinch methodology is applied in this study to define the minimum energy requirements of the process and to design four configurations of the required heat exchanger network. The Second Law of Thermodynamics is a powerful tool for reducing the energy demand, since identifying the exergy losses of the system serves to locate inefficiencies. In parallel, an economic analysis is required to assess the cost reduction achieved by each configuration. This work combines the pinch methodology with economic and exergetic analyses to select the most appropriate configuration of the heat exchanger network. The lower costs and smaller destroyed exergy obtained for the best proposed network result in a global energy efficiency increase of 0.91%
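The minimum-energy targeting step of the pinch methodology can be sketched with the standard problem-table cascade: shift stream temperatures by half the minimum approach, balance heat in each temperature interval, and cascade surpluses downward. This is the generic textbook algorithm, not the paper's model, and the stream data are hypothetical.

```python
def utility_targets(hot, cold, dt_min):
    """Minimum hot/cold utility via the pinch problem-table cascade.
    hot, cold: lists of (supply T, target T, heat-capacity flow CP)."""
    s = dt_min / 2.0
    # Shift hot streams down and cold streams up by dT_min/2
    shifted = ([(ts - s, tt - s, cp, 'hot') for ts, tt, cp in hot]
               + [(ts + s, tt + s, cp, 'cold') for ts, tt, cp in cold])
    bounds = sorted({t for ts, tt, cp, kind in shifted for t in (ts, tt)},
                    reverse=True)
    heat_flow = [0.0]                 # cascade starting with zero hot utility
    for top, bot in zip(bounds, bounds[1:]):
        net = 0.0                     # interval surplus (+) or deficit (-)
        for ts, tt, cp, kind in shifted:
            lo, hi = min(ts, tt), max(ts, tt)
            overlap = max(0.0, min(top, hi) - max(bot, lo))
            net += cp * overlap if kind == 'hot' else -cp * overlap
        heat_flow.append(heat_flow[-1] + net)
    q_hot = max(0.0, -min(heat_flow))  # lift cascade so no flow is negative
    q_cold = heat_flow[-1] + q_hot
    return q_hot, q_cold

hot = [(180.0, 80.0, 2.0)]    # one hot stream: 180 -> 80 degC, CP = 2 kW/K
cold = [(60.0, 160.0, 2.5)]   # one cold stream: 60 -> 160 degC, CP = 2.5 kW/K
q_hot, q_cold = utility_targets(hot, cold, dt_min=10.0)
```

For these streams the cold duty (250 kW) exceeds the hot duty (200 kW) by 50 kW, so the cascade reports 50 kW of minimum hot utility and no cold utility.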

  16. Final report for DOE Grant No. DE-SC0006609 - Persistence of Microbially Facilitated Calcite Precipitation as an in situ Treatment for Strontium-90

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Robert W. [Univ. of Idaho, Idaho Falls, ID (United States); Fujita, Yoshiko [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hubbard, Susan S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-11-15

    Subsurface radionuclide and metal contaminants throughout the U.S. Department of Energy (DOE) complex pose one of DOE's greatest challenges for long-term stewardship. One promising stabilization mechanism for divalent ions, such as the short-lived radionuclide 90Sr, is co-precipitation in calcite. We have previously found that nutrient addition can stimulate microbial ureolytic activity, that this activity accelerates calcite precipitation and co-precipitation of Sr, and that higher calcite precipitation rates can result in increased Sr partitioning. We have conducted integrated field, laboratory, and computational research to evaluate the relationships between ureolysis and calcite precipitation rates and trace metal partitioning under environmentally relevant conditions, and investigated the coupling between flow/flux manipulations and precipitate distribution. A field experimental campaign conducted at the Integrated Field Research Challenge (IFRC) site located at Rifle, CO was based on a continuous recirculation design; water extracted from a down-gradient well was amended with urea and molasses (a carbon and electron donor) and re-injected into an up-gradient well. The goal of the recirculation design and simultaneous injection of urea and molasses was to uniformly accelerate the hydrolysis of urea and calcite precipitation over the entire inter-wellbore zone. The urea-molasses recirculation phase lasted, with brief interruptions for geophysical surveys, for 12 days and was followed by long-term monitoring which continued for 13 months. A post experiment core located within the inter-wellbore zone was collected on day 321 and characterized with respect to cation exchange capacity, mineral carbonate content, urease activity, ureC gene abundance, extractable ammonium (a urea hydrolysis product) content, and the 13C isotopic composition of solid carbonates. It was also subjected to selective extractions for strontium and uranium. Result

  17. Interpretation of Flow Logs from Nevada Test Site Boreholes to Estimate Hydraulic Conductivity Using Numerical Simulations Constrained by Single-Well Aquifer Tests

    Science.gov (United States)

    Garcia, C. Amanda; Halford, Keith J.; Laczniak, Randell J.

    2010-01-01

    Hydraulic conductivities of volcanic and carbonate lithologic units at the Nevada Test Site were estimated from flow logs and aquifer-test data. Borehole flow and drawdown were integrated and interpreted using a radial, axisymmetric flow model, AnalyzeHOLE. This integrated approach is used because complex well completions and heterogeneous aquifers and confining units produce vertical flow in the annular space and aquifers adjacent to the wellbore. AnalyzeHOLE simulates vertical flow, in addition to horizontal flow, which accounts for converging flow toward screen ends and diverging flow toward transmissive intervals. Simulated aquifers and confining units uniformly are subdivided by depth into intervals in which the hydraulic conductivity is estimated with the Parameter ESTimation (PEST) software. Between 50 and 150 hydraulic-conductivity parameters were estimated by minimizing weighted differences between simulated and measured flow and drawdown. Transmissivity estimates from single-well or multiple-well aquifer tests were used to constrain estimates of hydraulic conductivity. The distribution of hydraulic conductivity within each lithology had a minimum variance because estimates were constrained with Tikhonov regularization. AnalyzeHOLE simulated hydraulic-conductivity estimates for lithologic units across screened and cased intervals are as much as 100 times less than those estimated using proportional flow-log analyses applied across screened intervals only. Smaller estimates of hydraulic conductivity for individual lithologic units are simulated because sections of the unit behind cased intervals of the wellbore are not assumed to be impermeable, and therefore, can contribute flow to the wellbore. Simulated hydraulic-conductivity estimates vary by more than three orders of magnitude across a lithologic unit, indicating a high degree of heterogeneity in volcanic and carbonate-rock units. 
The higher water transmitting potential of carbonate-rock units relative
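The proportional flow-log analysis that the AnalyzeHOLE simulations are compared against can be sketched simply: total transmissivity from an aquifer test is distributed among depth intervals in proportion to the borehole flow each interval contributes, then divided by interval thickness to give hydraulic conductivity. The interval depths and flow gains below are hypothetical illustration values.

```python
def proportional_k(t_total, intervals):
    """Proportional flow-log analysis.
    t_total: total transmissivity from an aquifer test (m^2/d)
    intervals: list of (top depth, bottom depth, flow gain) tuples
    Returns hydraulic conductivity K = T_i / b_i for each interval."""
    total_gain = sum(gain for _, _, gain in intervals)
    result = []
    for top, bottom, gain in intervals:
        thickness = bottom - top
        transmissivity = t_total * gain / total_gain  # T apportioned by flow share
        result.append(transmissivity / thickness)
    return result

# T = 120 m^2/d split over three screened intervals (depths in m,
# flow gains measured between successive flow-log stations)
k = proportional_k(120.0, [(100.0, 110.0, 6.0),
                           (110.0, 130.0, 3.0),
                           (130.0, 150.0, 1.0)])
```

As the abstract notes, this method implicitly treats cased intervals as impermeable, which is why it can overestimate K across screened intervals relative to the axisymmetric simulation.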

  18. Interpretation of Flow Logs from Nevada Test Site Boreholes to Estimate Hydraulic conductivity Using Numerical Simulations Constrained by Single-Well Aquifer Tests

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, C. Amanda; Halford, Keith J.; Laczniak, Randell J.

    2010-02-12

    Hydraulic conductivities of volcanic and carbonate lithologic units at the Nevada Test Site were estimated from flow logs and aquifer-test data. Borehole flow and drawdown were integrated and interpreted using a radial, axisymmetric flow model, AnalyzeHOLE. This integrated approach is used because complex well completions and heterogeneous aquifers and confining units produce vertical flow in the annular space and aquifers adjacent to the wellbore. AnalyzeHOLE simulates vertical flow, in addition to horizontal flow, which accounts for converging flow toward screen ends and diverging flow toward transmissive intervals. Simulated aquifers and confining units uniformly are subdivided by depth into intervals in which the hydraulic conductivity is estimated with the Parameter ESTimation (PEST) software. Between 50 and 150 hydraulic-conductivity parameters were estimated by minimizing weighted differences between simulated and measured flow and drawdown. Transmissivity estimates from single-well or multiple-well aquifer tests were used to constrain estimates of hydraulic conductivity. The distribution of hydraulic conductivity within each lithology had a minimum variance because estimates were constrained with Tikhonov regularization. AnalyzeHOLE simulated hydraulic-conductivity estimates for lithologic units across screened and cased intervals are as much as 100 times less than those estimated using proportional flow-log analyses applied across screened intervals only. Smaller estimates of hydraulic conductivity for individual lithologic units are simulated because sections of the unit behind cased intervals of the wellbore are not assumed to be impermeable, and therefore, can contribute flow to the wellbore. Simulated hydraulic-conductivity estimates vary by more than three orders of magnitude across a lithologic unit, indicating a high degree of heterogeneity in volcanic and carbonate-rock units. 
The higher water transmitting potential of carbonate-rock units relative

  19. Application of RELAP/SCDAPSIM with integrated uncertainty options to research reactor systems thermal hydraulic analysis

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.; Perez, M.; Reventos, F.

    2010-01-01

The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of the international SCDAP Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses publicly available RELAP5 and SCDAP models in combination with advanced programming and numerical techniques and other SDTP-member modeling/user options. One such member-developed option is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). This paper briefly summarizes the features of RELAP/SCDAPSIM/MOD4.0 and the integrated uncertainty analysis package, and then presents an example of how the integrated uncertainty package can be set up and used for a simple pipe flow problem. (author)
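Uncertainty packages of this kind propagate sampled input uncertainties through the code and commonly size the random sample with Wilks' nonparametric formula; the connection to the MOD4.0 package specifically is our assumption. A sketch of the first-order one-sided tolerance-limit sample size:

```python
import math

def wilks_first_order(gamma, beta):
    """Smallest n with 1 - beta**n >= gamma: the maximum of n random code
    runs bounds the beta-quantile of the output with confidence gamma.
    (Link to the MOD4.0 package is an assumption; the formula is generic.)"""
    return math.ceil(math.log(1.0 - gamma) / math.log(beta))

n95_95 = wilks_first_order(0.95, 0.95)   # classic 95%/95% result: 59 code runs
```

The appeal of the nonparametric approach is that 59 runs suffice regardless of how many uncertain input parameters are sampled.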

  20. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five objective reliability optimisation are presented, on the basis of which rational decision making can be explored
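The importance calculations performed in the sensitivity-analysis phase can be sketched with the Fussell-Vesely measure under the rare-event approximation: each basic event's importance is the fraction of the top-event probability carried by cut sets containing that event. This is a generic illustration, not the BRM implementation, and the event probabilities are hypothetical.

```python
def fussell_vesely(cut_sets, probs):
    """Fussell-Vesely importance of each basic event (rare-event approximation)."""
    def p_cs(cs):
        p = 1.0
        for e in cs:
            p *= probs[e]
        return p
    p_top = sum(p_cs(cs) for cs in cut_sets)
    return {e: sum(p_cs(cs) for cs in cut_sets if e in cs) / p_top
            for e in probs}

probs = {"sensor": 1e-2, "logic": 1e-3, "actuator": 5e-3}
cut_sets = [{"sensor", "logic"}, {"actuator"}]
fv = fussell_vesely(cut_sets, probs)   # the single-event cut set dominates
```

A ranking like this is what feeds the optimisation phase: effort goes first to the events with the largest importance.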

  1. Present-day stress analysis in the St. Lawrence Lowlands from borehole breakouts and implications for Co2 injection

    Energy Technology Data Exchange (ETDEWEB)

    Konstantinovskaya, E.; Malo, M. [Institut national de la recherche scientifique, Centre Eau, Terre et Environnement (INRS-ETE), 490, de la Couronne, Quebec, G1K 9A9 (Canada)], email: Elena.Konstantinovskaya@ete.inrs.ca; Castillo, D.A.; Hughes, Baker [Reservoir Development Services, GeoMechanics, 5373 West Alabama Street, Suite 300, Houston, Texas, 77056 (United States)], email: David.Castillo@BakerHughes.com

    2011-07-01

An essential part of the reservoir-geomechanical study in the St. Lawrence Lowlands, Quebec, consists of evaluating the geomechanical response of the reservoir rock and caprock to different injection scenarios and to the long-term storage of CO2. The first phase of this study consists of determining the current stress direction. The next step consists of assessing the potential for shear failure or reactivation of pre-existing faults and fracture sets caused by changes in reservoir pressure due to CO2 injection. According to the study results, the orientation of the average maximum horizontal stress direction SHmax, obtained from stress-induced borehole breakouts in this area, is N59°E ± 20°. The breakouts were observed in 17 wells at depths from 250 m to 4 km. A major purpose of this geomechanical study is to estimate to what extent fluid pressures for CO2 injection can be sustainably increased without causing fracturing, faulting, or reactivation of pre-existing faults. To this end, one must determine the prevailing stresses (directions and magnitudes), fault geometries and rock strengths.
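A mean SHmax direction of this kind is typically reduced from individual breakout picks: breakout azimuths are axial (180-degree-periodic) data, so they are averaged on the doubled angle, and SHmax is taken perpendicular to the mean breakout azimuth. A sketch with hypothetical picks (not the study's data):

```python
import math

def mean_axial_azimuth(azimuths_deg):
    """Vector mean of axial (180-degree-periodic) data via the doubled angle."""
    s = sum(math.sin(math.radians(2.0 * a)) for a in azimuths_deg)
    c = sum(math.cos(math.radians(2.0 * a)) for a in azimuths_deg)
    return (math.degrees(math.atan2(s, c)) / 2.0) % 180.0

# Hypothetical breakout azimuth picks (degrees); breakouts form along Shmin,
# so SHmax is 90 degrees away from their mean.
breakouts = [145.0, 152.0, 148.0, 155.0, 150.0]
shmax = (mean_axial_azimuth(breakouts) + 90.0) % 180.0   # ~N60E here
```

The doubled-angle trick is what keeps picks such as 1° and 179° from averaging to 90° instead of the correct 0°/180° axis.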

  2. Proportional-integral controller based small-signal analysis of hybrid distributed generation systems

    International Nuclear Information System (INIS)

    Ray, Prakash K.; Mohanty, Soumya R.; Kishor, Nand

    2011-01-01

Research highlights: → We aim to minimize the frequency deviation in a system integrating energy resources such as offshore wind, photovoltaics (PV), fuel cells (FC) and a diesel engine generator (DEG), along with storage elements such as a flywheel energy storage system (FESS) and a battery energy storage system (BESS). → An ultracapacitor (UC) as an alternative storage element and a proportional-integral (PI) controller are further addressed to improve the frequency-deviation profiles. → A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high-voltage direct current (HVDC) link and a high-voltage alternating current (HVAC) line. → Both qualitative and quantitative analyses show improved frequency-deviation profiles when an ultracapacitor (UC) is used as the storage element. -- Abstract: Large variations in wind speed and unpredictable solar radiation cause remarkable fluctuations of the output power of offshore wind and photovoltaic systems, respectively, which leads to large deviations in the system frequency. In this context, to minimize the frequency deviation, this paper presents the integration of different energy resources, namely offshore wind, photovoltaics (PV), fuel cells (FC) and a diesel engine generator (DEG), along with storage elements such as a flywheel energy storage system (FESS) and a battery energy storage system (BESS). An ultracapacitor (UC) as an alternative storage element and a proportional-integral (PI) controller are further addressed to improve the frequency-deviation profiles. A comparative assessment of frequency deviation for different hybrid systems is also carried out in the presence of a high-voltage direct current (HVDC) link and a high-voltage alternating current (HVAC) line. Frequency deviations for different isolated hybrid systems are presented graphically as well as in terms of
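The role the PI controller plays in suppressing frequency deviation can be sketched with a one-area frequency-response model, 2H·d(Δf)/dt = ΔP_gen − ΔP_load − D·Δf, with the PI controller setting ΔP_gen from the frequency error. This is a generic illustration, not the paper's hybrid model, and all constants are hypothetical per-unit values.

```python
def simulate(kp, ki, h=5.0, d=1.0, dp_load=0.1, dt=0.01, steps=5000):
    """Forward-Euler simulation of 2H*d(df)/dt = dp_gen - dp_load - D*df,
    with dp_gen from a PI controller regulating df to zero."""
    df, integral = 0.0, 0.0
    for _ in range(steps):
        integral += -df * dt                  # integral of the error (reference = 0)
        dp_gen = kp * (-df) + ki * integral   # PI control action
        ddf = (dp_gen - dp_load - d * df) / (2.0 * h)
        df += ddf * dt
    return df                                 # final frequency deviation, per unit

df_pi = simulate(kp=2.0, ki=5.0)   # PI drives the deviation back to ~zero
df_p = simulate(kp=2.0, ki=0.0)    # P-only leaves a steady-state offset -dp/(kp+D)
```

The integral term is what removes the steady-state offset after the step load disturbance, which is the improvement the abstract attributes to the PI controller.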

  3. Technical review of the high energy gas stimulation technique

    Energy Technology Data Exchange (ETDEWEB)

    Haney, B.; Cuthill, D. [Computalog Ltd., Calgary, AB (Canada)

    1997-08-01

    High Energy Gas Stimulation (HEGS) or propellant stimulation is a process that enhances production of oil wells by decreasing wellbore damage and increasing near-wellbore permeability. The technique has been used on about 7,000 wells with varying results. The HEGS tool is a cast cylinder of solid rocket propellant with a central ignition system. The propellant is fired and, as it burns, it produces a pressure load on the formation, increasing fracture volume, which enhances the flow channels. Background information on the development and application of this stimulation technique was provided. The introduction of fractures around a wellbore is dependent on the pressure loading rate and the dynamic response of the rock. Propellant stimulation relies on controlling the pressure-time behaviour to maximize fracture growth by fluid pressurization. The process is composed of 3 sequential phases: (1) wellbore pressurization, (2) fracture initiation, and (3) fracture extension. A full description of each of these phases was provided. Geologic and well-tool factors that have a significant influence on the fracturing process, such as in-situ stress, natural fractures and flaws, formation mechanical properties, formation fluid and flow properties, formation thermal properties, and wellbore, tool, and tamp configuration, were also reviewed. The many applications for HEGS were presented. It was emphasized that the success of HEGS is dependent on pre-stimulation problem evaluation and on proper charge design. Since HEGS will decrease near-wellbore restrictions and initiate formation breakdown, it should only be used in cases where this will be beneficial to the well. Careful attention to engineering will optimize results. 21 refs., 13 figs.

  4. FUZZY DECISION ANALYSIS FOR INTEGRATED ENVIRONMENTAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION

    Science.gov (United States)

    A fuzzy decision analysis method for integrating ecological indicators is developed. This is a combination of a fuzzy ranking method and the Analytic Hierarchy Process (AHP). The method is capable of ranking ecosystems in terms of environmental conditions and suggesting cumula...
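    The AHP step mentioned above reduces indicator weighting to the principal eigenvector of a pairwise comparison matrix. The sketch below uses the standard row-geometric-mean approximation to that eigenvector; the 3x3 matrix and the indicator interpretation are made-up examples, not data from the study.

```python
# Illustrative AHP priority-weight step: given a pairwise comparison matrix
# (Saaty-style judgments), estimate indicator weights via the row geometric
# mean, a common approximation to the principal eigenvector.

def ahp_weights(matrix):
    n = len(matrix)
    geo_means = []
    for row in matrix:
        prod = 1.0
        for v in row:
            prod *= v
        geo_means.append(prod ** (1.0 / n))  # geometric mean of the row
    total = sum(geo_means)
    return [g / total for g in geo_means]    # normalize to sum to 1

# Hypothetical judgments: indicator 1 is twice as important as indicator 2
# and four times as important as indicator 3.
pairwise = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
print(weights)  # a perfectly consistent matrix yields exact weights 4/7, 2/7, 1/7
```

    For a perfectly consistent matrix the row geometric mean reproduces the eigenvector exactly; for real, mildly inconsistent judgments it remains a close approximation.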

  5. Integrated analysis of core debris interactions and their effects on containment integrity using the CONTAIN computer code

    International Nuclear Information System (INIS)

    Carroll, D.E.; Bergeron, K.D.; Williams, D.C.; Tills, J.L.; Valdez, G.D.

    1987-01-01

    The CONTAIN computer code includes a versatile system of phenomenological models for analyzing the physical, chemical and radiological conditions inside the containment building during severe reactor accidents. Important contributors to these conditions are the interactions which may occur between released corium and cavity concrete. The phenomena associated with interactions between ejected corium debris and the containment atmosphere (Direct Containment Heating or DCH) also pose a potential threat to containment integrity. In this paper, we describe recent enhancements of the CONTAIN code which allow an integrated analysis of these effects in the presence of other mitigating or aggravating physical processes. In particular, the recent inclusion of the CORCON and VANESA models is described and a calculation example presented. With this capability CONTAIN can model core-concrete interactions occurring simultaneously in multiple compartments and can couple the aerosols thereby generated to the mechanistic description of all atmospheric aerosol components. Also discussed are some recent results of modeling the phenomena involved in Direct Containment Heating. (orig.)

  6. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few at a time, and they are slow and difficult to use when large numbers of spectra and signals are being analyzed, even with pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, so it should run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
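    The batch-integration workflow described above (apply pre-saved integration regions to many spectra and emit CSV for downstream tools) can be sketched in a few lines. This is a standard-library illustration of the idea, not ImatraNMR's actual code; the region names and toy spectra are assumptions.

```python
# Sketch of batch integration of quantitative spectra: sum intensities over
# pre-saved ppm regions for every spectrum, then emit one CSV row per spectrum.

import csv
import io

def integrate_region(ppm, intensity, lo, hi):
    """Simple rectangle-rule sum of points whose ppm falls in [lo, hi]."""
    return sum(y for x, y in zip(ppm, intensity) if lo <= x <= hi)

def batch_integrate(spectra, regions):
    """spectra: {name: (ppm_list, intensity_list)}; regions: {label: (lo, hi)}."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["spectrum"] + list(regions))
    for name, (ppm, inten) in spectra.items():
        writer.writerow([name] + [integrate_region(ppm, inten, lo, hi)
                                  for lo, hi in regions.values()])
    return out.getvalue()

# Two toy "spectra" with three points each (hypothetical data).
spectra = {"s1": ([1.0, 2.0, 3.0], [10, 20, 30]),
           "s2": ([1.0, 2.0, 3.0], [5, 5, 5])}
regions = {"peakA": (0.5, 2.5), "peakB": (2.5, 3.5)}
print(batch_integrate(spectra, regions))
```

    The CSV output is the key design point the abstract highlights: it lets spreadsheet programs or Matlab pick up the per-signal integrals for hundreds of spectra without manual re-integration.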

  7. Analysis of airframe/engine interactions in integrated flight and propulsion control

    Science.gov (United States)

    Schierman, John D.; Schmidt, David K.

    1991-01-01

    An analysis framework for assessing dynamic cross-coupling between airframe and engine systems from the perspective of integrated flight/propulsion control is presented. The analysis involves determining the significance of the interactions with respect to deterioration in stability robustness and performance, as well as the critical frequency ranges where problems may occur due to these interactions. The analysis illustrated here investigates both the airframe's effects on the engine control loops and the engine's effects on the airframe control loops in two case studies. The second case study involves a multi-input/multi-output analysis of the airframe. Sensitivity studies are performed on critical interactions to examine the degradation in the system's stability robustness and performance. The magnitudes of the interactions required to cause instabilities, as well as the frequencies at which the instabilities occur, are recorded. Finally, the analysis framework is expanded to include control laws that contain cross-feeds between the airframe and engine systems.

  8. Intra-Wellbore Head Losses in a Horizontal Well with both Kinematic and Frictional Effects in an Anisotropic Confined Aquifer between Two Streams

    Science.gov (United States)

    Wang, Q.; Zhan, H.

    2017-12-01

    Horizontal drilling has become an appealing technology for water exploration and aquifer remediation in recent decades, owing to decreasing operational costs and many technical advantages over vertical wells. However, many previous studies of flow into horizontal wells were based on the uniform flux boundary condition (UFBC) for treating horizontal wells, which cannot accurately reflect the physical processes of flow inside the well. In this study, we investigated transient flow into a horizontal well in an anisotropic confined aquifer between two streams for three types of boundary conditions for treating the horizontal well: the UFBC, the uniform head boundary condition (UHBC), and the mixed-type boundary condition (MTBC). The MTBC model considered both kinematic and frictional effects inside the horizontal well, where the kinematic effect refers to the accelerational and fluid-inflow effects. The new UFBC solution was derived by superimposing point sink/source solutions of uniform strength along the axis of the horizontal well. The UHBC and MTBC solutions were obtained by a hybrid analytical-numerical method, and an iterative method was proposed to determine the minimum number of well segments required to yield a sufficiently accurate answer. The results showed that the differences among the UFBC, UHBC, MTBC-Friction, and MTBC solutions were obvious, where MTBC-Friction denotes the solution considering the frictional effect but ignoring the kinematic effect. The MTBC-Friction and MTBC solutions were sensitive to the flow rate, and the difference between these two solutions increased with the flow rate, suggesting that the kinematic effect cannot be ignored when studying flow to a horizontal well, especially when the flow rate is large.
The well-specific inflow (WSI), the inflow per unit screen length at a specified location along the horizontal well, increased with distance along the wellbore for the MTBC model at the early stage, while
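    The UFBC construction described above (superimposing uniform-strength point sinks along the well axis) can be illustrated numerically. The steady 1/(4πKr) kernel used below is a stand-in assumption for the paper's transient, bounded-aquifer point-source solution; geometry, rate, and conductivity values are hypothetical.

```python
# Conceptual sketch of a UFBC-style superposition: discretize the horizontal
# well into point sinks of uniform strength and sum their responses. The
# steady point-sink kernel 1/(4*pi*K*r) is a stand-in for the study's actual
# transient Green's function between two streams.

import math

def ufbc_drawdown(obs, well_start, well_end, Q, K, n_seg=200):
    """Drawdown at point obs due to a horizontal well lying on the x-axis
    over [well_start, well_end], total rate Q split uniformly over n_seg sinks."""
    q = Q / n_seg                        # uniform flux per point sink (UFBC)
    dx = (well_end - well_start) / n_seg
    s = 0.0
    for i in range(n_seg):
        x = well_start + (i + 0.5) * dx  # sink placed at each segment midpoint
        r = math.dist(obs, (x, 0.0, 0.0))
        s += q / (4.0 * math.pi * K * r)
    return s

# Drawdown 10 m above the midpoint vs. above the end of a 100 m well.
s_mid = ufbc_drawdown((50.0, 0.0, 10.0), 0.0, 100.0, Q=500.0, K=1e-4)
s_end = ufbc_drawdown((100.0, 0.0, 10.0), 0.0, 100.0, Q=500.0, K=1e-4)
print(s_mid > s_end)  # drawdown is larger near the well's midpoint
```

    The UHBC and MTBC cases cannot be built this way with a fixed uniform strength; there, the per-segment fluxes become unknowns solved so that the head (and, for MTBC, the in-pipe hydraulics) is honored, which is why the paper needs a hybrid analytical-numerical method.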

  9. An Integrative Analysis to Identify Driver Genes in Esophageal Squamous Cell Carcinoma.

    Directory of Open Access Journals (Sweden)

    Genta Sawada

    Full Text Available Few driver genes have been well established in esophageal squamous cell carcinoma (ESCC). Identification of the genomic aberrations that contribute to changes in gene expression profiles can be used to predict driver genes. We searched for driver genes in ESCC by integrative analysis of gene expression microarray profiles and copy number data. To narrow down candidate genes, we performed survival analysis on the expression data and tested the genetic vulnerability of each gene using public RNAi screening data. We confirmed the results by performing RNAi experiments and evaluating the clinical relevance of candidate genes in an independent ESCC cohort. We found 10 significantly recurrent copy number alterations accompanying gene expression changes, including loci 11q13.2, 7p11.2, 3q26.33, and 17q12, which harbored CCND1, EGFR, SOX2, and ERBB2, respectively. Analysis of survival data and RNAi screening data suggested that GRB7, located on 17q12, was a driver gene in ESCC. In ESCC cell lines harboring 17q12 amplification, knockdown of GRB7 reduced the proliferation, migration, and invasion capacities of cells. Moreover, siRNA targeting GRB7 had a synergistic inhibitory effect when combined with trastuzumab, an anti-ERBB2 antibody. Survival analysis of the independent cohort also showed that high GRB7 expression was associated with poor prognosis in ESCC. Our integrative analysis provided important insights into ESCC pathogenesis. We identified GRB7 as a novel ESCC driver gene and potential new therapeutic target.

  10. Analysis of thermal-plastic response of shells of revolution by numerical integration

    International Nuclear Information System (INIS)

    Leonard, J.W.

    1975-01-01

    An economic technique for the numerical analysis of the elasto-plastic behaviour of shells of revolution would be of considerable value in the nuclear reactor industry. A numerical method based on the numerical integration of the governing shell equations has been shown, for elastic cases, to be more efficient than the finite element method when applied to shells of revolution. In the numerical integration method, the governing differential equations of motion are converted into a set of initial-value problems. Each initial-value problem is integrated numerically between meridional boundary points and recombined so as to satisfy boundary conditions. For large-deflection elasto-plastic behaviour, the equations are nonlinear and, hence, are recombined in an iterative manner using the Newton-Raphson procedure. Suppression techniques are incorporated in order to eliminate extraneous solutions within the numerical integration procedure. The Reissner-Meissner shell theory for shells of revolution is adopted to account for large deflection and higher-order rotation effects. The computer modelling of the equations is quite general in that specific shell segment geometries, e.g. cylindrical, spherical, toroidal, conical segments, and any combinations thereof can be handled easily. (Auth.)
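    The integrate-then-recombine procedure described above is, in essence, a shooting method: guess the unknown initial values, integrate the governing equations as an initial-value problem, and apply Newton-Raphson until the far boundary condition is satisfied. The sketch below applies this to a toy linear boundary-value problem (y'' = -y with y(0)=0, y(π/2)=1), not the Reissner-Meissner shell equations themselves.

```python
# Shooting-method sketch: convert a boundary-value problem to initial-value
# problems, integrate with RK4, and Newton-iterate on the unknown initial slope.

def integrate_ivp(slope0, n=2000, L=1.5707963267948966):
    """RK4 integration of y'' = -y from x=0 with y(0)=0, y'(0)=slope0.
    Returns y at x = L (here L = pi/2)."""
    h = L / n
    y, v = 0.0, slope0
    for _ in range(n):
        k1y, k1v = v, -y
        k2y, k2v = v + 0.5 * h * k1v, -(y + 0.5 * h * k1y)
        k3y, k3v = v + 0.5 * h * k2v, -(y + 0.5 * h * k2y)
        k4y, k4v = v + h * k3v, -(y + h * k3y)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return y

def shoot(target=1.0, guess=0.5, tol=1e-10):
    """Newton-Raphson on the unknown initial slope (finite-difference derivative)."""
    for _ in range(50):
        f = integrate_ivp(guess) - target     # boundary-condition residual
        if abs(f) < tol:
            break
        df = (integrate_ivp(guess + 1e-6) - integrate_ivp(guess)) / 1e-6
        guess -= f / df
    return guess

slope = shoot()
print(round(slope, 6))  # exact solution y = sin(x) gives y'(0) = 1
```

    For nonlinear, large-deflection shell equations the residual is nonlinear in the guessed initial values, which is exactly why the abstract's method recombines the initial-value solutions iteratively via Newton-Raphson rather than in one linear solve.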

  11. Integrated dynamic landscape analysis and modeling system (IDLAMS) : installation manual.

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z.; Majerus, K. A.; Sundell, R. C.; Sydelko, P. J.; Vogt, M. C.

    1999-02-24

    The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) is a prototype, integrated land management technology developed through a joint effort between Argonne National Laboratory (ANL) and the US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL). Dr. Ronald C. Sundell, Ms. Pamela J. Sydelko, and Ms. Kimberly A. Majerus were the principal investigators (PIs) for this project. Dr. Zhian Li was the primary software developer. Dr. Jeffrey M. Keisler, Mr. Christopher M. Klaus, and Mr. Michael C. Vogt developed the decision analysis component of this project. It was developed with funding support from the Strategic Environmental Research and Development Program (SERDP), a land/environmental stewardship research program with participation from the US Department of Defense (DoD), the US Department of Energy (DOE), and the US Environmental Protection Agency (EPA). IDLAMS predicts land conditions (e.g., vegetation, wildlife habitats, and erosion status) by simulating changes in military land ecosystems for given training intensities and land management practices. It can be used by military land managers to help predict the future ecological condition for a given land use based on land management scenarios of various levels of training intensity. It also can be used as a tool to help land managers compare different land management practices and further determine a set of land management activities and prescriptions that best suit the needs of a specific military installation.

  12. Calcium-deficiency assessment and biomarker identification by an integrated urinary metabonomics analysis

    Science.gov (United States)

    2013-01-01

    Background Calcium deficiency is a global public-health problem. Although the initial stage of calcium deficiency can lead to metabolic alterations or potential pathological changes, calcium deficiency is difficult to diagnose accurately. Moreover, the details of the molecular mechanism of calcium deficiency remain somewhat elusive. To enable accurate assessment and appropriate nutritional intervention, we carried out a global analysis of metabolic alterations in response to calcium deficiency. Methods The metabolic alterations associated with calcium deficiency were first investigated in a rat model, using urinary metabonomics based on ultra-performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry and multivariate statistical analysis. Correlations between dietary calcium intake and the biomarkers identified from the rat model were further analyzed to confirm the potential application of these biomarkers in humans. Results Urinary metabolic-profiling analysis could preliminarily distinguish between calcium-deficient and non-deficient rats after a 2-week low-calcium diet. We established an integrated metabonomics strategy for identifying reliable biomarkers of calcium deficiency using a time-course analysis of discriminating metabolites in a low-calcium diet experiment, repeating the low-calcium diet experiment and performing a calcium-supplement experiment. In total, 27 biomarkers were identified, including glycine, oxoglutaric acid, pyrophosphoric acid, sebacic acid, pseudouridine, indoxyl sulfate, taurine, and phenylacetylglycine. The integrated urinary metabonomics analysis, which combined biomarkers with regular trends of change (types A, B, and C), could accurately assess calcium-deficient rats at different stages and clarify the dynamic pathophysiological changes and molecular mechanism of calcium deficiency in detail. Significant correlations between calcium intake and two biomarkers, pseudouridine (Pearson
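    The final correlation step (relating dietary calcium intake to urinary biomarker levels) reduces to a Pearson coefficient. Below is a small self-contained sketch; the intake and pseudouridine values are hypothetical, chosen only to show the shape of the computation.

```python
# Pearson correlation between dietary calcium intake and a urinary biomarker,
# computed from scratch. All numbers are made up for illustration.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

intake = [300, 450, 600, 800, 1000, 1200]        # mg/day (hypothetical)
pseudouridine = [9.1, 8.4, 7.9, 6.8, 6.1, 5.2]   # relative urinary level

r = pearson(intake, pseudouridine)
print(round(r, 3))  # strongly negative: the biomarker falls as intake rises
```

    A biomarker whose level tracks intake this tightly is exactly the kind of "regular trend of change" the strategy above screens for before promoting a metabolite to the biomarker panel.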

  13. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed
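    The coupled-constraint idea above (the integrated optimum is set by whichever discipline's constraint binds first) can be shown with a toy penalty-method search. The constraint forms and all numbers below are invented for illustration; a real cask optimization would evaluate structural and thermal responses with full analysis codes.

```python
# Toy coupled optimization: minimize a cask-wall "mass" subject to both a
# structural and a thermal constraint at once, via a penalty method and a
# coarse grid search. Constraint shapes and limits are illustrative only.

def mass(t):              # wall thickness t (cm) -> mass proxy
    return 10.0 * t

def structural_ok(t):     # stress limit: thicker walls are stronger
    return 50.0 / t <= 25.0    # equivalent to t >= 2.0

def thermal_ok(t):        # fire-accident limit: thicker walls insulate better
    return 30.0 / t <= 12.0    # equivalent to t >= 2.5

def penalized(t):
    p = 0.0
    if not structural_ok(t):
        p += 1e6              # large penalty for violating either discipline
    if not thermal_ok(t):
        p += 1e6
    return mass(t) + p

# Searching both constraints together: the binding (thermal) one sets the optimum.
candidates = [1.0 + 0.1 * i for i in range(40)]
best = min(candidates, key=penalized)
print(round(best, 1))
```

    Designing each component "separately with respect to its driving constraint", as the abstract says current practice does, risks missing cases like this one, where the thermal limit, not the structural one, governs the integrated design.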

  14. Explaining Technology Integration in K-12 Classrooms: A Multilevel Path Analysis Model

    Science.gov (United States)

    Liu, Feng; Ritzhaupt, Albert D.; Dawson, Kara; Barron, Ann E.

    2017-01-01

    The purpose of this research was to design and test a model of classroom technology integration in the context of K-12 schools. The proposed multilevel path analysis model includes teacher, contextual, and school related variables on a teacher's use of technology and confidence and comfort using technology as mediators of classroom technology…

  15. Flipping the Audience Script: An Activity That Integrates Research and Audience Analysis

    Science.gov (United States)

    Lam, Chris; Hannah, Mark A.

    2016-01-01

    This article describes a flipped classroom activity that requires students to integrate research and audience analysis. The activity uses Twitter as a data source. In the activity, students identify a sample, collect customer tweets, and analyze the language of the tweets in an effort to construct knowledge about an audience's values, needs, and…

  16. Implementation of a variable-step integration technique for nonlinear structural dynamic analysis

    International Nuclear Information System (INIS)

    Underwood, P.; Park, K.C.

    1977-01-01

    The paper presents the implementation of a recently developed, unconditionally stable implicit time integration method in a production computer code for the transient response analysis of nonlinear structural dynamic systems. The time integrator provides two significant features: a variable step size that is determined automatically, and step-size changes accomplished without additional matrix refactorizations. The equations of motion solved by the time integrator must be cast in pseudo-force form, which provides the mechanism for controlling the step size. Step size control is accomplished by extrapolating the pseudo-force to the next time (the predicted pseudo-force), performing the integration step, and then recomputing the pseudo-force based on the current solution (the corrected pseudo-force); from these data an error norm is constructed, whose value determines the step size for the next step. To avoid refactoring the required matrix with each step size change, a matrix scaling technique is employed, which allows step sizes to change by a factor of 100 without refactoring. If during a computer run the integrator determines that it can run with a step size larger than 100 times the original minimum step size, the matrix is refactored to take advantage of the larger step size. The strategies for effecting these features are discussed in detail. (Auth.)
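    The predictor/corrector step-size control loop described above can be condensed into a scalar sketch: extrapolate the pseudo-force, take the step, recompute the force from the new solution, and size the next step from the mismatch. The explicit model problem, tolerances, and halving/doubling factors below are illustrative assumptions, not the production code's.

```python
# Sketch of pseudo-force-based step-size control on a scalar model problem
# u' = f(u). A rejected step is halved and retried; a comfortably accurate
# step lets the next one grow. Factors and tolerances are illustrative.

def adaptive_march(f, u0, t_end, h0=0.01, tol=1e-4, h_min=1e-8, h_max=0.1):
    t, u, h = 0.0, u0, h0
    while t < t_end:
        h_step = min(h, t_end - t)            # clamp so the march lands on t_end
        f_pred = f(u)                         # predicted pseudo-force (old state)
        u_new = u + h_step * f_pred           # take the step with the prediction
        f_corr = f(u_new)                     # corrected pseudo-force (new state)
        err = abs(f_corr - f_pred) * h_step   # error norm from the pred/corr gap
        if err > tol and h_step > h_min:
            h = max(h_min, 0.5 * h_step)      # reject the step: halve and retry
            continue
        t, u = t + h_step, u_new              # accept the step
        if err < 0.1 * tol:
            h = min(h_max, 2.0 * h)           # error comfortably small: grow step
    return u

# u' = -u from u(0) = 1 over [0, 1]; the exact answer is exp(-1) ~ 0.3679.
u_final = adaptive_march(lambda u: -u, 1.0, 1.0)
print(u_final)
```

    The paper's scheme layers two refinements this sketch omits: the step is implicit (hence unconditionally stable), and matrix scaling lets the step change by up to a factor of 100 before the factorized system matrix must be rebuilt.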

  17. Integration of TGS and CTEN assays using the CTENFIT analysis and databasing program

    International Nuclear Information System (INIS)

    Estep, R.

    2000-01-01

    The CTENFIT program, written for Windows 9x/NT in C++, performs databasing and analysis of combined thermal/epithermal neutron (CTEN) passive and active neutron assay data and integrates it with isotopics results and gamma-ray data from methods such as tomographic gamma scanning (TGS). The binary database is mirrored in a companion Excel database that allows extensive customization via Visual Basic for Applications macros. Automated analysis options make the analysis of the data transparent to the assay system operator. Various record browsers and information displays simplify record-keeping tasks.

  18. Integrative analysis of gene expression and DNA methylation using unsupervised feature extraction for detecting candidate cancer biomarkers.

    Science.gov (United States)

    Moon, Myungjin; Nakai, Kenta

    2018-04-01

    Currently, cancer biomarker discovery is one of the most important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and the limited information in small datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extraction to identify candidate cancer biomarkers from renal cell carcinoma RNA-seq datasets. Gene expression and DNA methylation datasets are normalized by the Box-Cox transformation and integrated, by unsupervised feature extraction methods, into a one-dimensional dataset that retains the major characteristics of the original datasets; differentially expressed genes are then selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired with the proposed analysis method. Furthermore, we expect that the proposed method can be extended to applications involving various types of multi-omics datasets.
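    A much-simplified sketch of the described pipeline: Box-Cox-transform each dataset, z-score per sample column, stack the expression and methylation features per gene, and project onto the first principal component as a stand-in for the paper's unsupervised feature extraction. The lambda values, toy data, and use of PC1 are all assumptions; the real study estimates lambda and works with RNA-seq cohorts.

```python
# Integrate expression and methylation into one score per gene:
# Box-Cox transform -> column z-score -> stack features -> first PC score.

import math
import random

def boxcox(x, lam):
    """Box-Cox transform; lam = 0 reduces to the log transform."""
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def zscore(col):
    m = sum(col) / len(col)
    sd = (sum((v - m) ** 2 for v in col) / len(col)) ** 0.5 or 1.0
    return [(v - m) / sd for v in col]

def first_pc_scores(rows, iters=200):
    """Power iteration for the leading principal component of centered rows;
    returns the projection (score) of each row onto that component."""
    n, d = len(rows), len(rows[0])
    rng = random.Random(0)
    w = [rng.random() for _ in range(d)]
    for _ in range(iters):
        scores = [sum(r[j] * w[j] for j in range(d)) for r in rows]   # X w
        w = [sum(rows[i][j] * scores[i] for i in range(n)) for j in range(d)]
        norm = sum(v * v for v in w) ** 0.5 or 1.0
        w = [v / norm for v in w]                                     # normalize
    return [sum(r[j] * w[j] for j in range(d)) for r in rows]

# Toy data: 5 genes x 3 samples for each omics layer (hypothetical values).
expr = [[10.0, 12.0, 11.0], [200.0, 220.0, 210.0], [15.0, 14.0, 16.0],
        [300.0, 310.0, 290.0], [50.0, 55.0, 52.0]]
meth = [[0.2, 0.25, 0.22], [0.8, 0.85, 0.9], [0.3, 0.28, 0.31],
        [0.7, 0.75, 0.72], [0.5, 0.55, 0.52]]

# Box-Cox each matrix (lambda = 0 for expression, 0.5 for methylation: assumed).
expr_t = [[boxcox(v, 0.0) for v in row] for row in expr]
meth_t = [[boxcox(v, 0.5) for v in row] for row in meth]

# z-score each sample column across genes, then stack both layers per gene.
cols = [zscore(list(c)) for c in list(zip(*expr_t)) + list(zip(*meth_t))]
integrated = [list(gene) for gene in zip(*cols)]

scores = first_pc_scores(integrated)
print(len(scores))  # one integrated score per gene
```

    Genes with extreme integrated scores are the differential-expression candidates the abstract ranks; here, genes whose expression and methylation move together dominate the leading component.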

  19. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  20. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

    Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.